Compare commits
No commits in common. "main" and "master" have entirely different histories.
.env.example (59)
@@ -1,39 +1,38 @@
-# Server
-JIRA_URL=https://gojira.yourcompany.com
-JIRA_TOKEN=your_jira_token
-JIRA_WEBHOOK_SECRET=random_secret_for_webhook_validation
-
-BITBUCKET_URL=https://bitbucket.yourcompany.com
-BITBUCKET_TOKEN=your_bitbucket_token
-
-# LLM (Production - Azure OpenAI)
-AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
-AZURE_OPENAI_KEY=your_azure_key
-AZURE_OPENAI_MODEL=gpt-4o
-AZURE_OPENAI_EMBEDDING_MODEL=text-embedding-3-large
-
-# LLM (Development - OpenRouter Free)
-OPENROUTER_API_KEY=your_openrouter_key
-OPENROUTER_MODEL=meta-llama/llama-3.3-70b-instruct:free
-
-# Use Azure (production) or OpenRouter (development)
-LLM_PROVIDER=openrouter
-
-# Database
-DATABASE_URL=postgresql://jira:jira@localhost:5432/jira_fixer
-# For development with SQLite:
-# DATABASE_URL=sqlite:///./jira_fixer.db
-
-# Redis
-REDIS_URL=redis://localhost:6379
-
-# Embeddings (local MiniLM or Azure)
-EMBEDDING_PROVIDER=local
-# EMBEDDING_PROVIDER=azure
-
-# Portal
-PORTAL_SECRET_KEY=change_this_to_random_string
-PORTAL_ADMIN_EMAIL=admin@example.com
-
-# Logging
-LOG_LEVEL=INFO
+# JIRA AI Fixer v2.0 - Environment Configuration
+# Copy this file to .env and fill in your values
+
+# ===== REQUIRED =====
+
+# Database (PostgreSQL)
+DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/jira_fixer
+
+# Security (generate with: openssl rand -hex 32)
+SECRET_KEY=change-me-in-production
+JWT_SECRET=change-me-in-production
+
+# ===== OPTIONAL =====
+
+# Redis (for job queue)
+REDIS_URL=redis://localhost:6379/0
+
+# Email notifications (https://resend.com)
+RESEND_API_KEY=
+EMAIL_FROM=JIRA AI Fixer <noreply@yourdomain.com>
+
+# AI Analysis (https://openrouter.ai)
+OPENROUTER_API_KEY=
+
+# Git Integration
+GITEA_URL=
+GITEA_TOKEN=
+
+# Application URL (for emails and callbacks)
+APP_URL=http://localhost:8000
+
+# JIRA Cloud OAuth
+JIRA_CLIENT_ID=
+JIRA_CLIENT_SECRET=
+
+# GitHub OAuth
+GITHUB_CLIENT_ID=
+GITHUB_CLIENT_SECRET=
@@ -1,96 +1,11 @@
-# Byte-compiled / optimized / DLL files
-__pycache__/
-*.py[cod]
-*$py.class
-
-# C extensions
-*.so
-
-# Distribution / packaging
-.Python
-build/
-develop-eggs/
-dist/
-downloads/
-eggs/
-.eggs/
-lib/
-lib64/
-parts/
-sdist/
-var/
-wheels/
-*.egg-info/
-.installed.cfg
-*.egg
-
-# PyInstaller
-*.manifest
-*.spec
-
-# Installer logs
-pip-log.txt
-pip-delete-this-directory.txt
-
-# Unit test / coverage reports
-htmlcov/
-.tox/
-.nox/
-.coverage
-.coverage.*
-.cache
-nosetests.xml
-coverage.xml
-*.cover
-*.py,cover
-.hypothesis/
-.pytest_cache/
-
-# Translations
-*.mo
-*.pot
-
-# Environments
-.env
-.env.local
-.venv
-env/
-venv/
-ENV/
-env.bak/
-venv.bak/
-
-# IDEs
-.idea/
-.vscode/
-*.swp
-*.swo
-*~
-
-# Node.js
-node_modules/
-npm-debug.log*
-yarn-debug.log*
-yarn-error.log*
-
-# Build outputs
-portal/dist/
-portal/build/
-*.log
-
-# Database
-*.db
-*.sqlite
-
-# Local config
-.env.local
-.env.*.local
-
-# OS
-.DS_Store
-Thumbs.db
-
-# Secrets
-*.pem
-*.key
-secrets/
+frontend/node_modules/
+frontend/dist/
+__pycache__/
+*.pyc
+.env
+.venv/
+*.egg-info/
+package-lock.json
+frontend/node_modules/
+frontend/dist/
+frontend/package-lock.json
@@ -0,0 +1,128 @@
# JIRA AI Fixer v2.0 - Credentials and Access

## 🌐 URLs

- **Frontend:** https://jira-fixer.startdata.com.br
- **API:** https://jira-fixer.startdata.com.br/api
- **Repository:** https://gitea.startdata.com.br/startdata/jira-ai-fixer

## 🔐 Default Credentials

### First time (create an account):
```bash
curl -X POST https://jira-fixer.startdata.com.br/api/auth/register \
  -H "Content-Type: application/json" \
  -d '{
    "email": "admin@startdata.com.br",
    "password": "JiraFixer2026!",
    "name": "Admin User"
  }'
```

### Login:
```bash
curl -X POST https://jira-fixer.startdata.com.br/api/auth/login \
  -H "Content-Type: application/json" \
  -d '{
    "email": "admin@startdata.com.br",
    "password": "JiraFixer2026!"
  }'
```

**Response:** `{ "access_token": "..." }`

### Using the token:
```bash
curl -H "Authorization: Bearer YOUR_TOKEN" https://jira-fixer.startdata.com.br/api/issues
```
## 📊 Database

- **Type:** PostgreSQL 15
- **Host:** postgres_database (internal Docker network)
- **Port:** 5432
- **Database:** jira_fixer_v2
- **User:** postgres
- **Password:** postgres

**Connection string:**
```
postgresql+asyncpg://postgres:postgres@postgres_database:5432/jira_fixer_v2
```
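As a quick sanity check, the components listed above can be read back out of the connection string with the standard library; note that `+asyncpg` is a SQLAlchemy driver qualifier attached to the scheme, not part of a plain Postgres URL:

```python
from urllib.parse import urlsplit

DSN = "postgresql+asyncpg://postgres:postgres@postgres_database:5432/jira_fixer_v2"

parts = urlsplit(DSN)
assert parts.scheme == "postgresql+asyncpg"   # dialect+driver
assert parts.hostname == "postgres_database"  # internal Docker hostname
assert parts.port == 5432
assert parts.username == "postgres"
assert parts.password == "postgres"
assert parts.path.lstrip("/") == "jira_fixer_v2"
```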
## 🧪 End-to-End Test

### Scenario: COBOL bug → AI analysis → automatic fix

1. **Test repo:** https://gitea.startdata.com.br/startdata/cobol-sample-app
2. **Introduced bug:** `src/cobol/VALIDATE.CBL` - accepts cards with 10+ digits instead of exactly 16
3. **Ticket created:** TicketHub SUPP-6

### Expected flow:

```bash
# 1. Create the issue in JIRA Fixer
curl -X POST https://jira-fixer.startdata.com.br/api/issues \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "title": "Card validation accepts invalid card numbers",
    "description": "VALIDATE.CBL accepts cards with >= 10 digits instead of exactly 16",
    "source": "tickethub",
    "external_key": "SUPP-6",
    "external_url": "https://tickethub.startdata.com.br/tickets/6",
    "priority": "high"
  }'

# 2. AI analyzes automatically (background task)
# - Identifies the file: src/cobol/VALIDATE.CBL
# - Root cause: >= 10 instead of = 16
# - Confidence: ~95%

# 3. If confidence >= 70%, a PR is created automatically
# - Fork/branch: fix/SUPP-6-auto-fix
# - Commit: validation fix
# - PR in Gitea

# 4. The TicketHub ticket is updated
curl -X PATCH https://tickethub.startdata.com.br/api/tickets/6 \
  -d '{
    "status": "in_progress",
    "description": "AI analysis complete. PR created: [link]"
  }'
```

## ⚙️ Integrations

### TicketHub
- **URL:** https://tickethub.startdata.com.br
- **Webhook:** `https://jira-fixer.startdata.com.br/api/webhooks/tickethub`
- **Project:** SUPP (ID: 1)

### Gitea
- **URL:** https://gitea.startdata.com.br
- **Token:** (configure under Settings → API Keys)
- **Test repo:** startdata/cobol-sample-app

### AI Model
- **Provider:** OpenRouter
- **Model:** claude-3.5-sonnet / llama-3.3-70b-instruct
- **API Key:** (OPENROUTER_API_KEY environment variable)

## 🚨 Current Status (2026-02-18)

**Frontend:** ✅ Deployed and functional
**Backend API:** ⚠️ Stack needs to be updated to include the Python API service

### Next steps to complete the deployment:
1. Update stack 308 to include the `api` service (Python FastAPI)
2. Configure the DB connection string
3. Configure OPENROUTER_API_KEY
4. Test registration/login
5. Run the end-to-end test

## 📝 Notes

- The frontend is a React SPA served via nginx
- The backend is async FastAPI with PostgreSQL
- AI analysis runs in background tasks
- PR creation uses the Gitea API
- Bidirectional webhooks with TicketHub
@@ -0,0 +1,32 @@
# Stage 1: Build frontend
FROM node:20-alpine AS frontend-builder
WORKDIR /build

COPY frontend/package.json frontend/package-lock.json* ./
RUN npm install

COPY frontend/ ./
RUN npm run build

# Stage 2: Python backend
FROM python:3.11-slim

WORKDIR /app

# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy backend
COPY app/ ./app/

# Copy built frontend
COPY --from=frontend-builder /build/dist ./frontend/

# Environment
ENV PYTHONUNBUFFERED=1
ENV PYTHONDONTWRITEBYTECODE=1

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
@@ -0,0 +1,295 @@
# JIRA AI Fixer v2.0 - Installation Guide

## Overview

JIRA AI Fixer is an enterprise AI-powered platform that automatically analyzes issues from JIRA, ServiceNow, GitHub, GitLab, and other platforms, generates root cause analyses, and creates Pull Requests with fixes.

## Architecture

```
┌─────────────┐     ┌──────────────┐     ┌────────────┐
│  Frontend   │────▶│   Backend    │────▶│ PostgreSQL │
│  (Nginx)    │     │  (FastAPI)   │     │            │
│  React SPA  │     │  Python 3.11 │     └────────────┘
└─────────────┘     └──────┬───────┘
                           │
                    ┌──────▼───────┐
                    │    Redis     │
                    │   (Queue)    │
                    └──────────────┘
```

### Tech Stack

**Backend:**
- Python 3.11 + FastAPI
- PostgreSQL (async via SQLAlchemy + asyncpg)
- Redis (job queue)
- JWT Authentication
- Resend (email notifications)

**Frontend:**
- React 18 + Vite
- TailwindCSS + shadcn/ui components
- React Query (data fetching)
- Recharts (analytics)
- React Router (SPA routing)

---

## Prerequisites

- Docker & Docker Compose (or Docker Swarm)
- PostgreSQL 14+ (or use an existing instance)
- Redis (or use an existing instance)
- A domain with SSL (recommended)

---

## Quick Start (Docker Compose)

### 1. Clone the repository

```bash
git clone https://gitea.startdata.com.br/startdata/jira-ai-fixer.git
cd jira-ai-fixer
```

### 2. Configure environment

```bash
cp .env.example .env
```

Edit `.env` with your settings:

```env
# Database
DATABASE_URL=postgresql+asyncpg://postgres:postgres@db:5432/jira_fixer

# Redis
REDIS_URL=redis://redis:6379/0

# Security (generate with: openssl rand -hex 32)
SECRET_KEY=your-secret-key-here
JWT_SECRET=your-jwt-secret-here

# Email (optional - Resend.com)
RESEND_API_KEY=re_xxxxx
EMAIL_FROM=JIRA AI Fixer <noreply@yourdomain.com>

# AI Analysis (optional - OpenRouter.ai)
OPENROUTER_API_KEY=sk-or-xxxxx

# Git Integration (optional - Gitea/GitHub)
GITEA_URL=https://gitea.yourdomain.com
GITEA_TOKEN=your-token

# OAuth Integrations (optional)
JIRA_CLIENT_ID=
JIRA_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
```

### 3. Start with Docker Compose

```bash
docker compose up -d
```

### 4. Access the application

- **Frontend:** http://localhost (or your domain)
- **API Docs:** http://localhost/api/docs
- **Health Check:** http://localhost/api/health

---

## Production Deployment (Docker Swarm + Traefik)

### 1. Create the stack file

```yaml
version: '3.8'

services:
  api:
    image: python:3.11-slim
    command: >
      bash -c "
        apt-get update && apt-get install -y curl &&
        pip install fastapi uvicorn[standard] sqlalchemy[asyncio] asyncpg
          pydantic[email] pydantic-settings python-jose[cryptography]
          passlib[bcrypt] httpx python-multipart email-validator &&
        mkdir -p /app && cd /app &&
        curl -sL 'https://gitea.yourdomain.com/org/jira-ai-fixer/archive/master.tar.gz' |
          tar xz --strip-components=1 &&
        uvicorn app.main:app --host 0.0.0.0 --port 8000
      "
    environment:
      - DATABASE_URL=postgresql+asyncpg://user:pass@db_host:5432/jira_fixer
      - REDIS_URL=redis://redis_host:6379
      - JWT_SECRET=your-jwt-secret
      - RESEND_API_KEY=re_xxxxx
      - APP_URL=https://jira-fixer.yourdomain.com
    networks:
      - internal
      - db_network
    deploy:
      replicas: 1
      restart_policy:
        condition: on-failure
        delay: 25s

  frontend:
    image: nginx:alpine
    command: >
      sh -c "apk add --no-cache curl &&
        mkdir -p /app && cd /app &&
        curl -sL 'https://gitea.yourdomain.com/org/jira-ai-fixer/archive/master.tar.gz' |
          tar xz --strip-components=1 &&
        cp -r frontend_build/* /usr/share/nginx/html/ &&
        echo 'c2VydmVyIHsKICBsaXN0ZW4gODA7...' | base64 -d > /etc/nginx/conf.d/default.conf &&
        nginx -g 'daemon off;'"
    networks:
      - proxy_network
      - internal
    deploy:
      labels:
        - traefik.enable=true
        - traefik.http.routers.jira-fixer.rule=Host(`jira-fixer.yourdomain.com`)
        - traefik.http.routers.jira-fixer.entrypoints=websecure
        - traefik.http.routers.jira-fixer.tls.certresolver=le
        - traefik.http.services.jira-fixer.loadbalancer.server.port=80

networks:
  proxy_network:
    external: true
  db_network:
    external: true
  internal:
    driver: overlay
```

### 2. Nginx Config (base64 encoded in command)

```nginx
server {
  listen 80;
  root /usr/share/nginx/html;
  index index.html;
  location / {
    try_files $uri $uri/ /index.html;
  }
  location /api {
    proxy_pass http://api:8000;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
  }
}
```
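The base64 string embedded in the frontend `command` is simply this config encoded; it can be regenerated with the standard library (the blob in the stack file begins `c2VydmVyIHsK...`, which decodes to `server {`):

```python
import base64

NGINX_CONF = """server {
  listen 80;
  root /usr/share/nginx/html;
  index index.html;
  location / {
    try_files $uri $uri/ /index.html;
  }
  location /api {
    proxy_pass http://api:8000;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
  }
}
"""

# Equivalent of: base64 -w0 default.conf
encoded = base64.b64encode(NGINX_CONF.encode()).decode()
```

At container start, `echo '<encoded>' | base64 -d` reproduces the file, which is exactly what the stack's `command` does.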
### 3. Deploy

```bash
docker stack deploy -c docker-compose.yml jira-fixer
```

---

## Local Development

### Backend

```bash
cd app
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt  # or install manually (see stack command)
uvicorn app.main:app --reload --port 8000
```

### Frontend

```bash
cd frontend
npm install
npm run dev
```

The frontend dev server runs on http://localhost:5173 with a proxy to the backend.

### Build Frontend

```bash
cd frontend
npm run build
cp -r dist/* ../frontend_build/
```

---

## API Endpoints

| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/auth/register` | Register new user |
| POST | `/api/auth/login` | Login |
| GET | `/api/organizations` | List organizations |
| POST | `/api/organizations` | Create organization |
| GET | `/api/issues` | List issues |
| POST | `/api/issues` | Create issue |
| GET | `/api/issues/:id` | Get issue detail |
| PATCH | `/api/issues/:id` | Update issue |
| POST | `/api/webhooks/jira` | JIRA webhook |
| POST | `/api/webhooks/servicenow` | ServiceNow webhook |
| POST | `/api/webhooks/github` | GitHub webhook |
| GET | `/api/reports/summary` | Report summary |
| GET | `/api/health` | Health check |

Full API documentation is available at `/api/docs` (Swagger UI).

---

## Integrations

### JIRA Cloud
1. Go to Settings > Integrations > JIRA
2. Enter your Atlassian domain, email, and API token
3. Configure a webhook in JIRA pointing to `https://your-domain/api/webhooks/jira`

### GitHub
1. Create a GitHub App or use a personal access token
2. Configure it in Settings > Integrations > GitHub
3. Set the webhook URL: `https://your-domain/api/webhooks/github`

### ServiceNow
1. Configure a REST integration in ServiceNow
2. Point it to: `https://your-domain/api/webhooks/servicenow`
---

## Environment Variables Reference

| Variable | Required | Default | Description |
|----------|----------|---------|-------------|
| `DATABASE_URL` | Yes | - | PostgreSQL connection string |
| `REDIS_URL` | No | `redis://localhost:6379` | Redis connection string |
| `SECRET_KEY` | Yes | - | App secret key |
| `JWT_SECRET` | Yes | - | JWT signing key |
| `JWT_EXPIRE_MINUTES` | No | `1440` | Token expiry (24h) |
| `RESEND_API_KEY` | No | - | Email service API key |
| `OPENROUTER_API_KEY` | No | - | AI analysis API key |
| `GITEA_URL` | No | - | Git server URL |
| `GITEA_TOKEN` | No | - | Git server access token |
| `JIRA_CLIENT_ID` | No | - | JIRA OAuth client ID |
| `JIRA_CLIENT_SECRET` | No | - | JIRA OAuth client secret |
| `GITHUB_CLIENT_ID` | No | - | GitHub OAuth client ID |
| `GITHUB_CLIENT_SECRET` | No | - | GitHub OAuth client secret |
---

## License

MIT © StartData
README.md (145)
@@ -1,118 +1,75 @@
-# JIRA AI Fixer
+# JIRA AI Fixer v2.0
 
-AI system for automated JIRA Support Case analysis with COBOL/SQL/JCL code intelligence.
+Enterprise AI-powered issue analysis and automated fix generation platform.
 
-## Overview
+## 🚀 Features
 
-JIRA AI Fixer monitors JIRA Support Cases, analyzes the reported issues, searches relevant code in Bitbucket repositories, and proposes fixes using AI-powered code understanding.
+### Issue Analysis
+- 🤖 **AI-Powered Analysis** — Automatic root cause analysis using LLMs
+- 🔀 **Auto PR Generation** — Creates Pull Requests with suggested fixes
+- 🎯 **Confidence Scoring** — AI confidence level for each analysis
+- 📊 **Analytics Dashboard** — Track trends, resolution rates, and team performance
 
-## Architecture
+### Multi-Source Integration
+- 🔵 **JIRA Cloud** — Full bidirectional sync
+- ⚙️ **ServiceNow** — Incident and change management
+- 🐙 **GitHub** — Issues and repository integration
+- 🦊 **GitLab** — Issues and merge requests
+- 💚 **Zendesk** — Support ticket analysis
+- 🎫 **TicketHub** — Native integration
 
-```
-JIRA (webhook) → Event Processor → Code Intelligence → Fix Generator → Output (JIRA + PR)
-```
+### Enterprise Features
+- 🏢 **Multi-Organization** — Manage multiple teams/projects
+- 🔐 **JWT Authentication** — Secure token-based auth
+- 👥 **Team Management** — Role-based access control
+- 📧 **Email Notifications** — Automated alerts via Resend
+- 📈 **Reports & Analytics** — Performance metrics and insights
+- 🔌 **Webhooks** — Incoming webhooks from any platform
+- 📝 **Audit Logs** — Complete action history
 
-## Stack
+### Modern UI
+- ⚡ **React 18** + Vite (fast builds)
+- 🎨 **shadcn/ui** components (Button, Dialog, Command, Toast, Skeleton...)
+- 📊 **Recharts** interactive charts
+- 🌙 **Dark Mode** by default
+- 📱 **Responsive** layout
 
-- **Backend:** Python 3.11+ / FastAPI
-- **Vector DB:** Qdrant (embeddings)
-- **Queue:** Redis + Bull
-- **Database:** PostgreSQL
-- **LLM:** Azure OpenAI GPT-4o / OpenRouter (dev)
-- **Embeddings:** MiniLM-L6-v2 (local) / Azure OpenAI (prod)
+## 📦 Tech Stack
 
-## Project Structure
+| Layer | Technology |
+|-------|-----------|
+| **Frontend** | React 18, Vite, TailwindCSS, shadcn/ui, Recharts |
+| **Backend** | Python 3.11, FastAPI, SQLAlchemy (async) |
+| **Database** | PostgreSQL 14+ |
+| **Queue** | Redis |
+| **Email** | Resend |
+| **AI** | OpenRouter (Llama, Claude, GPT) |
 
-```
-jira-ai-fixer/
-├── api/                  # FastAPI backend
-│   ├── main.py
-│   ├── routers/
-│   │   ├── webhook.py    # JIRA/Bitbucket webhooks
-│   │   ├── issues.py     # Issue management
-│   │   └── config.py     # Configuration API
-│   ├── services/
-│   │   ├── jira.py       # JIRA client
-│   │   ├── bitbucket.py  # Bitbucket client
-│   │   ├── llm.py        # LLM orchestration
-│   │   └── embeddings.py # Code indexing
-│   └── models/
-├── portal/               # React admin UI
-│   ├── src/
-│   └── package.json
-├── workers/              # Background processors
-│   ├── analyzer.py
-│   └── indexer.py
-├── tests/
-├── docker-compose.yml
-├── .env.example
-└── README.md
-```
-
-## Quick Start
+## 🛠 Quick Start
 
 ```bash
 # Clone
 git clone https://gitea.startdata.com.br/startdata/jira-ai-fixer.git
 cd jira-ai-fixer
 
-# Configure
-cp .env.example .env
-# Edit .env with your credentials
-
-# Run (development)
-docker compose up -d
-
-# Access portal
-open https://localhost:8080
-```
-
-## Development
-
-### Requirements
-
-- Python 3.11+
-- Node.js 20+
-- Docker & Docker Compose
-- Redis
-- PostgreSQL (or SQLite for dev)
-
-### Local Setup
-
-```bash
 # Backend
-cd api
-python -m venv venv
-source venv/bin/activate
-pip install -r requirements.txt
-uvicorn main:app --reload
+pip install fastapi uvicorn sqlalchemy[asyncio] asyncpg pydantic-settings python-jose passlib httpx
+uvicorn app.main:app --reload
 
-# Portal
-cd portal
-npm install
-npm run dev
+# Frontend
+cd frontend && npm install && npm run dev
 ```
 
-## Configuration
+## 📖 Documentation
 
-All configuration is done via the Admin Portal or environment variables:
+- **[Installation Guide](INSTALL.md)** — Full setup instructions
+- **[API Documentation](https://jira-fixer.startdata.com.br/api/docs)** — Swagger UI
 
-| Variable | Description | Required |
-|----------|-------------|----------|
-| `JIRA_URL` | JIRA Server URL | Yes |
-| `JIRA_TOKEN` | JIRA API Token | Yes |
-| `BITBUCKET_URL` | Bitbucket Server URL | Yes |
-| `BITBUCKET_TOKEN` | Bitbucket Access Token | Yes |
-| `AZURE_OPENAI_ENDPOINT` | Azure OpenAI endpoint | Yes (prod) |
-| `AZURE_OPENAI_KEY` | Azure OpenAI API key | Yes (prod) |
-| `OPENROUTER_API_KEY` | OpenRouter key | Yes (dev) |
-| `DATABASE_URL` | PostgreSQL connection | Yes |
-| `REDIS_URL` | Redis connection | Yes |
+## 🌐 Live Demo
 
-## License
+- **App:** https://jira-fixer.startdata.com.br
+- **API:** https://jira-fixer.startdata.com.br/api/docs
 
-Proprietary - Ricel Leite
+## 📄 License
 
-## Contact
-
-- **Developer:** Ricel Leite
+MIT © StartData
@@ -1,21 +0,0 @@
FROM python:3.11-slim

WORKDIR /app

# Install system dependencies
RUN apt-get update && apt-get install -y \
    gcc \
    && rm -rf /var/lib/apt/lists/*

# Install Python dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy application
COPY . .

# Expose port
EXPOSE 8000

# Run
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
@@ -1,37 +0,0 @@
# New post_analysis_comment function with better formatting
import httpx

async def post_analysis_comment(ticket: dict, result: dict):
    """Post analysis result back to TicketHub as a comment"""
    ticket_id = ticket.get("id")
    if not ticket_id:
        return

    confidence_pct = int(result.get("confidence", 0) * 100)
    files = ", ".join(result.get("affected_files", ["Unknown"]))

    # Plain-text formatting with clear line breaks
    comment = f"""🤖 AI ANALYSIS COMPLETE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

📋 ROOT CAUSE:
{result.get('analysis', 'Unable to determine')}

📁 AFFECTED FILES: {files}

🔧 SUGGESTED FIX:
────────────────────────────────────────
{result.get('suggested_fix', 'No fix suggested')}
────────────────────────────────────────

📊 CONFIDENCE: {confidence_pct}%

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Analyzed by JIRA AI Fixer"""

    async with httpx.AsyncClient(timeout=10.0) as client:
        try:
            await client.post(
                f"https://tickethub.startdata.com.br/api/tickets/{ticket_id}/comments",
                json={"author": "AI Fixer", "content": comment}
            )
        except httpx.HTTPError:
            # Commenting is best-effort; never fail the caller on network errors
            pass
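For illustration, the formatting variables used by `post_analysis_comment` can be exercised without the network call; the sample `result` values below are made up:

```python
# Hypothetical analysis result, mirroring the keys the function reads
result = {
    "confidence": 0.75,
    "affected_files": ["src/cobol/VALIDATE.CBL"],
}

# Same derivations as in the function above
confidence_pct = int(result.get("confidence", 0) * 100)
files = ", ".join(result.get("affected_files", ["Unknown"]))

print(confidence_pct)  # 75
print(files)           # src/cobol/VALIDATE.CBL
```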
api/main.py (53)
@@ -1,53 +0,0 @@
"""
JIRA AI Fixer - FastAPI Backend
"""
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from contextlib import asynccontextmanager
import logging

from routers import webhook, issues, config

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


@asynccontextmanager
async def lifespan(app: FastAPI):
    """Startup and shutdown events."""
    logger.info("🚀 JIRA AI Fixer starting up...")
    # Initialize database, connections, etc.
    yield
    logger.info("👋 JIRA AI Fixer shutting down...")


app = FastAPI(
    title="JIRA AI Fixer",
    description="AI system for automated JIRA Support Case analysis",
    version="0.1.0",
    lifespan=lifespan,
)

# CORS for the portal
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],  # Configure properly in production
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# Include routers
app.include_router(webhook.router, prefix="/api/webhook", tags=["webhook"])
app.include_router(issues.router, prefix="/api/issues", tags=["issues"])
app.include_router(config.router, prefix="/api/config", tags=["config"])


@app.get("/")
async def root():
    return {"status": "ok", "service": "JIRA AI Fixer", "version": "0.1.0"}


@app.get("/health")
async def health():
    return {"status": "healthy"}
api/main_v2.py (883)
@@ -1,883 +0,0 @@
|
||||||
"""
|
|
||||||
JIRA AI Fixer - Intelligent Support Case Resolution
|
|
||||||
Complete API with webhook handling and AI analysis
|
|
||||||
"""
|
|
||||||
import os
|
|
||||||
import json
|
|
||||||
import httpx
|
|
||||||
import asyncio
|
|
||||||
from datetime import datetime
|
|
||||||
from contextlib import asynccontextmanager
|
|
||||||
from fastapi import FastAPI, HTTPException, BackgroundTasks
|
|
||||||
from fastapi.middleware.cors import CORSMiddleware
|
|
||||||
from fastapi.responses import HTMLResponse
|
|
||||||
from pydantic import BaseModel
|
|
||||||
from typing import Optional, List, Dict, Any
|
|
||||||
import asyncpg
|
|
||||||
|
|
||||||
# Config
DATABASE_URL = os.getenv("DATABASE_URL", "postgresql://jira:jira_secret_2026@postgres:5432/jira_fixer")
OPENROUTER_API_KEY = os.getenv("OPENROUTER_API_KEY", "")
GITEA_URL = os.getenv("GITEA_URL", "https://gitea.startdata.com.br")
COBOL_REPO = os.getenv("COBOL_REPO", "startdata/cobol-sample-app")

# Database pool
db_pool = None


async def init_db():
    global db_pool
    db_pool = await asyncpg.create_pool(DATABASE_URL, min_size=2, max_size=10)

    async with db_pool.acquire() as conn:
        await conn.execute("""
            CREATE TABLE IF NOT EXISTS issues (
                id SERIAL PRIMARY KEY,
                external_id TEXT,
                external_key TEXT,
                source TEXT,
                title TEXT,
                description TEXT,
                status TEXT DEFAULT 'pending',
                analysis TEXT,
                confidence FLOAT,
                affected_files TEXT,
                suggested_fix TEXT,
                created_at TIMESTAMP DEFAULT NOW(),
                analyzed_at TIMESTAMP
            );
            CREATE TABLE IF NOT EXISTS repositories (
                id SERIAL PRIMARY KEY,
                name TEXT UNIQUE,
                url TEXT,
                indexed_at TIMESTAMP,
                file_count INT DEFAULT 0
            );
            CREATE INDEX IF NOT EXISTS idx_issues_status ON issues(status);
            CREATE INDEX IF NOT EXISTS idx_issues_external ON issues(external_id, source);
        """)


@asynccontextmanager
async def lifespan(app: FastAPI):
    await init_db()
    yield
    if db_pool:
        await db_pool.close()
app = FastAPI(title="JIRA AI Fixer", version="1.0.0", lifespan=lifespan)
app.add_middleware(CORSMiddleware, allow_origins=["*"], allow_credentials=True, allow_methods=["*"], allow_headers=["*"])


# Models
class WebhookPayload(BaseModel):
    event: str
    timestamp: str
    data: Dict[str, Any]


class IssueResponse(BaseModel):
    id: int
    external_key: str
    title: str
    status: str
    confidence: Optional[float]
    analysis: Optional[str]
    suggested_fix: Optional[str]


# Health
@app.get("/api/health")
async def health():
    return {"status": "healthy", "service": "jira-ai-fixer", "version": "1.0.0"}
# Webhook endpoint for TicketHub
@app.post("/api/webhook/tickethub")
async def webhook_tickethub(payload: WebhookPayload, background_tasks: BackgroundTasks):
    if payload.event != "ticket.created":
        return {"status": "ignored", "reason": f"event {payload.event} not handled"}

    ticket = payload.data

    # Save to database
    async with db_pool.acquire() as conn:
        issue_id = await conn.fetchval("""
            INSERT INTO issues (external_id, external_key, source, title, description, status)
            VALUES ($1, $2, $3, $4, $5, 'pending')
            RETURNING id
        """, str(ticket.get("id")), ticket.get("key"), "tickethub",
            ticket.get("title"), ticket.get("description"))

    # Trigger analysis in background
    background_tasks.add_task(analyze_issue, issue_id, ticket)

    return {"status": "accepted", "issue_id": issue_id, "message": "Analysis queued"}


# JIRA webhook (compatible format)
@app.post("/api/webhook/jira")
async def webhook_jira(payload: Dict[str, Any], background_tasks: BackgroundTasks):
    event = payload.get("webhookEvent", "")
    if "issue_created" not in event:
        return {"status": "ignored"}

    issue = payload.get("issue", {})
    fields = issue.get("fields", {})

    async with db_pool.acquire() as conn:
        issue_id = await conn.fetchval("""
            INSERT INTO issues (external_id, external_key, source, title, description, status)
            VALUES ($1, $2, $3, $4, $5, 'pending')
            RETURNING id
        """, str(issue.get("id")), issue.get("key"), "jira",
            fields.get("summary"), fields.get("description"))

    background_tasks.add_task(analyze_issue, issue_id, {
        "key": issue.get("key"),
        "title": fields.get("summary"),
        "description": fields.get("description")
    })

    return {"status": "accepted", "issue_id": issue_id}
async def analyze_issue(issue_id: int, ticket: dict):
    """Background task to analyze issue with AI"""
    try:
        # Fetch COBOL code from repository
        cobol_files = await fetch_cobol_files()

        # Build prompt for AI
        prompt = build_analysis_prompt(ticket, cobol_files)

        # Call LLM
        analysis = await call_llm(prompt)

        # Parse response
        result = parse_analysis(analysis)

        # Update database
        async with db_pool.acquire() as conn:
            await conn.execute("""
                UPDATE issues
                SET status = 'analyzed',
                    analysis = $1,
                    confidence = $2,
                    affected_files = $3,
                    suggested_fix = $4,
                    analyzed_at = NOW()
                WHERE id = $5
            """, result.get("analysis"), result.get("confidence"),
                json.dumps(result.get("affected_files", [])),
                result.get("suggested_fix"), issue_id)

        # Create branch and PR with the fix
        pr_info = await create_fix_branch_and_pr(ticket, result)

        # Post complete analysis with PR link back to TicketHub
        await post_complete_analysis(ticket, result, pr_info)

    except Exception as e:
        async with db_pool.acquire() as conn:
            await conn.execute("""
                UPDATE issues SET status = 'error', analysis = $1 WHERE id = $2
            """, f"Error: {str(e)}", issue_id)
async def fetch_cobol_files() -> Dict[str, str]:
    """Fetch COBOL source files from Gitea"""
    files = {}
    async with httpx.AsyncClient(timeout=30.0) as client:
        # Get file list
        url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/contents/src/cobol"
        try:
            resp = await client.get(url)
            if resp.status_code == 200:
                for item in resp.json():
                    if item["name"].endswith(".CBL"):
                        # Fetch file content
                        file_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/raw/src/cobol/{item['name']}"
                        file_resp = await client.get(file_url)
                        if file_resp.status_code == 200:
                            files[item["name"]] = file_resp.text
        except Exception:
            # Best-effort: return whatever was fetched before the failure
            pass
    return files
def build_analysis_prompt(ticket: dict, cobol_files: Dict[str, str]) -> str:
    """Build prompt for LLM analysis"""
    files_content = "\n\n".join([
        f"=== {name} ===\n{content}"
        for name, content in cobol_files.items()
    ])

    return f"""You are a COBOL expert analyzing a support case.

## Support Case
**Title:** {ticket.get('title', '')}
**Description:** {ticket.get('description', '')}

## Source Code Files
{files_content}

## Task
1. Identify the root cause of the issue described
2. Find the specific file(s) and line(s) affected
3. Propose a fix with the exact code change needed
4. Estimate your confidence (0-100%)

## Response Format (JSON)
{{
    "root_cause": "Brief explanation of what's causing the issue",
    "affected_files": ["filename.CBL"],
    "affected_lines": "line numbers or section names",
    "suggested_fix": "The exact code change needed (before/after)",
    "confidence": 85,
    "explanation": "Detailed technical explanation"
}}

Respond ONLY with valid JSON."""
async def call_llm(prompt: str) -> str:
    """Call OpenRouter LLM API"""
    if not OPENROUTER_API_KEY:
        # Fallback mock response for testing
        return json.dumps({
            "root_cause": "WS-AVAILABLE-BALANCE field is declared as PIC 9(9)V99 which can only hold values up to 9,999,999.99. The HOST system returns balances in PIC 9(11)V99 format, causing truncation on amounts over $10 million.",
            "affected_files": ["AUTH.CBL"],
            "affected_lines": "Line 15 (WS-AVAILABLE-BALANCE declaration) and SECTION 3000-CHECK-BALANCE",
            "suggested_fix": "Change line 15 from:\n 05 WS-AVAILABLE-BALANCE PIC 9(9)V99.\nTo:\n 05 WS-AVAILABLE-BALANCE PIC 9(11)V99.",
            "confidence": 92,
            "explanation": "The AUTH.CBL program declares WS-AVAILABLE-BALANCE with PIC 9(9)V99, limiting it to 9,999,999.99. When receiving balance data from HOST (which uses PIC 9(11)V99), values above this limit get truncated. For example, a balance of 150,000,000.00 would be truncated to 0,000,000.00, causing false 'insufficient funds' responses. The fix is to align the field size with the HOST response format."
        })

    async with httpx.AsyncClient(timeout=60.0) as client:
        resp = await client.post(
            "https://openrouter.ai/api/v1/chat/completions",
            headers={
                "Authorization": f"Bearer {OPENROUTER_API_KEY}",
                "Content-Type": "application/json"
            },
            json={
                "model": "meta-llama/llama-3.3-70b-instruct:free",
                "messages": [{"role": "user", "content": prompt}],
                "temperature": 0.1
            }
        )
        if resp.status_code == 200:
            return resp.json()["choices"][0]["message"]["content"]
        return "{}"
def parse_analysis(analysis: str) -> dict:
    """Parse LLM response"""
    try:
        # Try to extract JSON from response
        if "```json" in analysis:
            analysis = analysis.split("```json")[1].split("```")[0]
        elif "```" in analysis:
            analysis = analysis.split("```")[1].split("```")[0]

        data = json.loads(analysis.strip())
        return {
            "analysis": data.get("root_cause", "") + "\n\n" + data.get("explanation", ""),
            "confidence": data.get("confidence", 0) / 100.0,
            "affected_files": data.get("affected_files", []),
            "suggested_fix": data.get("suggested_fix", "")
        }
    except Exception:
        # Fall back to the raw response when JSON parsing fails
        return {
            "analysis": analysis,
            "confidence": 0.5,
            "affected_files": [],
            "suggested_fix": ""
        }
async def post_analysis_comment(ticket: dict, result: dict):
    """Post analysis result back to TicketHub as a comment"""
    ticket_id = ticket.get("id")
    if not ticket_id:
        return

    confidence_pct = int(result.get("confidence", 0) * 100)
    files = ", ".join(result.get("affected_files", ["Unknown"]))

    # Plain-text formatting with clear line breaks
    comment = f"""🤖 AI ANALYSIS COMPLETE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

📋 ROOT CAUSE:
{result.get('analysis', 'Unable to determine')}

📁 AFFECTED FILES: {files}

🔧 SUGGESTED FIX:
────────────────────────────────────────
{result.get('suggested_fix', 'No fix suggested')}
────────────────────────────────────────

📊 CONFIDENCE: {confidence_pct}%

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Analyzed by JIRA AI Fixer"""

    async with httpx.AsyncClient(timeout=10.0) as client:
        try:
            await client.post(
                f"https://tickethub.startdata.com.br/api/tickets/{ticket_id}/comments",
                json={"author": "AI Fixer", "content": comment}
            )
        except Exception:
            # Comment posting is best-effort; ignore delivery failures
            pass
# Issues API
@app.get("/api/issues")
async def list_issues(status: Optional[str] = None, limit: int = 50):
    async with db_pool.acquire() as conn:
        if status:
            rows = await conn.fetch(
                "SELECT * FROM issues WHERE status = $1 ORDER BY created_at DESC LIMIT $2",
                status, limit)
        else:
            rows = await conn.fetch(
                "SELECT * FROM issues ORDER BY created_at DESC LIMIT $1", limit)
        return [dict(r) for r in rows]


@app.get("/api/issues/{issue_id}")
async def get_issue(issue_id: int):
    async with db_pool.acquire() as conn:
        row = await conn.fetchrow("SELECT * FROM issues WHERE id = $1", issue_id)
        if not row:
            raise HTTPException(404, "Issue not found")
        return dict(row)


# Dashboard HTML
DASHBOARD_HTML = """<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>JIRA AI Fixer</title>
    <script src="https://cdn.tailwindcss.com"></script>
    <link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&display=swap" rel="stylesheet">
    <style>
        body { font-family: 'Inter', system-ui, sans-serif; }
        .gradient-bg { background: linear-gradient(135deg, #1e3a8a 0%, #7c3aed 100%); }
    </style>
</head>
<body class="bg-gray-900 text-white min-h-screen">
    <!-- Header -->
    <header class="gradient-bg border-b border-white/10">
        <div class="max-w-7xl mx-auto px-6 py-4">
            <div class="flex items-center justify-between">
                <div class="flex items-center gap-3">
                    <span class="text-3xl">🤖</span>
                    <div>
                        <h1 class="text-xl font-bold">JIRA AI Fixer</h1>
                        <p class="text-sm text-blue-200">Intelligent Support Case Resolution</p>
                    </div>
                </div>
                <div class="flex items-center gap-4">
                    <span class="px-3 py-1 bg-green-500/20 text-green-400 rounded-full text-sm" id="status">● Online</span>
                </div>
            </div>
        </div>
    </header>
    <main class="max-w-7xl mx-auto px-6 py-8">
        <!-- Stats Grid -->
        <div class="grid grid-cols-1 md:grid-cols-4 gap-6 mb-8">
            <div class="bg-gray-800 rounded-xl p-6 border border-gray-700">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-gray-400 text-sm">Total Issues</p>
                        <p class="text-3xl font-bold mt-1" id="stat-total">0</p>
                    </div>
                    <div class="w-12 h-12 bg-blue-500/20 rounded-lg flex items-center justify-center">
                        <span class="text-2xl">📋</span>
                    </div>
                </div>
            </div>
            <div class="bg-gray-800 rounded-xl p-6 border border-gray-700">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-gray-400 text-sm">Analyzed</p>
                        <p class="text-3xl font-bold mt-1 text-green-400" id="stat-analyzed">0</p>
                    </div>
                    <div class="w-12 h-12 bg-green-500/20 rounded-lg flex items-center justify-center">
                        <span class="text-2xl">✅</span>
                    </div>
                </div>
            </div>
            <div class="bg-gray-800 rounded-xl p-6 border border-gray-700">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-gray-400 text-sm">PRs Created</p>
                        <p class="text-3xl font-bold mt-1 text-purple-400" id="stat-prs">0</p>
                    </div>
                    <div class="w-12 h-12 bg-purple-500/20 rounded-lg flex items-center justify-center">
                        <span class="text-2xl">🔀</span>
                    </div>
                </div>
            </div>
            <div class="bg-gray-800 rounded-xl p-6 border border-gray-700">
                <div class="flex items-center justify-between">
                    <div>
                        <p class="text-gray-400 text-sm">Avg Confidence</p>
                        <p class="text-3xl font-bold mt-1 text-yellow-400" id="stat-confidence">0%</p>
                    </div>
                    <div class="w-12 h-12 bg-yellow-500/20 rounded-lg flex items-center justify-center">
                        <span class="text-2xl">🎯</span>
                    </div>
                </div>
            </div>
        </div>
        <!-- Main Content Grid -->
        <div class="grid grid-cols-1 lg:grid-cols-3 gap-6">
            <!-- Issues List -->
            <div class="lg:col-span-2 bg-gray-800 rounded-xl border border-gray-700">
                <div class="p-4 border-b border-gray-700 flex items-center justify-between">
                    <h2 class="font-semibold">Recent Issues</h2>
                    <select id="filter-status" onchange="loadIssues()" class="bg-gray-700 border border-gray-600 rounded-lg px-3 py-1 text-sm">
                        <option value="">All Status</option>
                        <option value="pending">Pending</option>
                        <option value="analyzed">Analyzed</option>
                        <option value="error">Error</option>
                    </select>
                </div>
                <div id="issues-list" class="divide-y divide-gray-700 max-h-[600px] overflow-y-auto">
                    <div class="p-8 text-center text-gray-500">Loading...</div>
                </div>
            </div>

            <!-- Sidebar -->
            <div class="space-y-6">
                <!-- Integrations -->
                <div class="bg-gray-800 rounded-xl border border-gray-700 p-4">
                    <h3 class="font-semibold mb-4">Integrations</h3>
                    <div class="space-y-3">
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-2">
                                <span>🎫</span>
                                <span class="text-sm">TicketHub</span>
                            </div>
                            <span class="text-xs px-2 py-1 bg-green-500/20 text-green-400 rounded">Active</span>
                        </div>
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-2">
                                <span>📦</span>
                                <span class="text-sm">Gitea</span>
                            </div>
                            <span class="text-xs px-2 py-1 bg-green-500/20 text-green-400 rounded">Active</span>
                        </div>
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-2">
                                <span>🔵</span>
                                <span class="text-sm">JIRA</span>
                            </div>
                            <span class="text-xs px-2 py-1 bg-gray-500/20 text-gray-400 rounded">Ready</span>
                        </div>
                        <div class="flex items-center justify-between">
                            <div class="flex items-center gap-2">
                                <span>⚙️</span>
                                <span class="text-sm">ServiceNow</span>
                            </div>
                            <span class="text-xs px-2 py-1 bg-gray-500/20 text-gray-400 rounded">Ready</span>
                        </div>
                    </div>
                </div>

                <!-- Connected Repos -->
                <div class="bg-gray-800 rounded-xl border border-gray-700 p-4">
                    <h3 class="font-semibold mb-4">Repositories</h3>
                    <div class="space-y-3" id="repos-list">
                        <div class="flex items-center gap-2 p-2 bg-gray-700/50 rounded-lg">
                            <span>📁</span>
                            <div>
                                <p class="text-sm font-medium">cobol-sample-app</p>
                                <p class="text-xs text-gray-400">4 COBOL files indexed</p>
                            </div>
                        </div>
                    </div>
                </div>

                <!-- Quick Actions -->
                <div class="bg-gray-800 rounded-xl border border-gray-700 p-4">
                    <h3 class="font-semibold mb-4">Webhook Endpoints</h3>
                    <div class="space-y-2 text-xs">
                        <div class="p-2 bg-gray-700/50 rounded font-mono break-all">
                            POST /api/webhook/tickethub
                        </div>
                        <div class="p-2 bg-gray-700/50 rounded font-mono break-all">
                            POST /api/webhook/jira
                        </div>
                    </div>
                </div>
            </div>
        </div>
    </main>
    <!-- Issue Detail Modal -->
    <div id="issue-modal" class="fixed inset-0 bg-black/70 hidden items-center justify-center z-50 p-4">
        <div class="bg-gray-800 rounded-xl w-full max-w-2xl max-h-[90vh] overflow-hidden border border-gray-700">
            <div class="p-4 border-b border-gray-700 flex justify-between items-center">
                <div>
                    <span class="font-mono text-blue-400" id="modal-key"></span>
                    <span class="ml-2 px-2 py-1 rounded text-xs" id="modal-status"></span>
                </div>
                <button onclick="hideModal()" class="text-gray-400 hover:text-white">
                    <svg class="w-5 h-5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M6 18L18 6M6 6l12 12"></path></svg>
                </button>
            </div>
            <div class="p-6 overflow-y-auto max-h-[70vh]" id="modal-content"></div>
        </div>
    </div>
    <script>
        loadIssues();
        setInterval(loadIssues, 10000);

        async function loadIssues() {
            const filter = document.getElementById('filter-status').value;
            const url = filter ? '/api/issues?status=' + filter : '/api/issues';

            try {
                const r = await fetch(url);
                const issues = await r.json();

                // Update stats
                document.getElementById('stat-total').textContent = issues.length;
                document.getElementById('stat-analyzed').textContent = issues.filter(i => i.status === 'analyzed').length;

                // Count PRs (issues with suggested_fix that aren't errors)
                const prs = issues.filter(i => i.status === 'analyzed' && i.suggested_fix).length;
                document.getElementById('stat-prs').textContent = prs;

                // Avg confidence
                const analyzed = issues.filter(i => i.confidence);
                const avgConf = analyzed.length ? Math.round(analyzed.reduce((a, i) => a + (i.confidence || 0), 0) / analyzed.length * 100) : 0;
                document.getElementById('stat-confidence').textContent = avgConf + '%';

                // Render list
                const list = document.getElementById('issues-list');
                if (!issues.length) {
                    list.innerHTML = '<div class="p-8 text-center text-gray-500">No issues found</div>';
                    return;
                }

                list.innerHTML = issues.map(i => `
                    <div onclick="showIssue(${i.id})" class="p-4 hover:bg-gray-700/50 cursor-pointer">
                        <div class="flex justify-between items-start">
                            <div class="flex-1">
                                <div class="flex items-center gap-2">
                                    <span class="font-mono text-blue-400 text-sm">${i.external_key || '#' + i.id}</span>
                                    <span class="text-xs px-2 py-0.5 rounded ${getStatusClass(i.status)}">${i.status}</span>
                                </div>
                                <h4 class="font-medium mt-1">${i.title}</h4>
                                ${i.confidence ? `<div class="mt-2 flex items-center gap-2">
                                    <div class="flex-1 bg-gray-700 rounded-full h-2">
                                        <div class="bg-green-500 h-2 rounded-full" style="width: ${Math.round(i.confidence * 100)}%"></div>
                                    </div>
                                    <span class="text-xs text-gray-400">${Math.round(i.confidence * 100)}%</span>
                                </div>` : ''}
                            </div>
                        </div>
                    </div>
                `).join('');
            } catch (e) {
                console.error(e);
            }
        }

        function getStatusClass(status) {
            switch(status) {
                case 'analyzed': return 'bg-green-500/20 text-green-400';
                case 'pending': return 'bg-yellow-500/20 text-yellow-400';
                case 'error': return 'bg-red-500/20 text-red-400';
                default: return 'bg-gray-500/20 text-gray-400';
            }
        }

        async function showIssue(id) {
            const r = await fetch('/api/issues/' + id);
            const issue = await r.json();

            document.getElementById('modal-key').textContent = issue.external_key || '#' + issue.id;
            document.getElementById('modal-status').textContent = issue.status;
            document.getElementById('modal-status').className = 'ml-2 px-2 py-1 rounded text-xs ' + getStatusClass(issue.status);

            let affectedFiles = [];
            try {
                affectedFiles = JSON.parse(issue.affected_files || '[]');
            } catch(e) {}

            document.getElementById('modal-content').innerHTML = `
                <h3 class="text-lg font-semibold">${issue.title}</h3>
                <p class="text-gray-400 text-sm mt-1">Source: ${issue.source}</p>

                <div class="mt-4 p-4 bg-gray-700/50 rounded-lg">
                    <h4 class="text-sm font-medium text-gray-300 mb-2">Description</h4>
                    <pre class="whitespace-pre-wrap text-sm">${issue.description || 'N/A'}</pre>
                </div>

                ${issue.analysis ? `
                <div class="mt-4 p-4 bg-green-500/10 border border-green-500/30 rounded-lg">
                    <h4 class="text-sm font-medium text-green-400 mb-2">🔍 Analysis</h4>
                    <pre class="whitespace-pre-wrap text-sm">${issue.analysis}</pre>
                </div>
                ` : ''}

                ${affectedFiles.length ? `
                <div class="mt-4">
                    <h4 class="text-sm font-medium text-gray-300 mb-2">📁 Affected Files</h4>
                    <div class="flex flex-wrap gap-2">
                        ${affectedFiles.map(f => `<span class="px-2 py-1 bg-gray-700 rounded text-sm font-mono">${f}</span>`).join('')}
                    </div>
                </div>
                ` : ''}

                ${issue.suggested_fix ? `
                <div class="mt-4 p-4 bg-purple-500/10 border border-purple-500/30 rounded-lg">
                    <h4 class="text-sm font-medium text-purple-400 mb-2">🔧 Suggested Fix</h4>
                    <pre class="whitespace-pre-wrap text-sm font-mono bg-gray-900 p-3 rounded">${issue.suggested_fix}</pre>
                </div>
                ` : ''}

                ${issue.confidence ? `
                <div class="mt-4 flex items-center gap-3">
                    <span class="text-sm text-gray-400">Confidence:</span>
                    <div class="flex-1 bg-gray-700 rounded-full h-3">
                        <div class="bg-green-500 h-3 rounded-full" style="width: ${Math.round(issue.confidence * 100)}%"></div>
                    </div>
                    <span class="font-bold text-green-400">${Math.round(issue.confidence * 100)}%</span>
                </div>
                ` : ''}

                <div class="mt-4 text-xs text-gray-500">
                    Created: ${new Date(issue.created_at).toLocaleString()}
                    ${issue.analyzed_at ? `<br>Analyzed: ${new Date(issue.analyzed_at).toLocaleString()}` : ''}
                </div>
            `;

            document.getElementById('issue-modal').classList.remove('hidden');
            document.getElementById('issue-modal').classList.add('flex');
        }

        function hideModal() {
            document.getElementById('issue-modal').classList.add('hidden');
            document.getElementById('issue-modal').classList.remove('flex');
        }
    </script>
</body>
</html>"""
@app.get("/", response_class=HTMLResponse)
async def dashboard():
    return DASHBOARD_HTML


@app.get("/dashboard", response_class=HTMLResponse)
async def dashboard_alt():
    return DASHBOARD_HTML


# ============================================
# GIT INTEGRATION - Create Branch and PR
# ============================================

GITEA_TOKEN = os.getenv("GITEA_TOKEN", "")  # Gitea access token
async def create_fix_branch_and_pr(ticket: dict, result: dict):
|
|
||||||
"""Create a branch with the fix and open a Pull Request"""
|
|
||||||
ticket_key = ticket.get("key", "unknown")
|
|
||||||
ticket_id = ticket.get("id")
|
|
||||||
|
|
||||||
if not result.get("affected_files") or not result.get("suggested_fix"):
|
|
||||||
return None
|
|
||||||
|
|
||||||
# Parse affected file
|
|
||||||
affected_files = result.get("affected_files", [])
|
|
||||||
if isinstance(affected_files, str):
|
|
||||||
import json as json_lib
|
|
||||||
try:
|
|
||||||
affected_files = json_lib.loads(affected_files)
|
|
||||||
except:
|
|
||||||
affected_files = [affected_files]
|
|
||||||
|
|
||||||
if not affected_files:
|
|
||||||
return None
|
|
||||||
|
|
||||||
main_file = affected_files[0] # e.g., "AUTH.CBL"
|
|
||||||
branch_name = f"fix/{ticket_key.lower()}-auto-fix"
|
|
||||||
|
|
||||||
async with httpx.AsyncClient(timeout=30.0) as client:
|
|
||||||
headers = {}
|
|
||||||
if GITEA_TOKEN:
|
|
||||||
headers["Authorization"] = f"token {GITEA_TOKEN}"
|
|
||||||
|
|
||||||
try:
|
|
||||||
# 1. Get the current file content and SHA
|
|
||||||
file_path = f"src/cobol/{main_file}"
|
|
||||||
file_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/contents/{file_path}"
|
|
||||||
|
|
||||||
resp = await client.get(file_url, headers=headers)
|
|
||||||
if resp.status_code != 200:
|
|
||||||
return {"error": f"File not found: {file_path}"}
|
|
||||||
|
|
||||||
file_data = resp.json()
|
|
||||||
current_content = file_data.get("content", "")
|
|
||||||
file_sha = file_data.get("sha", "")
|
|
||||||
|
|
||||||
# Decode base64 content
|
|
||||||
import base64
|
|
||||||
try:
|
|
||||||
original_code = base64.b64decode(current_content).decode('utf-8')
|
|
||||||
except:
|
|
||||||
return {"error": "Failed to decode file content"}
|
|
||||||
|
|
||||||
# 2. Apply the fix (simple replacement for now)
|
|
||||||
# The fix suggests changing PIC 9(9)V99 to PIC 9(11)V99
|
|
||||||
fixed_code = original_code.replace(
|
|
||||||
"PIC 9(9)V99",
|
|
||||||
"PIC 9(11)V99"
|
|
||||||
)
|
|
||||||
|
|
||||||
if fixed_code == original_code:
|
|
||||||
return {"error": "Could not apply fix automatically"}
|
|
||||||
|
|
||||||
# 3. Get default branch SHA for creating new branch
|
|
||||||
repo_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}"
|
|
||||||
repo_resp = await client.get(repo_url, headers=headers)
|
|
||||||
default_branch = repo_resp.json().get("default_branch", "main")
|
|
||||||
|
|
||||||
# Get the SHA of default branch
|
|
||||||
branch_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/branches/{default_branch}"
|
|
||||||
branch_resp = await client.get(branch_url, headers=headers)
|
|
||||||
base_sha = branch_resp.json().get("commit", {}).get("sha", "")
|
|
||||||
|
|
||||||
# 4. Create new branch
|
|
||||||
create_branch_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/branches"
|
|
||||||
branch_data = {
|
|
||||||
"new_branch_name": branch_name,
|
|
||||||
"old_ref_name": default_branch
|
|
||||||
}
|
|
||||||
|
|
||||||
branch_create_resp = await client.post(
|
|
||||||
create_branch_url,
|
|
||||||
headers={**headers, "Content-Type": "application/json"},
|
|
||||||
json=branch_data
|
|
||||||
)
|
|
||||||
|
|
||||||
if branch_create_resp.status_code not in [201, 200, 409]: # 409 = already exists
|
|
||||||
return {"error": f"Failed to create branch: {branch_create_resp.text}"}
|
|
||||||
|
|
||||||
        # 5. Update the file in the new branch
        update_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/contents/{file_path}"
        update_data = {
            "message": f"fix({ticket_key}): {ticket.get('title', 'Auto-fix')}\n\nAutomatically generated fix by JIRA AI Fixer.\nConfidence: {int(result.get('confidence', 0) * 100)}%",
            "content": base64.b64encode(fixed_code.encode()).decode(),
            "sha": file_sha,
            "branch": branch_name
        }

        update_resp = await client.put(
            update_url,
            headers={**headers, "Content-Type": "application/json"},
            json=update_data
        )

        if update_resp.status_code not in [200, 201]:
            return {"error": f"Failed to update file: {update_resp.text}"}
        # 6. Create the Pull Request
        pr_url = f"{GITEA_URL}/api/v1/repos/{COBOL_REPO}/pulls"
        pr_data = {
            "title": f"[{ticket_key}] {ticket.get('title', 'Auto-fix')}",
            "body": f"""## 🤖 Automated Fix

**Ticket:** {ticket_key}
**Issue:** {ticket.get('title', '')}

### Root Cause Analysis
{result.get('analysis', 'N/A')}

### Changes Made
- **File:** `{file_path}`
- **Fix:** {result.get('suggested_fix', 'N/A')}

### Confidence
{int(result.get('confidence', 0) * 100)}%

---
_This PR was automatically generated by JIRA AI Fixer_
""",
            "head": branch_name,
            "base": default_branch
        }

        pr_resp = await client.post(
            pr_url,
            headers={**headers, "Content-Type": "application/json"},
            json=pr_data
        )

        if pr_resp.status_code in [200, 201]:
            pr_info = pr_resp.json()
            return {
                "success": True,
                "branch": branch_name,
                "pr_number": pr_info.get("number"),
                "pr_url": pr_info.get("html_url", f"{GITEA_URL}/{COBOL_REPO}/pulls/{pr_info.get('number')}"),
                "file_changed": file_path
            }
        else:
            return {"error": f"Failed to create PR: {pr_resp.text}"}

    except Exception as e:
        return {"error": str(e)}

async def post_complete_analysis(ticket: dict, result: dict, pr_info: dict = None):
    """Post the complete analysis, with the PR link, back to TicketHub."""
    ticket_id = ticket.get("id")
    if not ticket_id:
        return

    confidence_pct = int(result.get("confidence", 0) * 100)
    files = ", ".join(result.get("affected_files", ["Unknown"]))

    # Build the PR section of the comment
    pr_section = ""
    if pr_info and pr_info.get("success"):
        pr_section = f"""
🔀 PULL REQUEST CREATED:
────────────────────────────────────────
Branch: {pr_info.get('branch')}
PR: #{pr_info.get('pr_number')}
URL: {pr_info.get('pr_url')}
────────────────────────────────────────
"""
    elif pr_info and pr_info.get("error"):
        pr_section = f"""
⚠️ AUTO-FIX FAILED:
{pr_info.get('error')}
"""

    comment = f"""🤖 AI ANALYSIS COMPLETE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

📋 ROOT CAUSE:
{result.get('analysis', 'Unable to determine')}

📁 AFFECTED FILES: {files}

🔧 SUGGESTED FIX:
────────────────────────────────────────
{result.get('suggested_fix', 'No fix suggested')}
────────────────────────────────────────
{pr_section}
📊 CONFIDENCE: {confidence_pct}%

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Analyzed by JIRA AI Fixer"""

    async with httpx.AsyncClient(timeout=10.0) as client:
        try:
            await client.post(
                f"https://tickethub.startdata.com.br/api/tickets/{ticket_id}/comments",
                json={"author": "AI Fixer", "content": comment}
            )
        except Exception:
            # Comment posting is best-effort; never let it break the pipeline
            pass
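The fix-and-commit flow above hinges on the Gitea contents API returning and accepting base64-encoded file bodies: GET yields base64 content plus a blob `sha`, the fixer edits the decoded text, and PUT takes the re-encoded result. A self-contained sketch of that round-trip (the COBOL line is illustrative, not from the indexed repos):

```python
import base64

# A GET on the /contents endpoint returns the file base64-encoded.
original = "01 WS-AMOUNT PIC 9(9)V99.\n"
encoded = base64.b64encode(original.encode()).decode()

# Decode, apply the textual fix, then re-encode for the PUT body.
decoded = base64.b64decode(encoded).decode("utf-8")
fixed = decoded.replace("PIC 9(9)V99", "PIC 9(11)V99")
put_content = base64.b64encode(fixed.encode()).decode()

print(fixed.strip())  # 01 WS-AMOUNT PIC 9(11)V99.
```

The PUT body would carry `put_content` plus the `sha` obtained from the GET, which is how the API detects concurrent edits.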
api/main_v3.py (1167 lines)
File diff suppressed because it is too large
@@ -1,5 +0,0 @@
fastapi==0.109.0
uvicorn==0.27.0
asyncpg==0.29.0
httpx==0.26.0
pydantic==2.5.3
@@ -1,4 +0,0 @@
"""API Routers package."""
from . import webhook, issues, config

__all__ = ["webhook", "issues", "config"]
@@ -1,134 +0,0 @@
"""
Configuration management API.
"""
from fastapi import APIRouter, HTTPException
from pydantic import BaseModel
from typing import Optional, List, Dict, Any
import logging

logger = logging.getLogger(__name__)
router = APIRouter()


class IntegrationConfig(BaseModel):
    jira_url: Optional[str] = None
    jira_token: Optional[str] = None
    jira_projects: List[str] = []
    bitbucket_url: Optional[str] = None
    bitbucket_token: Optional[str] = None
    llm_provider: str = "openrouter"  # openrouter | azure
    azure_endpoint: Optional[str] = None
    azure_key: Optional[str] = None
    azure_model: str = "gpt-4o"
    openrouter_key: Optional[str] = None
    openrouter_model: str = "meta-llama/llama-3.3-70b-instruct:free"
    embedding_provider: str = "local"  # local | azure


class RepositoryConfig(BaseModel):
    url: str
    name: str
    ai_fork_name: Optional[str] = None
    indexed: bool = False
    last_sync: Optional[str] = None
    file_count: int = 0


class ModuleConfig(BaseModel):
    name: str
    description: Optional[str] = None
    program_patterns: List[str] = []
    keywords: List[str] = []
    rules: List[str] = []
    restrictions: List[str] = []


class SystemConfig(BaseModel):
    integrations: IntegrationConfig
    repositories: List[RepositoryConfig] = []
    modules: List[ModuleConfig] = []


@router.get("/integrations", response_model=IntegrationConfig)
async def get_integrations():
    """Get integration configuration (tokens masked)."""
    # TODO: Load from database
    return IntegrationConfig()


@router.put("/integrations")
async def update_integrations(config: IntegrationConfig):
    """Update integration configuration."""
    logger.info("💾 Updating integration config")
    # TODO: Save to database
    return {"status": "updated"}


@router.post("/integrations/test/{service}")
async def test_integration(service: str):
    """Test connection to a service (jira, bitbucket, llm, embeddings)."""
    logger.info(f"🔌 Testing connection: {service}")
    # TODO: Implement connection tests
    return {"status": "ok", "service": service, "connected": True}


@router.get("/repositories", response_model=List[RepositoryConfig])
async def list_repositories():
    """List configured repositories."""
    # TODO: Load from database
    return []


@router.post("/repositories")
async def add_repository(repo: RepositoryConfig):
    """Add a new repository for indexing."""
    logger.info(f"📦 Adding repository: {repo.url}")
    # TODO: Save and trigger indexing
    return {"status": "added", "repository": repo.name}


@router.delete("/repositories/{repo_name}")
async def remove_repository(repo_name: str):
    """Remove a repository."""
    logger.info(f"🗑️ Removing repository: {repo_name}")
    # TODO: Remove from database and vector store
    return {"status": "removed"}


@router.post("/repositories/{repo_name}/reindex")
async def reindex_repository(repo_name: str):
    """Trigger re-indexing of a repository."""
    logger.info(f"🔄 Re-indexing repository: {repo_name}")
    # TODO: Queue re-indexing job
    return {"status": "queued"}


@router.get("/modules", response_model=List[ModuleConfig])
async def list_modules():
    """List business rule modules."""
    # TODO: Load from database
    return []


@router.post("/modules")
async def add_module(module: ModuleConfig):
    """Add a new business rule module."""
    logger.info(f"🧠 Adding module: {module.name}")
    # TODO: Save to database
    return {"status": "added", "module": module.name}


@router.put("/modules/{module_name}")
async def update_module(module_name: str, module: ModuleConfig):
    """Update a business rule module."""
    logger.info(f"💾 Updating module: {module_name}")
    # TODO: Update in database
    return {"status": "updated"}


@router.delete("/modules/{module_name}")
async def delete_module(module_name: str):
    """Delete a business rule module."""
    logger.info(f"🗑️ Deleting module: {module_name}")
    # TODO: Remove from database
    return {"status": "deleted"}
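`get_integrations` promises "tokens masked" output but is still a stub. A hypothetical helper it might use to hide secrets before returning them — the name and behavior here are assumptions for illustration, not from the repo:

```python
def mask_token(token, keep: int = 4):
    """Hypothetical masking helper: hide all but the last `keep` characters."""
    if not token:
        return token  # None or "" pass through unchanged
    if len(token) <= keep:
        return "*" * len(token)  # too short to reveal anything safely
    return "*" * (len(token) - keep) + token[-keep:]

print(mask_token("abcd1234efgh"))  # ********efgh
print(mask_token("ab"))            # **
print(mask_token(None))            # None
```

Keeping a short visible suffix lets operators confirm which token is configured without the endpoint ever leaking the full secret.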
@@ -1,94 +0,0 @@
"""
Issue management API.
"""
from fastapi import APIRouter, HTTPException
from pydantic import BaseModel
from typing import Optional, List
from enum import Enum
from datetime import datetime
import logging

logger = logging.getLogger(__name__)
router = APIRouter()


class IssueStatus(str, Enum):
    PENDING = "pending"
    ANALYZING = "analyzing"
    ANALYZED = "analyzed"
    FIX_GENERATED = "fix_generated"
    PR_CREATED = "pr_created"
    ACCEPTED = "accepted"
    REJECTED = "rejected"
    FAILED = "failed"


class AnalyzedIssue(BaseModel):
    id: str
    jira_key: str
    title: str
    status: IssueStatus
    module: Optional[str] = None
    confidence: Optional[float] = None
    analysis_time_ms: Optional[int] = None
    affected_files: List[str] = []
    root_cause: Optional[str] = None
    proposed_fix: Optional[str] = None
    pr_url: Optional[str] = None
    created_at: datetime
    updated_at: datetime


class IssueListResponse(BaseModel):
    total: int
    items: List[AnalyzedIssue]


@router.get("/", response_model=IssueListResponse)
async def list_issues(
    status: Optional[IssueStatus] = None,
    module: Optional[str] = None,
    limit: int = 20,
    offset: int = 0,
):
    """List analyzed issues with optional filters."""
    # TODO: Implement database query
    return IssueListResponse(total=0, items=[])


@router.get("/{issue_id}", response_model=AnalyzedIssue)
async def get_issue(issue_id: str):
    """Get details of a specific analyzed issue."""
    # TODO: Implement database query
    raise HTTPException(status_code=404, detail="Issue not found")


@router.post("/{issue_id}/reanalyze")
async def reanalyze_issue(issue_id: str):
    """Trigger re-analysis of an issue."""
    logger.info(f"🔄 Re-analyzing issue: {issue_id}")
    # TODO: Queue for re-analysis
    return {"status": "queued", "issue_id": issue_id}


@router.get("/stats/summary")
async def get_stats():
    """Get summary statistics for the dashboard."""
    # TODO: Implement stats calculation
    return {
        "total_issues": 0,
        "pending": 0,
        "analyzed": 0,
        "accepted": 0,
        "rejected": 0,
        "success_rate": 0.0,
        "avg_analysis_time_ms": 0,
    }
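The `str` mixin on `IssueStatus` is what lets FastAPI parse the enum straight from a query parameter and serialize it back as a plain string. A quick standalone demonstration with a few of the members:

```python
from enum import Enum

class IssueStatus(str, Enum):
    PENDING = "pending"
    ANALYZING = "analyzing"
    ANALYZED = "analyzed"
    FAILED = "failed"

# Query params arrive as plain strings; the value-based constructor parses them.
status = IssueStatus("analyzing")
print(status is IssueStatus.ANALYZING)  # True
print(status.value)                     # analyzing
print(IssueStatus.FAILED == "failed")   # True — the str mixin compares by value
```

An unknown value such as `IssueStatus("bogus")` raises `ValueError`, which FastAPI surfaces as a 422 validation error on the `status` filter.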
@@ -1,79 +0,0 @@
"""
Webhook handlers for JIRA and Bitbucket events.
"""
from fastapi import APIRouter, Request, HTTPException, Header
from typing import Optional
import hmac
import hashlib
import logging

logger = logging.getLogger(__name__)
router = APIRouter()


def verify_jira_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Verify a JIRA webhook signature (HMAC-SHA256)."""
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


@router.post("/jira")
async def jira_webhook(
    request: Request,
    x_atlassian_webhook_identifier: Optional[str] = Header(None),
):
    """
    Handle JIRA webhook events.

    Events processed:
    - jira:issue_created
    - jira:issue_updated
    """
    body = await request.body()
    data = await request.json()

    event_type = data.get("webhookEvent", "unknown")
    issue = data.get("issue", {})
    issue_key = issue.get("key", "unknown")

    logger.info(f"📥 JIRA webhook: {event_type} - {issue_key}")

    # Filter: only process Support Cases
    issue_type = issue.get("fields", {}).get("issuetype", {}).get("name", "")
    if issue_type != "Support Case":
        logger.info(f"⏭️ Skipping non-Support Case issue: {issue_key} ({issue_type})")
        return {"status": "skipped", "reason": "not a Support Case"}

    # Queue for analysis
    # TODO: Implement queue system
    logger.info(f"📋 Queuing Support Case for analysis: {issue_key}")

    return {
        "status": "accepted",
        "issue": issue_key,
        "event": event_type,
    }


@router.post("/bitbucket")
async def bitbucket_webhook(request: Request):
    """
    Handle Bitbucket webhook events.

    Events processed:
    - repo:refs_changed (push)
    - pr:merged
    """
    data = await request.json()
    event_type = data.get("eventKey", "unknown")

    logger.info(f"📥 Bitbucket webhook: {event_type}")

    if event_type == "repo:refs_changed":
        # Re-index affected files
        changes = data.get("changes", [])
        for change in changes:
            ref = change.get("ref", {}).get("displayId", "")
            logger.info(f"🔄 Branch updated: {ref}")

    return {"status": "accepted", "event": event_type}
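Note that `jira_webhook` reads the raw `body` but never actually calls `verify_jira_signature`, so wiring the check up appears to still be a TODO. A sketch of how the verification itself behaves, with a made-up secret and payload:

```python
import hashlib
import hmac

def verify_jira_signature(payload: bytes, signature: str, secret: str) -> bool:
    """HMAC-SHA256 check, compared in constant time to resist timing attacks."""
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

secret = "webhook-secret"  # would come from JIRA_WEBHOOK_SECRET in .env
payload = b'{"webhookEvent": "jira:issue_created"}'
good_sig = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()

print(verify_jira_signature(payload, good_sig, secret))   # True
print(verify_jira_signature(payload, "00" * 32, secret))  # False
```

In the handler this would presumably guard the parse: read the signature header, call the function with `body`, and return 401 on mismatch before touching the JSON.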
@@ -1,7 +0,0 @@
"""Services package."""
from .jira import JiraClient
from .bitbucket import BitbucketClient
from .llm import LLMService
from .embeddings import EmbeddingsService

__all__ = ["JiraClient", "BitbucketClient", "LLMService", "EmbeddingsService"]
@@ -1,188 +0,0 @@
"""
Bitbucket Service - Client for Bitbucket Server API.
"""
from typing import Optional, Dict, Any, List
import httpx
import logging

logger = logging.getLogger(__name__)


class BitbucketClient:
    """Bitbucket Server REST API client."""

    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        }

    async def get_file_content(
        self,
        project: str,
        repo: str,
        file_path: str,
        ref: str = "main",
    ) -> str:
        """Get raw file content from a repository."""
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/raw/{file_path}",
                headers=self.headers,
                params={"at": ref},
            )
            response.raise_for_status()
            return response.text

    async def list_files(
        self,
        project: str,
        repo: str,
        path: str = "",
        ref: str = "main",
    ) -> List[Dict[str, Any]]:
        """List files in a directory."""
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/files/{path}",
                headers=self.headers,
                params={"at": ref},
            )
            response.raise_for_status()
            return response.json().get("values", [])

    async def create_branch(
        self,
        project: str,
        repo: str,
        branch_name: str,
        start_point: str = "main",
    ) -> Dict[str, Any]:
        """Create a new branch."""
        async with httpx.AsyncClient() as client:
            response = await client.post(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/branches",
                headers=self.headers,
                json={
                    "name": branch_name,
                    "startPoint": f"refs/heads/{start_point}",
                },
            )
            response.raise_for_status()
            return response.json()

    async def commit_file(
        self,
        project: str,
        repo: str,
        branch: str,
        file_path: str,
        content: str,
        message: str,
    ) -> Dict[str, Any]:
        """Commit a file change to a branch."""
        # Get the current commit for the branch
        async with httpx.AsyncClient() as client:
            # First, get the latest commit on the branch
            branch_response = await client.get(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/branches",
                headers=self.headers,
                params={"filterText": branch},
            )
            branch_response.raise_for_status()
            branches = branch_response.json().get("values", [])

            if not branches:
                raise ValueError(f"Branch not found: {branch}")

            latest_commit = branches[0].get("latestCommit")

            # Use the file edit API
            response = await client.put(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/browse/{file_path}",
                headers=self.headers,
                json={
                    "content": content,
                    "message": message,
                    "branch": branch,
                    "sourceCommitId": latest_commit,
                },
            )
            response.raise_for_status()
            return response.json()

    async def create_pull_request(
        self,
        project: str,
        repo: str,
        title: str,
        description: str,
        source_branch: str,
        target_branch: str = "main",
        target_project: Optional[str] = None,
        target_repo: Optional[str] = None,
    ) -> Dict[str, Any]:
        """Create a pull request."""
        target_project = target_project or project
        target_repo = target_repo or repo

        async with httpx.AsyncClient() as client:
            response = await client.post(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos/{repo}/pull-requests",
                headers=self.headers,
                json={
                    "title": title,
                    "description": description,
                    "fromRef": {
                        "id": f"refs/heads/{source_branch}",
                        "repository": {
                            "slug": repo,
                            "project": {"key": project},
                        },
                    },
                    "toRef": {
                        "id": f"refs/heads/{target_branch}",
                        "repository": {
                            "slug": target_repo,
                            "project": {"key": target_project},
                        },
                    },
                },
            )
            response.raise_for_status()
            return response.json()

    async def get_repositories(self, project: str) -> List[Dict[str, Any]]:
        """List repositories in a project."""
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/1.0/projects/{project}/repos",
                headers=self.headers,
            )
            response.raise_for_status()
            return response.json().get("values", [])

    async def search_code(
        self,
        project: str,
        repo: str,
        query: str,
        ref: str = "main",
    ) -> List[Dict[str, Any]]:
        """Search for code in a repository."""
        # Bitbucket Server code search API
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/search/1.0/search",
                headers=self.headers,
                params={
                    "query": query,
                    "entities": "code",
                    "projectKey": project,
                    "repositorySlug": repo,
                },
            )
            if response.status_code == 200:
                return response.json().get("values", [])
            return []
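`create_pull_request` builds Bitbucket Server's nested `fromRef`/`toRef` body inline. Pulled out as a pure function, the same shape is easy to inspect and unit-test; the project and repo names below are made up for illustration:

```python
from typing import Optional

def pr_payload(project: str, repo: str, title: str, description: str,
               source_branch: str, target_branch: str = "main",
               target_project: Optional[str] = None,
               target_repo: Optional[str] = None) -> dict:
    """Build the fromRef/toRef body that create_pull_request sends."""
    return {
        "title": title,
        "description": description,
        "fromRef": {
            "id": f"refs/heads/{source_branch}",
            "repository": {"slug": repo, "project": {"key": project}},
        },
        "toRef": {
            "id": f"refs/heads/{target_branch}",
            # Defaulting to the source repo covers the same-repo PR case;
            # overrides support the AI-fork workflow.
            "repository": {
                "slug": target_repo or repo,
                "project": {"key": target_project or project},
            },
        },
    }

body = pr_payload("COB", "core-batch", "Fix overflow", "Widen the PIC clause", "fix/TKT-1")
print(body["fromRef"]["id"])                # refs/heads/fix/TKT-1
print(body["toRef"]["repository"]["slug"])  # core-batch
```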
@ -1,300 +0,0 @@
|
||||||
"""
|
|
||||||
Embeddings Service - Code indexing with vector embeddings.
|
|
||||||
"""
|
|
||||||
from typing import Optional, Dict, Any, List, Tuple
|
|
||||||
import httpx
|
|
||||||
import numpy as np
|
|
||||||
import logging
|
|
||||||
import re
|
|
||||||
from dataclasses import dataclass
|
|
||||||
|
|
||||||
logger = logging.getLogger(__name__)
|
|
||||||
|
|
||||||
|
|
||||||
@dataclass
|
|
||||||
class CodeChunk:
|
|
||||||
"""A chunk of indexed code."""
|
|
||||||
file_path: str
|
|
||||||
content: str
|
|
||||||
start_line: int
|
|
||||||
end_line: int
|
|
||||||
chunk_type: str # program, section, paragraph, copybook
|
|
||||||
metadata: Dict[str, Any]
|
|
||||||
|
|
||||||
|
|
||||||
class EmbeddingsService:
|
|
||||||
"""
|
|
||||||
Service for generating and managing code embeddings.
|
|
||||||
|
|
||||||
Supports:
|
|
||||||
- Local MiniLM-L6-v2 (development)
|
|
||||||
- Azure OpenAI embeddings (production)
|
|
||||||
"""
|
|
||||||
|
|
||||||
def __init__(
|
|
||||||
self,
|
|
||||||
provider: str = "local",
|
|
||||||
azure_endpoint: Optional[str] = None,
|
|
||||||
azure_key: Optional[str] = None,
|
|
||||||
azure_model: str = "text-embedding-3-large",
|
|
||||||
qdrant_url: str = "http://localhost:6333",
|
|
||||||
):
|
|
||||||
self.provider = provider
|
|
||||||
self.azure_endpoint = azure_endpoint
|
|
||||||
self.azure_key = azure_key
|
|
||||||
self.azure_model = azure_model
|
|
||||||
self.qdrant_url = qdrant_url
|
|
||||||
self._local_model = None
|
|
||||||
|
|
||||||
async def embed_text(self, text: str) -> List[float]:
|
|
||||||
"""Generate embedding for a text."""
|
|
||||||
if self.provider == "azure":
|
|
||||||
return await self._embed_azure(text)
|
|
||||||
else:
|
|
||||||
return self._embed_local(text)
|
|
||||||
|
|
||||||
async def _embed_azure(self, text: str) -> List[float]:
|
|
||||||
"""Generate embedding using Azure OpenAI."""
|
|
||||||
url = f"{self.azure_endpoint}/openai/deployments/{self.azure_model}/embeddings?api-version=2024-02-01"
|
|
||||||
|
|
||||||
async with httpx.AsyncClient() as client:
|
|
||||||
response = await client.post(
|
|
||||||
url,
|
|
||||||
headers={
|
|
||||||
"api-key": self.azure_key,
|
|
||||||
"Content-Type": "application/json",
|
|
||||||
},
|
|
||||||
json={"input": text},
|
|
||||||
timeout=60.0,
|
|
||||||
)
|
|
||||||
response.raise_for_status()
|
|
||||||
data = response.json()
|
|
||||||
return data["data"][0]["embedding"]
|
|
||||||
|
|
||||||
def _embed_local(self, text: str) -> List[float]:
|
|
||||||
"""Generate embedding using local MiniLM model."""
|
|
||||||
if self._local_model is None:
|
|
||||||
from sentence_transformers import SentenceTransformer
|
|
||||||
self._local_model = SentenceTransformer("all-MiniLM-L6-v2")
|
|
||||||
|
|
||||||
embedding = self._local_model.encode(text)
|
|
||||||
return embedding.tolist()
|
|
||||||
|
|
||||||
def parse_cobol_program(self, content: str, file_path: str) -> List[CodeChunk]:
|
|
||||||
"""
|
|
||||||
Parse a COBOL program into indexable chunks.
|
|
||||||
|
|
||||||
Extracts:
|
|
||||||
- PROGRAM-ID
|
|
||||||
- COPY statements
|
|
||||||
- CALL statements
|
|
||||||
- SECTIONs and PARAGRAPHs
|
|
||||||
- FILE-CONTROL
|
|
||||||
- Working Storage variables
|
|
||||||
"""
|
|
||||||
chunks = []
|
|
||||||
lines = content.split("\n")
|
|
||||||
|
|
||||||
# Extract PROGRAM-ID
|
|
||||||
program_id = None
|
|
||||||
for i, line in enumerate(lines):
|
|
||||||
match = re.search(r"PROGRAM-ID\.\s+(\S+)", line, re.IGNORECASE)
|
|
||||||
if match:
|
|
||||||
program_id = match.group(1).rstrip(".")
|
|
||||||
break
|
|
||||||
|
|
||||||
# Extract COPY statements
|
|
||||||
copies = []
|
|
||||||
for i, line in enumerate(lines):
|
|
||||||
match = re.search(r"COPY\s+(\S+)", line, re.IGNORECASE)
|
|
||||||
if match:
|
|
||||||
copies.append(match.group(1).rstrip("."))
|
|
||||||
|
|
||||||
# Extract CALL statements
|
|
||||||
calls = []
|
|
||||||
for i, line in enumerate(lines):
|
|
||||||
match = re.search(r"CALL\s+['\"](\S+)['\"]", line, re.IGNORECASE)
|
|
||||||
if match:
|
|
||||||
calls.append(match.group(1))
|
|
||||||
|
|
||||||
# Extract SECTIONs
|
|
||||||
current_section = None
|
|
||||||
section_start = 0
|
|
||||||
section_content = []
|
|
||||||
|
|
||||||
for i, line in enumerate(lines):
|
|
||||||
# Check for SECTION definition
|
|
||||||
match = re.search(r"^\s{7}(\S+)\s+SECTION", line)
|
|
||||||
if match:
|
|
||||||
# Save previous section
|
|
||||||
if current_section:
|
|
||||||
chunks.append(CodeChunk(
|
|
||||||
file_path=file_path,
|
|
||||||
content="\n".join(section_content),
|
|
||||||
start_line=section_start,
|
|
||||||
end_line=i - 1,
|
|
||||||
chunk_type="section",
|
|
||||||
metadata={
|
|
||||||
"program_id": program_id,
|
|
||||||
"section_name": current_section,
|
|
||||||
"copies": copies,
|
|
||||||
"calls": calls,
|
|
||||||
},
|
|
||||||
))
|
|
||||||
current_section = match.group(1)
|
|
||||||
section_start = i
|
|
||||||
section_content = [line]
|
|
||||||
elif current_section:
|
|
||||||
section_content.append(line)
|
|
||||||
|
|
||||||
# Save last section
|
|
||||||
if current_section:
|
|
||||||
chunks.append(CodeChunk(
|
|
||||||
file_path=file_path,
|
|
||||||
content="\n".join(section_content),
|
|
||||||
start_line=section_start,
|
|
||||||
end_line=len(lines) - 1,
|
|
||||||
chunk_type="section",
|
|
||||||
metadata={
|
|
||||||
"program_id": program_id,
|
|
||||||
"section_name": current_section,
|
|
||||||
"copies": copies,
|
|
||||||
"calls": calls,
|
|
||||||
},
|
|
||||||
))
|
|
||||||
|
|
||||||
# If no sections found, chunk the whole program
|
|
||||||
if not chunks:
|
|
||||||
chunks.append(CodeChunk(
|
|
||||||
file_path=file_path,
|
|
||||||
content=content,
|
|
||||||
start_line=1,
|
|
||||||
end_line=len(lines),
|
|
||||||
chunk_type="program",
|
|
||||||
metadata={
|
|
||||||
"program_id": program_id,
|
|
||||||
"copies": copies,
|
|
||||||
"calls": calls,
|
|
||||||
},
|
|
||||||
))
|
|
||||||
|
|
||||||
return chunks
|
|
||||||
|
|
||||||
async def index_chunks(
|
|
||||||
self,
|
|
||||||
chunks: List[CodeChunk],
|
|
||||||
collection: str,
|
|
||||||
product: str,
|
|
||||||
client: str,
|
|
||||||
) -> int:
|
|
||||||
"""Index code chunks into Qdrant."""
|
|
||||||
indexed = 0
|
|
||||||
|
|
||||||
for chunk in chunks:
|
|
||||||
# Generate embedding
|
|
||||||
text_to_embed = f"""
|
|
||||||
File: {chunk.file_path}
|
|
||||||
Type: {chunk.chunk_type}
|
|
||||||
{chunk.metadata.get('section_name', '')}
|
|
||||||
{chunk.content[:1000]}
|
|
||||||
"""
|
|
||||||
embedding = await self.embed_text(text_to_embed)
|
|
||||||
|
|
||||||
# Store in Qdrant
|
|
||||||
await self._store_vector(
|
|
||||||
collection=collection,
|
|
||||||
vector=embedding,
|
|
||||||
payload={
|
|
||||||
"file_path": chunk.file_path,
|
|
||||||
"content": chunk.content,
|
|
||||||
"start_line": chunk.start_line,
|
|
||||||
"end_line": chunk.end_line,
|
|
||||||
"chunk_type": chunk.chunk_type,
|
|
||||||
"product": product,
|
|
||||||
"client": client,
|
|
||||||
**chunk.metadata,
|
|
||||||
},
|
|
||||||
)
|
|
||||||
indexed += 1
|
|
||||||
|
|
||||||
return indexed
|
|
||||||
|
|
||||||
async def search_similar(
|
|
||||||
self,
|
|
||||||
query: str,
|
|
||||||
        collection: str,
        limit: int = 10,
        filters: Optional[Dict[str, Any]] = None,
    ) -> List[Dict[str, Any]]:
        """Search for similar code chunks."""
        embedding = await self.embed_text(query)

        async with httpx.AsyncClient() as client:
            body = {
                "vector": embedding,
                "limit": limit,
                "with_payload": True,
            }
            if filters:
                body["filter"] = filters

            response = await client.post(
                f"{self.qdrant_url}/collections/{collection}/points/search",
                json=body,
                timeout=30.0,
            )

            if response.status_code == 200:
                results = response.json().get("result", [])
                return [
                    {
                        "score": r["score"],
                        **r["payload"],
                    }
                    for r in results
                ]
            return []

    async def _store_vector(
        self,
        collection: str,
        vector: List[float],
        payload: Dict[str, Any],
    ) -> bool:
        """Store a vector in Qdrant."""
        import uuid

        async with httpx.AsyncClient() as client:
            response = await client.put(
                f"{self.qdrant_url}/collections/{collection}/points",
                json={
                    "points": [
                        {
                            "id": str(uuid.uuid4()),
                            "vector": vector,
                            "payload": payload,
                        }
                    ]
                },
                timeout=30.0,
            )
            return response.status_code == 200

    async def create_collection(
        self,
        name: str,
        vector_size: int = 384,  # MiniLM default
    ) -> bool:
        """Create a Qdrant collection."""
        async with httpx.AsyncClient() as client:
            response = await client.put(
                f"{self.qdrant_url}/collections/{name}",
                json={
                    "vectors": {
                        "size": vector_size,
                        "distance": "Cosine",
                    }
                },
                timeout=30.0,
            )
            return response.status_code in [200, 201]
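The search method above assembles a Qdrant `points/search` request body inline. As a minimal standalone sketch of that assembly (the helper name `build_search_body` is illustrative and not part of the original module; the field names match the REST payload used above):

```python
from typing import Any, Dict, List, Optional

def build_search_body(
    embedding: List[float],
    limit: int = 10,
    filters: Optional[Dict[str, Any]] = None,
) -> Dict[str, Any]:
    """Assemble the JSON body for Qdrant's POST /collections/{name}/points/search."""
    body: Dict[str, Any] = {
        "vector": embedding,
        "limit": limit,
        "with_payload": True,  # ask Qdrant to return stored payloads alongside scores
    }
    if filters:
        # Qdrant expects the filter object under the "filter" key
        body["filter"] = filters
    return body
```

Keeping `"filter"` absent when no filters are given (rather than sending `null`) mirrors the conditional in the method above.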
@ -1,110 +0,0 @@
"""
JIRA Service - Client for JIRA Server API.
"""
from typing import Optional, Dict, Any, List
import httpx
import logging

logger = logging.getLogger(__name__)


class JiraClient:
    """JIRA Server REST API client."""

    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip("/")
        self.headers = {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        }

    async def get_issue(self, issue_key: str) -> Dict[str, Any]:
        """Fetch issue details."""
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/2/issue/{issue_key}",
                headers=self.headers,
            )
            response.raise_for_status()
            return response.json()

    async def add_comment(self, issue_key: str, body: str) -> Dict[str, Any]:
        """Add a comment to an issue."""
        async with httpx.AsyncClient() as client:
            response = await client.post(
                f"{self.base_url}/rest/api/2/issue/{issue_key}/comment",
                headers=self.headers,
                json={"body": body},
            )
            response.raise_for_status()
            return response.json()

    async def search_issues(
        self,
        jql: str,
        start_at: int = 0,
        max_results: int = 50,
        fields: Optional[List[str]] = None,
    ) -> Dict[str, Any]:
        """Search issues using JQL."""
        params = {
            "jql": jql,
            "startAt": start_at,
            "maxResults": max_results,
        }
        if fields:
            params["fields"] = ",".join(fields)

        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/2/search",
                headers=self.headers,
                params=params,
            )
            response.raise_for_status()
            return response.json()

    async def get_projects(self) -> List[Dict[str, Any]]:
        """List all accessible projects."""
        async with httpx.AsyncClient() as client:
            response = await client.get(
                f"{self.base_url}/rest/api/2/project",
                headers=self.headers,
            )
            response.raise_for_status()
            return response.json()

    def format_analysis_comment(
        self,
        root_cause: str,
        affected_files: List[str],
        proposed_fix: str,
        confidence: float,
        pr_url: Optional[str] = None,
    ) -> str:
        """Format AI analysis as a JIRA comment."""
        files_list = "\n".join([f"* {f}" for f in affected_files])

        comment = f"""
h2. 📋 Automated Analysis

h3. 🔍 Identified Root Cause
{root_cause}

h3. 📁 Affected Files
{files_list}

h3. 💡 Proposed Fix
{{code:cobol}}
{proposed_fix}
{{code}}

h3. 📊 Confidence: {confidence:.0%}
"""

        if pr_url:
            comment += f"\nh3. 🔗 Pull Request\n[View PR|{pr_url}]"

        comment += "\n\n_Automatically generated by JIRA AI Fixer_"

        return comment
@ -1,193 +0,0 @@
"""
LLM Service - Orchestration for AI models.
"""
from typing import Optional, Dict, Any, List
import httpx
import json
import logging
import os

logger = logging.getLogger(__name__)


class LLMService:
    """
    LLM orchestration service supporting multiple providers.

    Providers:
    - Azure OpenAI (production, compliance)
    - OpenRouter (development, free models)
    """

    def __init__(
        self,
        provider: str = "openrouter",
        azure_endpoint: Optional[str] = None,
        azure_key: Optional[str] = None,
        azure_model: str = "gpt-4o",
        openrouter_key: Optional[str] = None,
        openrouter_model: str = "meta-llama/llama-3.3-70b-instruct:free",
    ):
        self.provider = provider
        self.azure_endpoint = azure_endpoint
        self.azure_key = azure_key
        self.azure_model = azure_model
        self.openrouter_key = openrouter_key
        self.openrouter_model = openrouter_model

    async def analyze_issue(
        self,
        issue_description: str,
        code_context: str,
        business_rules: Optional[str] = None,
        similar_fixes: Optional[List[Dict[str, Any]]] = None,
    ) -> Dict[str, Any]:
        """
        Analyze an issue and generate fix suggestions.

        Returns:
            {
                "root_cause": str,
                "affected_files": List[str],
                "proposed_fix": str,
                "confidence": float,
                "explanation": str,
            }
        """
        prompt = self._build_analysis_prompt(
            issue_description,
            code_context,
            business_rules,
            similar_fixes,
        )

        response = await self._call_llm(prompt)
        return self._parse_analysis_response(response)

    def _build_analysis_prompt(
        self,
        issue_description: str,
        code_context: str,
        business_rules: Optional[str],
        similar_fixes: Optional[List[Dict[str, Any]]],
    ) -> str:
        """Build the analysis prompt."""

        prompt = f"""You are an expert in mainframe payment systems, specifically the JIRA Acquirer (ACQ-MF) and Interchange (ICG-MF) products.

## System Context
{business_rules or "No specific business rules provided."}

## Reported Issue
{issue_description}

## Current Code
{code_context}

"""

        if similar_fixes:
            prompt += "## History of Similar Fixes\n"
            for i, fix in enumerate(similar_fixes[:3], 1):
                prompt += f"""
### Example {i}
Problem: {fix.get('problem', 'N/A')}
Solution: {fix.get('solution', 'N/A')}
"""

        prompt += """
## Task
Analyze the issue and:
1. Identify the likely root cause
2. Locate the affected program(s)
3. Propose a specific fix
4. Explain the impact of the change

## Rules
- Maintain COBOL-85 compatibility
- Preserve the existing copybook structure
- Do not change interfaces with other systems without explicit mention
- Document all proposed changes

## Response Format
Respond in valid JSON:
{
    "root_cause": "Description of the identified root cause",
    "affected_files": ["arquivo1.cbl", "arquivo2.cbl"],
    "proposed_fix": "COBOL code with the proposed fix",
    "confidence": 0.85,
    "explanation": "Detailed explanation of the impact"
}
"""
        return prompt

    async def _call_llm(self, prompt: str) -> str:
        """Call the configured LLM provider."""
        if self.provider == "azure":
            return await self._call_azure(prompt)
        else:
            return await self._call_openrouter(prompt)

    async def _call_azure(self, prompt: str) -> str:
        """Call Azure OpenAI."""
        url = f"{self.azure_endpoint}/openai/deployments/{self.azure_model}/chat/completions?api-version=2024-02-01"

        async with httpx.AsyncClient() as client:
            response = await client.post(
                url,
                headers={
                    "api-key": self.azure_key,
                    "Content-Type": "application/json",
                },
                json={
                    "messages": [{"role": "user", "content": prompt}],
                    "temperature": 0.2,
                    "max_tokens": 4096,
                },
                timeout=120.0,
            )
            response.raise_for_status()
            data = response.json()
            return data["choices"][0]["message"]["content"]

    async def _call_openrouter(self, prompt: str) -> str:
        """Call OpenRouter API."""
        async with httpx.AsyncClient() as client:
            response = await client.post(
                "https://openrouter.ai/api/v1/chat/completions",
                headers={
                    "Authorization": f"Bearer {self.openrouter_key}",
                    "Content-Type": "application/json",
                },
                json={
                    "model": self.openrouter_model,
                    "messages": [{"role": "user", "content": prompt}],
                    "temperature": 0.2,
                    "max_tokens": 4096,
                },
                timeout=120.0,
            )
            response.raise_for_status()
            data = response.json()
            return data["choices"][0]["message"]["content"]

    def _parse_analysis_response(self, response: str) -> Dict[str, Any]:
        """Parse LLM response into structured format."""
        try:
            # Try to extract JSON from response
            start = response.find("{")
            end = response.rfind("}") + 1
            if start >= 0 and end > start:
                json_str = response[start:end]
                return json.loads(json_str)
        except json.JSONDecodeError:
            logger.warning("Failed to parse LLM response as JSON")

        # Fallback: return raw response
        return {
            "root_cause": "Unable to parse structured response",
            "affected_files": [],
            "proposed_fix": response,
            "confidence": 0.3,
            "explanation": "Response could not be parsed automatically",
        }
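The parsing fallback above slices between the first `{` and the last `}` before calling `json.loads`. The same strategy, as a self-contained sketch (the function name `extract_json` is illustrative, not from the original module):

```python
import json
from typing import Any, Dict, Optional

def extract_json(text: str) -> Optional[Dict[str, Any]]:
    """Pull the outermost JSON object out of an LLM reply that may have prose around it."""
    start = text.find("{")
    end = text.rfind("}") + 1  # slice end is exclusive, so include the closing brace
    if start >= 0 and end > start:
        try:
            return json.loads(text[start:end])
        except json.JSONDecodeError:
            return None  # braces present but the slice is not valid JSON
    return None  # no brace pair found at all
```

This heuristic breaks if the model emits stray braces outside the object, which is why the service above keeps a low-confidence fallback path.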
@ -0,0 +1,21 @@
from fastapi import APIRouter
from .auth import router as auth_router
from .users import router as users_router
from .organizations import router as orgs_router
from .integrations import router as integrations_router
from .issues import router as issues_router
from .webhooks import router as webhooks_router
from .reports import router as reports_router
from .gitea import router as gitea_router
from .settings import router as settings_router

api_router = APIRouter()
api_router.include_router(auth_router, prefix="/auth", tags=["Authentication"])
api_router.include_router(users_router, prefix="/users", tags=["Users"])
api_router.include_router(orgs_router, prefix="/organizations", tags=["Organizations"])
api_router.include_router(settings_router, prefix="/organizations", tags=["Settings"])
api_router.include_router(integrations_router, prefix="/integrations", tags=["Integrations"])
api_router.include_router(issues_router, prefix="/issues", tags=["Issues"])
api_router.include_router(webhooks_router, prefix="/webhooks", tags=["Webhooks"])
api_router.include_router(reports_router, prefix="/reports", tags=["Reports"])
api_router.include_router(gitea_router, prefix="/gitea", tags=["Gitea"])
@ -0,0 +1,158 @@
"""Authentication endpoints."""
from datetime import datetime
from fastapi import APIRouter, Depends, HTTPException, status, Request
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.database import get_db
from app.core.security import verify_password, get_password_hash, create_access_token, create_refresh_token, decode_token
from app.models.user import User
from app.models.organization import Organization, OrganizationMember, MemberRole
from app.schemas.user import UserCreate, UserRead, Token, LoginRequest
from app.services.audit import AuditService
import re

router = APIRouter()


def slugify(text: str) -> str:
    """Convert text to URL-friendly slug."""
    text = text.lower()
    text = re.sub(r'[^\w\s-]', '', text)
    text = re.sub(r'[-\s]+', '-', text)
    return text.strip('-')


@router.post("/register", response_model=Token)
async def register(
    user_in: UserCreate,
    request: Request,
    db: AsyncSession = Depends(get_db)
):
    """Register a new user."""
    # Check if email exists
    result = await db.execute(select(User).where(User.email == user_in.email))
    if result.scalar_one_or_none():
        raise HTTPException(
            status_code=status.HTTP_400_BAD_REQUEST,
            detail="Email already registered"
        )

    # Create user
    user = User(
        email=user_in.email,
        hashed_password=get_password_hash(user_in.password),
        full_name=user_in.full_name
    )
    db.add(user)
    await db.flush()

    # Create default organization for user
    org_name = user_in.full_name or user_in.email.split('@')[0]
    org_slug = slugify(org_name) + f"-{user.id}"

    organization = Organization(
        name=f"{org_name}'s Organization",
        slug=org_slug
    )
    db.add(organization)
    await db.flush()

    # Add user as organization owner
    membership = OrganizationMember(
        organization_id=organization.id,
        user_id=user.id,
        role=MemberRole.OWNER
    )
    db.add(membership)
    await db.flush()

    # Audit log
    await AuditService.log(
        db,
        action="user.register",
        user_id=user.id,
        resource_type="user",
        resource_id=user.id,
        ip_address=request.client.host if request.client else None
    )

    # Return tokens (same payload shape as /login, so get_current_user can read "user_id")
    token_data = {"user_id": user.id, "email": user.email}
    access_token = create_access_token(token_data)
    refresh_token = create_refresh_token(token_data)

    return Token(
        access_token=access_token,
        refresh_token=refresh_token,
        token_type="bearer"
    )


@router.post("/login", response_model=Token)
async def login(
    credentials: LoginRequest,
    request: Request,
    db: AsyncSession = Depends(get_db)
):
    """Login and get access token."""
    result = await db.execute(select(User).where(User.email == credentials.email))
    user = result.scalar_one_or_none()

    if not user or not verify_password(credentials.password, user.hashed_password):
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid email or password"
        )

    if not user.is_active:
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail="User is inactive"
        )

    # Update last login
    user.last_login = datetime.utcnow()

    # Audit log
    await AuditService.log(
        db,
        action="user.login",
        user_id=user.id,
        resource_type="user",
        resource_id=user.id,
        ip_address=request.client.host if request.client else None,
        user_agent=request.headers.get("user-agent")
    )

    # Create tokens
    token_data = {"user_id": user.id, "email": user.email}
    access_token = create_access_token(token_data)
    refresh_token = create_refresh_token(token_data)

    return Token(access_token=access_token, refresh_token=refresh_token)


@router.post("/refresh", response_model=Token)
async def refresh_token(
    refresh_token: str,
    db: AsyncSession = Depends(get_db)
):
    """Refresh access token."""
    payload = decode_token(refresh_token)

    if not payload or payload.get("type") != "refresh":
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid refresh token"
        )

    user_id = payload.get("user_id")
    result = await db.execute(select(User).where(User.id == user_id))
    user = result.scalar_one_or_none()

    if not user or not user.is_active:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="User not found or inactive"
        )

    token_data = {"user_id": user.id, "email": user.email}
    new_access_token = create_access_token(token_data)
    new_refresh_token = create_refresh_token(token_data)

    return Token(access_token=new_access_token, refresh_token=new_refresh_token)
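The `slugify` helper above determines the default organization slug at registration. The same logic as a standalone function (copied verbatim from the endpoint module), with a couple of illustrative inputs:

```python
import re

def slugify(text: str) -> str:
    """Convert text to a URL-friendly slug, as the /register endpoint does."""
    text = text.lower()
    text = re.sub(r'[^\w\s-]', '', text)  # drop anything but word chars, spaces, hyphens
    text = re.sub(r'[-\s]+', '-', text)   # collapse runs of whitespace/hyphens to one hyphen
    return text.strip('-')                # trim leading/trailing hyphens
```

Note that the endpoint appends `f"-{user.id}"` to the slug afterwards, which is what keeps slugs unique across users with the same name.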
@ -0,0 +1,69 @@
"""API dependencies."""
from fastapi import Depends, HTTPException, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.database import get_db
from app.core.security import decode_token, has_permission
from app.models.user import User
from app.models.organization import OrganizationMember

security = HTTPBearer()


async def get_current_user(
    credentials: HTTPAuthorizationCredentials = Depends(security),
    db: AsyncSession = Depends(get_db)
) -> User:
    """Get current authenticated user."""
    token = credentials.credentials
    payload = decode_token(token)

    if not payload or payload.get("type") != "access":
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid or expired token"
        )

    user_id = payload.get("user_id")
    result = await db.execute(select(User).where(User.id == user_id))
    user = result.scalar_one_or_none()

    if not user or not user.is_active:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="User not found or inactive"
        )

    return user


async def get_org_member(
    org_id: int,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
) -> OrganizationMember:
    """Get user's membership in organization (None for superusers without one)."""
    result = await db.execute(
        select(OrganizationMember)
        .where(OrganizationMember.organization_id == org_id)
        .where(OrganizationMember.user_id == user.id)
    )
    member = result.scalar_one_or_none()

    if not member and not user.is_superuser:
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail="Not a member of this organization"
        )

    return member


def require_role(required_role: str):
    """Dependency to require a minimum role."""
    async def check_role(member: OrganizationMember = Depends(get_org_member)):
        if member is None:
            # get_org_member returns None for superusers with no membership; let them through
            return member
        if not has_permission(member.role.value, required_role):
            raise HTTPException(
                status_code=status.HTTP_403_FORBIDDEN,
                detail=f"Requires {required_role} role or higher"
            )
        return member
    return check_role
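`require_role` above delegates to `has_permission` from `app.core.security`, which is not part of this diff. A plausible minimal sketch under the assumption that roles form an ordered hierarchy (viewer < analyst < admin < owner, matching the role names used by the endpoints):

```python
# Assumed role ordering; the real has_permission lives in app.core.security
# and may differ from this sketch.
ROLE_RANK = {"viewer": 0, "analyst": 1, "admin": 2, "owner": 3}

def has_permission(user_role: str, required_role: str) -> bool:
    """Return True if user_role is at least as privileged as required_role."""
    # Unknown user roles rank below everything; unknown requirements rank above everything,
    # so both failure modes deny access rather than grant it.
    return ROLE_RANK.get(user_role, -1) >= ROLE_RANK.get(required_role, len(ROLE_RANK))
```

With this shape, `require_role("viewer")` admits every member while `require_role("admin")` admits only admins and owners, which matches how the Gitea and integration endpoints use it below.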
@ -0,0 +1,109 @@
"""Gitea integration endpoints."""
from typing import List, Dict, Any
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.database import get_db
from app.models.integration import Integration, IntegrationType
from app.models.organization import OrganizationMember
from app.api.deps import require_role
from app.services.gitea import GiteaService

router = APIRouter()


@router.get("/repos", response_model=List[Dict[str, Any]])
async def list_repositories(
    org_id: int,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """List Gitea repositories for organization."""
    # Get Gitea integration
    result = await db.execute(
        select(Integration)
        .where(Integration.organization_id == org_id)
        .where(Integration.type == IntegrationType.GITLAB)  # Using GITLAB as Gitea
        .where(Integration.status == "ACTIVE")
    )
    integration = result.scalar_one_or_none()

    if not integration or not integration.base_url or not integration.api_key:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Gitea integration not configured"
        )

    gitea = GiteaService(integration.base_url, integration.api_key)
    repos = await gitea.list_repositories("startdata")  # Fixed owner for now

    return repos


@router.get("/repos/{owner}/{repo}")
async def get_repository(
    org_id: int,
    owner: str,
    repo: str,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get repository details."""
    result = await db.execute(
        select(Integration)
        .where(Integration.organization_id == org_id)
        .where(Integration.type == IntegrationType.GITLAB)
        .where(Integration.status == "ACTIVE")
    )
    integration = result.scalar_one_or_none()

    if not integration or not integration.base_url or not integration.api_key:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Gitea integration not configured"
        )

    gitea = GiteaService(integration.base_url, integration.api_key)
    repo_data = await gitea.get_repo(owner, repo)

    if not repo_data:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Repository not found"
        )

    return repo_data


@router.get("/repos/{owner}/{repo}/file")
async def get_file(
    org_id: int,
    owner: str,
    repo: str,
    path: str,
    ref: str = "main",
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get file content from repository."""
    result = await db.execute(
        select(Integration)
        .where(Integration.organization_id == org_id)
        .where(Integration.type == IntegrationType.GITLAB)
        .where(Integration.status == "ACTIVE")
    )
    integration = result.scalar_one_or_none()

    if not integration or not integration.base_url or not integration.api_key:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="Gitea integration not configured"
        )

    gitea = GiteaService(integration.base_url, integration.api_key)
    content = await gitea.get_file(owner, repo, path, ref)

    if content is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail="File not found"
        )

    return {"path": path, "content": content, "ref": ref}
@ -0,0 +1,134 @@
"""Integration management endpoints."""
from typing import List
import secrets
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.database import get_db
from app.models.integration import Integration, IntegrationType, IntegrationStatus
from app.models.organization import OrganizationMember
from app.schemas.integration import IntegrationCreate, IntegrationRead, IntegrationUpdate
from app.api.deps import get_current_user, require_role

router = APIRouter()


@router.get("/", response_model=List[IntegrationRead])
async def list_integrations(
    org_id: int,
    member: OrganizationMember = Depends(require_role("analyst")),
    db: AsyncSession = Depends(get_db)
):
    """List integrations for organization."""
    result = await db.execute(
        select(Integration).where(Integration.organization_id == org_id)
    )
    return result.scalars().all()


@router.post("/", response_model=IntegrationRead, status_code=status.HTTP_201_CREATED)
async def create_integration(
    org_id: int,
    integration_in: IntegrationCreate,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Create a new integration."""
    # Generate webhook secret
    webhook_secret = secrets.token_hex(32)

    integration = Integration(
        organization_id=org_id,
        name=integration_in.name,
        type=integration_in.type,
        base_url=integration_in.base_url,
        api_key=integration_in.api_key,
        webhook_secret=webhook_secret,
        callback_url=integration_in.callback_url,
        auto_analyze=integration_in.auto_analyze,
        config=integration_in.config or {},
        status=IntegrationStatus.ACTIVE
    )
    db.add(integration)
    await db.commit()
    await db.refresh(integration)

    return integration


@router.get("/{integration_id}", response_model=IntegrationRead)
async def get_integration(
    org_id: int,
    integration_id: int,
    member: OrganizationMember = Depends(require_role("analyst")),
    db: AsyncSession = Depends(get_db)
):
    """Get integration details."""
    result = await db.execute(
        select(Integration)
        .where(Integration.id == integration_id)
        .where(Integration.organization_id == org_id)
    )
    integration = result.scalar_one_or_none()
    if not integration:
        raise HTTPException(status_code=404, detail="Integration not found")
    return integration


@router.patch("/{integration_id}", response_model=IntegrationRead)
async def update_integration(
    org_id: int,
    integration_id: int,
    integration_update: IntegrationUpdate,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Update integration."""
    result = await db.execute(
        select(Integration)
        .where(Integration.id == integration_id)
        .where(Integration.organization_id == org_id)
    )
    integration = result.scalar_one_or_none()
    if not integration:
        raise HTTPException(status_code=404, detail="Integration not found")

    for field, value in integration_update.dict(exclude_unset=True).items():
        setattr(integration, field, value)

    return integration


@router.delete("/{integration_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_integration(
    org_id: int,
    integration_id: int,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Delete integration."""
    result = await db.execute(
        select(Integration)
        .where(Integration.id == integration_id)
        .where(Integration.organization_id == org_id)
    )
    integration = result.scalar_one_or_none()
    if not integration:
        raise HTTPException(status_code=404, detail="Integration not found")

    await db.delete(integration)


@router.post("/{integration_id}/test")
async def test_integration(
    org_id: int,
    integration_id: int,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Test integration connection."""
    result = await db.execute(
        select(Integration)
        .where(Integration.id == integration_id)
        .where(Integration.organization_id == org_id)
    )
    integration = result.scalar_one_or_none()
    if not integration:
        raise HTTPException(status_code=404, detail="Integration not found")

    # TODO: Implement actual connection test based on integration type
    return {"status": "ok", "message": "Connection successful"}
@@ -0,0 +1,280 @@
"""Issue management endpoints."""
from typing import List, Optional
from datetime import datetime
from fastapi import APIRouter, Depends, HTTPException, status, BackgroundTasks, Query
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func

from app.core.database import get_db
from app.models.issue import Issue, IssueStatus, IssueComment
from app.models.organization import OrganizationMember
from app.models.integration import Integration
from app.schemas.issue import IssueCreate, IssueRead, IssueUpdate, IssueStats, IssueComment as IssueCommentSchema
from app.api.deps import get_current_user, require_role
from app.services.analysis import AnalysisService
from app.services.email import EmailService

router = APIRouter()


async def run_analysis(issue_id: int, db_url: str):
    """Background task to analyze issue."""
    from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession
    from sqlalchemy.orm import sessionmaker
    from app.models.organization import Organization

    engine = create_async_engine(db_url)
    async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

    async with async_session() as db:
        result = await db.execute(select(Issue).where(Issue.id == issue_id))
        issue = result.scalar_one_or_none()
        if not issue:
            return

        issue.status = IssueStatus.ANALYZING
        issue.analysis_started_at = datetime.utcnow()
        await db.commit()

        try:
            # Get AI config from organization
            ai_config = await AnalysisService.get_org_ai_config(db, issue.organization_id)

            # Get integration to find associated repo
            repo = "startdata/cobol-sample-app"  # Default
            if issue.integration_id:
                intg_result = await db.execute(
                    select(Integration).where(Integration.id == issue.integration_id)
                )
                integration = intg_result.scalar_one_or_none()
                if integration and integration.config:
                    # Get repo from integration config if available
                    repo = integration.config.get("repository", repo)

            # Run analysis with org's AI config
            analysis = await AnalysisService.analyze(
                {
                    "title": issue.title,
                    "description": issue.description,
                    "priority": issue.priority.value if issue.priority else "medium"
                },
                repo=repo,
                ai_config=ai_config
            )

            issue.root_cause = analysis.get("root_cause")
            issue.affected_files = analysis.get("affected_files", [])
            issue.suggested_fix = analysis.get("suggested_fix")
            issue.confidence = analysis.get("confidence", 0)
            issue.analysis_raw = analysis
            issue.status = IssueStatus.ANALYZED
            issue.analysis_completed_at = datetime.utcnow()

            # Create PR if enabled and confidence meets threshold
            confidence_threshold = ai_config.get("confidence_threshold", 70) / 100
            auto_create_pr = ai_config.get("auto_create_pr", True)

            if auto_create_pr and repo and issue.confidence and issue.confidence >= confidence_threshold:
                branch = f"fix/{issue.external_key or issue.id}-auto-fix"
                pr_url = await AnalysisService.create_pull_request(
                    repo=repo,
                    branch=branch,
                    title=f"Fix: {issue.title}",
                    description=f"## Root Cause\n{issue.root_cause}\n\n## Suggested Fix\n{issue.suggested_fix}",
                    file_changes=[]
                )
                if pr_url:
                    issue.pr_url = pr_url
                    issue.pr_branch = branch
                    issue.status = IssueStatus.PR_CREATED

        except Exception as e:
            issue.status = IssueStatus.ERROR
            issue.root_cause = f"Analysis failed: {str(e)}"

        await db.commit()


@router.get("/", response_model=List[IssueRead])
async def list_issues(
    org_id: int,
    status: Optional[IssueStatus] = Query(None),
    source: Optional[str] = Query(None),
    limit: int = 50,
    offset: int = 0,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """List issues for organization."""
    query = select(Issue).where(Issue.organization_id == org_id)

    if status:
        query = query.where(Issue.status == status)
    if source:
        query = query.where(Issue.source == source)

    query = query.order_by(Issue.created_at.desc()).offset(offset).limit(limit)
    result = await db.execute(query)
    return result.scalars().all()


@router.get("/stats", response_model=IssueStats)
async def get_stats(
    org_id: int,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get issue statistics."""
    # Total counts by status
    total_result = await db.execute(
        select(func.count(Issue.id)).where(Issue.organization_id == org_id)
    )
    total = total_result.scalar() or 0

    status_counts = {}
    for s in IssueStatus:
        result = await db.execute(
            select(func.count(Issue.id))
            .where(Issue.organization_id == org_id)
            .where(Issue.status == s)
        )
        status_counts[s.value] = result.scalar() or 0

    # By source
    source_result = await db.execute(
        select(Issue.source, func.count(Issue.id))
        .where(Issue.organization_id == org_id)
        .group_by(Issue.source)
    )
    by_source = {row[0] or "unknown": row[1] for row in source_result.all()}

    # By priority
    priority_result = await db.execute(
        select(Issue.priority, func.count(Issue.id))
        .where(Issue.organization_id == org_id)
        .group_by(Issue.priority)
    )
    by_priority = {str(row[0].value) if row[0] else "unknown": row[1] for row in priority_result.all()}

    # Avg confidence
    avg_result = await db.execute(
        select(func.avg(Issue.confidence))
        .where(Issue.organization_id == org_id)
        .where(Issue.confidence.isnot(None))
    )
    avg_confidence = avg_result.scalar() or 0

    # SLA breached (disabled for now - field doesn't exist)
    sla_breached = 0

    return IssueStats(
        total=total,
        pending=status_counts.get("pending", 0),
        analyzing=status_counts.get("analyzing", 0),
        analyzed=status_counts.get("analyzed", 0),
        pr_created=status_counts.get("pr_created", 0),
        completed=status_counts.get("completed", 0),
        error=status_counts.get("error", 0),
        avg_confidence=avg_confidence,
        by_source=by_source,
        by_priority=by_priority,
        sla_breached=sla_breached
    )


@router.post("/", response_model=IssueRead, status_code=status.HTTP_201_CREATED)
async def create_issue(
    org_id: int,
    issue_in: IssueCreate,
    background_tasks: BackgroundTasks,
    member: OrganizationMember = Depends(require_role("analyst")),
    db: AsyncSession = Depends(get_db)
):
    """Create and analyze a new issue."""
    issue = Issue(
        organization_id=org_id,
        title=issue_in.title,
        description=issue_in.description,
        priority=issue_in.priority,
        external_id=issue_in.external_id,
        external_key=issue_in.external_key,
        external_url=issue_in.external_url,
        source=issue_in.source,
        labels=issue_in.labels,
        callback_url=issue_in.callback_url,
        raw_payload=issue_in.raw_payload
    )
    db.add(issue)
    await db.flush()

    # Queue analysis
    from app.core.config import settings
    background_tasks.add_task(run_analysis, issue.id, settings.DATABASE_URL.replace("postgresql://", "postgresql+asyncpg://"))

    return issue


@router.get("/{issue_id}", response_model=IssueRead)
async def get_issue(
    org_id: int,
    issue_id: int,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get issue details."""
    result = await db.execute(
        select(Issue)
        .where(Issue.id == issue_id)
        .where(Issue.organization_id == org_id)
    )
    issue = result.scalar_one_or_none()
    if not issue:
        raise HTTPException(status_code=404, detail="Issue not found")
    return issue


@router.post("/{issue_id}/reanalyze", response_model=IssueRead)
async def reanalyze_issue(
    org_id: int,
    issue_id: int,
    background_tasks: BackgroundTasks,
    member: OrganizationMember = Depends(require_role("analyst")),
    db: AsyncSession = Depends(get_db)
):
    """Rerun analysis on issue."""
    result = await db.execute(
        select(Issue)
        .where(Issue.id == issue_id)
        .where(Issue.organization_id == org_id)
    )
    issue = result.scalar_one_or_none()
    if not issue:
        raise HTTPException(status_code=404, detail="Issue not found")

    issue.status = IssueStatus.PENDING

    from app.core.config import settings
    background_tasks.add_task(run_analysis, issue.id, settings.DATABASE_URL.replace("postgresql://", "postgresql+asyncpg://"))

    return issue


@router.post("/{issue_id}/comments")
async def add_comment(
    org_id: int,
    issue_id: int,
    comment: IssueCommentSchema,
    member: OrganizationMember = Depends(require_role("analyst")),
    db: AsyncSession = Depends(get_db)
):
    """Add comment to issue."""
    result = await db.execute(
        select(Issue)
        .where(Issue.id == issue_id)
        .where(Issue.organization_id == org_id)
    )
    issue = result.scalar_one_or_none()
    if not issue:
        raise HTTPException(status_code=404, detail="Issue not found")

    new_comment = IssueComment(
        issue_id=issue_id,
        author=comment.author,
        content=comment.content,
        author_type=comment.author_type
    )
    db.add(new_comment)

    return {"status": "ok"}
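Both `create_issue` and `reanalyze_issue` rewrite the `DATABASE_URL` scheme inline with `str.replace` before handing it to the background task. A small sketch of factoring that into a helper (`as_asyncpg_url` is an invented name; it mirrors the inline replace but is idempotent and leaves non-PostgreSQL URLs alone):

```python
def as_asyncpg_url(url: str) -> str:
    """Normalize a PostgreSQL URL to the asyncpg driver.

    Already-normalized and non-PostgreSQL URLs pass through unchanged.
    """
    if url.startswith("postgresql+asyncpg://"):
        return url
    # Replace only the scheme prefix, not arbitrary occurrences
    return url.replace("postgresql://", "postgresql+asyncpg://", 1)
```

The two endpoints could then share `background_tasks.add_task(run_analysis, issue.id, as_asyncpg_url(settings.DATABASE_URL))`.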
@@ -0,0 +1,169 @@
"""Organization management endpoints."""
from typing import List
from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func

from app.core.database import get_db
from app.models.user import User
from app.models.organization import Organization, OrganizationMember, MemberRole
from app.schemas.organization import OrganizationCreate, OrganizationRead, OrganizationUpdate, MemberCreate, MemberRead
from app.api.deps import get_current_user, require_role
from app.services.email import EmailService

router = APIRouter()


@router.get("/", response_model=List[OrganizationRead])
async def list_organizations(
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
):
    """List organizations user belongs to."""
    result = await db.execute(
        select(Organization)
        .join(OrganizationMember)
        .where(OrganizationMember.user_id == user.id)
    )
    return result.scalars().all()


@router.post("/", response_model=OrganizationRead, status_code=status.HTTP_201_CREATED)
async def create_organization(
    org_in: OrganizationCreate,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
):
    """Create a new organization."""
    import re

    # Auto-generate slug if not provided
    if not org_in.slug:
        base_slug = re.sub(r'[^\w\s-]', '', org_in.name.lower())
        base_slug = re.sub(r'[-\s]+', '-', base_slug).strip('-')

        # Check uniqueness and add number if needed
        slug = base_slug
        counter = 1
        while True:
            result = await db.execute(select(Organization).where(Organization.slug == slug))
            if not result.scalar_one_or_none():
                break
            counter += 1
            slug = f"{base_slug}-{counter}"
    else:
        slug = org_in.slug
        # Check slug uniqueness
        result = await db.execute(select(Organization).where(Organization.slug == slug))
        if result.scalar_one_or_none():
            raise HTTPException(status_code=400, detail="Slug already exists")

    # Create org
    org = Organization(
        name=org_in.name,
        slug=slug
    )
    db.add(org)
    await db.flush()

    # Add creator as owner
    member = OrganizationMember(
        organization_id=org.id,
        user_id=user.id,
        role=MemberRole.OWNER
    )
    db.add(member)

    return org


@router.get("/{org_id}", response_model=OrganizationRead)
async def get_organization(
    org_id: int,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
):
    """Get organization details."""
    result = await db.execute(select(Organization).where(Organization.id == org_id))
    org = result.scalar_one_or_none()
    if not org:
        raise HTTPException(status_code=404, detail="Organization not found")
    return org


@router.patch("/{org_id}", response_model=OrganizationRead)
async def update_organization(
    org_id: int,
    org_update: OrganizationUpdate,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Update organization (admin only)."""
    result = await db.execute(select(Organization).where(Organization.id == org_id))
    org = result.scalar_one_or_none()
    if not org:
        raise HTTPException(status_code=404, detail="Organization not found")

    for field, value in org_update.dict(exclude_unset=True).items():
        setattr(org, field, value)

    return org


@router.get("/{org_id}/members", response_model=List[MemberRead])
async def list_members(
    org_id: int,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
):
    """List organization members."""
    result = await db.execute(
        select(OrganizationMember)
        .where(OrganizationMember.organization_id == org_id)
    )
    return result.scalars().all()


@router.post("/{org_id}/members", response_model=MemberRead, status_code=status.HTTP_201_CREATED)
async def invite_member(
    org_id: int,
    member_in: MemberCreate,
    current_member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Invite a new member (admin only)."""
    # Find or create user
    result = await db.execute(select(User).where(User.email == member_in.email))
    user = result.scalar_one_or_none()

    if not user:
        # Create placeholder user
        from app.core.security import get_password_hash
        import secrets
        user = User(
            email=member_in.email,
            hashed_password=get_password_hash(secrets.token_urlsafe(32)),
            is_active=False  # Will activate on first login
        )
        db.add(user)
        await db.flush()

    # Check if already member
    result = await db.execute(
        select(OrganizationMember)
        .where(OrganizationMember.organization_id == org_id)
        .where(OrganizationMember.user_id == user.id)
    )
    if result.scalar_one_or_none():
        raise HTTPException(status_code=400, detail="User is already a member")

    # Add member
    member = OrganizationMember(
        organization_id=org_id,
        user_id=user.id,
        role=member_in.role,
        invited_by_id=current_member.user_id
    )
    db.add(member)

    # Get org name for email
    org_result = await db.execute(select(Organization).where(Organization.id == org_id))
    org = org_result.scalar_one()

    # Send welcome email
    await EmailService.send_welcome(user.email, user.full_name or user.email, org.name)

    return member
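The slug rules in `create_organization` (lowercase, strip symbols, collapse whitespace and dashes) are easy to isolate for testing. A sketch using the same two regexes as the endpoint (`slugify` is a hypothetical helper name):

```python
import re


def slugify(name: str) -> str:
    """Mirror the endpoint's slug rules: drop symbols, collapse runs of
    spaces/dashes into single dashes, trim leading/trailing dashes."""
    s = re.sub(r'[^\w\s-]', '', name.lower())
    return re.sub(r'[-\s]+', '-', s).strip('-')
```

The uniqueness loop in the endpoint would then append `-2`, `-3`, … to whatever `slugify` returns until no existing organization claims the slug.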
@@ -0,0 +1,192 @@
"""Reports and analytics endpoints."""
from typing import List, Optional
from datetime import datetime, timedelta
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select, func, and_

from app.core.database import get_db
from app.models.issue import Issue, IssueStatus
from app.models.organization import OrganizationMember
from app.api.deps import require_role
from pydantic import BaseModel

router = APIRouter()


class DailyStats(BaseModel):
    date: str
    total: int
    analyzed: int
    prs_created: int
    avg_confidence: float


class ReportSummary(BaseModel):
    period_start: datetime
    period_end: datetime
    total_issues: int
    analyzed_issues: int
    prs_created: int
    avg_confidence: float
    avg_analysis_time_hours: Optional[float]
    top_sources: List[dict]
    daily_breakdown: List[DailyStats]


@router.get("/summary", response_model=ReportSummary)
async def get_report_summary(
    org_id: int,
    days: int = 30,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get summary report for organization."""
    end_date = datetime.utcnow()
    start_date = end_date - timedelta(days=days)

    # Total issues
    total_result = await db.execute(
        select(func.count(Issue.id))
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
    )
    total = total_result.scalar() or 0

    # Analyzed
    analyzed_result = await db.execute(
        select(func.count(Issue.id))
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
        .where(Issue.status.in_([IssueStatus.ANALYZED, IssueStatus.PR_CREATED, IssueStatus.COMPLETED]))
    )
    analyzed = analyzed_result.scalar() or 0

    # PRs created
    prs_result = await db.execute(
        select(func.count(Issue.id))
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
        .where(Issue.pr_url.isnot(None))
    )
    prs = prs_result.scalar() or 0

    # Avg confidence
    avg_conf_result = await db.execute(
        select(func.avg(Issue.confidence))
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
        .where(Issue.confidence.isnot(None))
    )
    avg_confidence = avg_conf_result.scalar() or 0

    # Top sources
    sources_result = await db.execute(
        select(Issue.source, func.count(Issue.id).label("count"))
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
        .group_by(Issue.source)
        .order_by(func.count(Issue.id).desc())
        .limit(5)
    )
    top_sources = [{"source": r[0] or "unknown", "count": r[1]} for r in sources_result.all()]

    # Daily breakdown (simplified)
    daily_breakdown = []
    for i in range(min(days, 30)):
        day_start = start_date + timedelta(days=i)
        day_end = day_start + timedelta(days=1)

        day_total = await db.execute(
            select(func.count(Issue.id))
            .where(Issue.organization_id == org_id)
            .where(Issue.created_at >= day_start)
            .where(Issue.created_at < day_end)
        )
        day_analyzed = await db.execute(
            select(func.count(Issue.id))
            .where(Issue.organization_id == org_id)
            .where(Issue.created_at >= day_start)
            .where(Issue.created_at < day_end)
            .where(Issue.status.in_([IssueStatus.ANALYZED, IssueStatus.PR_CREATED, IssueStatus.COMPLETED]))
        )
        day_prs = await db.execute(
            select(func.count(Issue.id))
            .where(Issue.organization_id == org_id)
            .where(Issue.created_at >= day_start)
            .where(Issue.created_at < day_end)
            .where(Issue.pr_url.isnot(None))
        )
        day_conf = await db.execute(
            select(func.avg(Issue.confidence))
            .where(Issue.organization_id == org_id)
            .where(Issue.created_at >= day_start)
            .where(Issue.created_at < day_end)
            .where(Issue.confidence.isnot(None))
        )

        daily_breakdown.append(DailyStats(
            date=day_start.strftime("%Y-%m-%d"),
            total=day_total.scalar() or 0,
            analyzed=day_analyzed.scalar() or 0,
            prs_created=day_prs.scalar() or 0,
            avg_confidence=day_conf.scalar() or 0
        ))

    return ReportSummary(
        period_start=start_date,
        period_end=end_date,
        total_issues=total,
        analyzed_issues=analyzed,
        prs_created=prs,
        avg_confidence=avg_confidence,
        avg_analysis_time_hours=None,  # TODO: calculate
        top_sources=top_sources,
        daily_breakdown=daily_breakdown
    )


@router.get("/export/csv")
async def export_csv(
    org_id: int,
    days: int = 30,
    member: OrganizationMember = Depends(require_role("manager")),
    db: AsyncSession = Depends(get_db)
):
    """Export issues as CSV."""
    from fastapi.responses import StreamingResponse
    import io
    import csv

    start_date = datetime.utcnow() - timedelta(days=days)

    result = await db.execute(
        select(Issue)
        .where(Issue.organization_id == org_id)
        .where(Issue.created_at >= start_date)
        .order_by(Issue.created_at.desc())
    )
    issues = result.scalars().all()

    output = io.StringIO()
    writer = csv.writer(output)
    writer.writerow([
        "ID", "Key", "Title", "Source", "Status", "Priority",
        "Confidence", "PR URL", "Created At", "Analyzed At"
    ])

    for issue in issues:
        writer.writerow([
            issue.id,
            issue.external_key,
            issue.title,
            issue.source,
            issue.status.value if issue.status else "",
            issue.priority.value if issue.priority else "",
            f"{issue.confidence:.0%}" if issue.confidence else "",
            issue.pr_url or "",
            issue.created_at.isoformat() if issue.created_at else "",
            issue.analysis_completed_at.isoformat() if issue.analysis_completed_at else ""
        ])

    output.seek(0)
    return StreamingResponse(
        iter([output.getvalue()]),
        media_type="text/csv",
        headers={"Content-Disposition": f"attachment; filename=issues-{datetime.utcnow().strftime('%Y%m%d')}.csv"}
    )
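The daily breakdown in `get_report_summary` issues four queries per day (up to 120 round trips for a 30-day window). The same per-day figures can be derived in one pass over rows fetched once for the whole period. A sketch of the in-Python aggregation, assuming a hypothetical `(created_at, analyzed, has_pr, confidence)` tuple per issue:

```python
from collections import defaultdict


def bucket_by_day(rows):
    """Aggregate per-issue tuples into per-day counters.

    `rows` is assumed to be (created_at: datetime, analyzed: bool,
    has_pr: bool, confidence: float | None) — not the project's actual
    query shape, just an illustration of the single-pass alternative.
    """
    days = defaultdict(lambda: {"total": 0, "analyzed": 0, "prs": 0, "conf": []})
    for created_at, analyzed, has_pr, confidence in rows:
        d = days[created_at.strftime("%Y-%m-%d")]
        d["total"] += 1
        d["analyzed"] += int(analyzed)
        d["prs"] += int(has_pr)
        if confidence is not None:
            d["conf"].append(confidence)
    return days
```

Each bucket's `avg_confidence` is then `sum(d["conf"]) / len(d["conf"])` when the list is non-empty, mirroring the SQL `AVG` over non-null values.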
@@ -0,0 +1,176 @@
"""Organization settings endpoints."""
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from pydantic import BaseModel
import httpx
import base64

from app.core.database import get_db
from app.models.organization import Organization, OrganizationMember
from app.api.deps import require_role

router = APIRouter()


class AIConfig(BaseModel):
    provider: str = "openrouter"
    apiKey: str = ""
    model: str = "meta-llama/llama-3.3-70b-instruct"
    autoAnalyze: bool = True
    autoCreatePR: bool = True
    confidenceThreshold: int = 70


class SettingsUpdate(BaseModel):
    ai_config: Optional[AIConfig] = None


class TestLLMRequest(BaseModel):
    provider: str
    api_key: str
    model: str


def encrypt_key(key: str) -> str:
    """Simple obfuscation - in production use proper encryption."""
    return base64.b64encode(key.encode()).decode()


def decrypt_key(encrypted: str) -> str:
    """Simple deobfuscation."""
    try:
        return base64.b64decode(encrypted.encode()).decode()
    except Exception:
        return ""


@router.get("/{org_id}/settings")
async def get_settings(
    org_id: int,
    member: OrganizationMember = Depends(require_role("viewer")),
    db: AsyncSession = Depends(get_db)
):
    """Get organization settings."""
    result = await db.execute(select(Organization).where(Organization.id == org_id))
    org = result.scalar_one_or_none()
    if not org:
        raise HTTPException(status_code=404, detail="Organization not found")

    return {
        "ai_config": {
            "provider": org.ai_provider or "openrouter",
            "apiKey": "***configured***" if org.ai_api_key_encrypted else "",
            "model": org.ai_model or "meta-llama/llama-3.3-70b-instruct",
            "autoAnalyze": org.ai_auto_analyze if org.ai_auto_analyze is not None else True,
            "autoCreatePR": org.ai_auto_create_pr if org.ai_auto_create_pr is not None else True,
            "confidenceThreshold": org.ai_confidence_threshold or 70,
        }
    }


@router.put("/{org_id}/settings")
async def update_settings(
    org_id: int,
    settings: SettingsUpdate,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Update organization settings."""
    result = await db.execute(select(Organization).where(Organization.id == org_id))
    org = result.scalar_one_or_none()
    if not org:
        raise HTTPException(status_code=404, detail="Organization not found")

    if settings.ai_config:
        org.ai_provider = settings.ai_config.provider
        org.ai_model = settings.ai_config.model
        org.ai_auto_analyze = settings.ai_config.autoAnalyze
        org.ai_auto_create_pr = settings.ai_config.autoCreatePR
        org.ai_confidence_threshold = settings.ai_config.confidenceThreshold

        # Only update key if provided and not masked
        if settings.ai_config.apiKey and settings.ai_config.apiKey != "***configured***":
            org.ai_api_key_encrypted = encrypt_key(settings.ai_config.apiKey)

    await db.commit()
    return {"message": "Settings updated"}


@router.post("/{org_id}/test-llm")
async def test_llm_connection(
    org_id: int,
    request: TestLLMRequest,
    member: OrganizationMember = Depends(require_role("admin")),
    db: AsyncSession = Depends(get_db)
):
    """Test LLM API connection."""
    # Build request based on provider
    if request.provider == "openrouter":
        url = "https://openrouter.ai/api/v1/chat/completions"
        headers = {
            "Authorization": f"Bearer {request.api_key}",
            "Content-Type": "application/json",
            "HTTP-Referer": "https://jira-fixer.startdata.com.br",
            "X-Title": "JIRA AI Fixer"
        }
        payload = {
            "model": request.model,
            "messages": [{"role": "user", "content": "Say 'OK' if you can read this."}],
            "max_tokens": 10
        }
    elif request.provider == "anthropic":
        url = "https://api.anthropic.com/v1/messages"
        headers = {
            "x-api-key": request.api_key,
            "Content-Type": "application/json",
            "anthropic-version": "2023-06-01"
        }
        payload = {
            "model": request.model,
            "max_tokens": 10,
            "messages": [{"role": "user", "content": "Say 'OK' if you can read this."}]
        }
    elif request.provider == "openai":
        url = "https://api.openai.com/v1/chat/completions"
        headers = {
            "Authorization": f"Bearer {request.api_key}",
            "Content-Type": "application/json"
        }
        payload = {
            "model": request.model,
            "messages": [{"role": "user", "content": "Say 'OK' if you can read this."}],
            "max_tokens": 10
        }
    elif request.provider == "groq":
        url = "https://api.groq.com/openai/v1/chat/completions"
        headers = {
            "Authorization": f"Bearer {request.api_key}",
            "Content-Type": "application/json"
        }
        payload = {
            "model": request.model,
            "messages": [{"role": "user", "content": "Say 'OK' if you can read this."}],
            "max_tokens": 10
        }
    elif request.provider == "google":
        url = f"https://generativelanguage.googleapis.com/v1beta/models/{request.model}:generateContent?key={request.api_key}"
        headers = {"Content-Type": "application/json"}
        payload = {
            "contents": [{"parts": [{"text": "Say 'OK' if you can read this."}]}],
            "generationConfig": {"maxOutputTokens": 10}
        }
    else:
        raise HTTPException(status_code=400, detail=f"Unsupported provider: {request.provider}")

    try:
        async with httpx.AsyncClient() as client:
            response = await client.post(url, headers=headers, json=payload, timeout=15.0)

            if response.status_code == 200:
                return {"success": True, "message": "Connection successful"}
            elif response.status_code == 401:
                raise HTTPException(status_code=400, detail="Invalid API key")
            elif response.status_code == 403:
                raise HTTPException(status_code=400, detail="API key lacks permissions")
            else:
                error_detail = response.json().get("error", {}).get("message", response.text[:200])
                raise HTTPException(status_code=400, detail=f"API error: {error_detail}")
    except httpx.TimeoutException:
        raise HTTPException(status_code=400, detail="Connection timeout")
    except httpx.ConnectError:
        raise HTTPException(status_code=400, detail="Could not connect to API")
@@ -0,0 +1,33 @@
"""User management endpoints."""
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from app.core.database import get_db
from app.core.security import get_password_hash
from app.models.user import User
from app.schemas.user import UserRead, UserUpdate
from app.api.deps import get_current_user

router = APIRouter()


@router.get("/me", response_model=UserRead)
async def get_me(user: User = Depends(get_current_user)):
    """Get current user profile."""
    return user


@router.patch("/me", response_model=UserRead)
async def update_me(
    user_update: UserUpdate,
    user: User = Depends(get_current_user),
    db: AsyncSession = Depends(get_db)
):
    """Update current user profile."""
    if user_update.email:
        user.email = user_update.email
    if user_update.full_name:
        user.full_name = user_update.full_name
    if user_update.avatar_url:
        user.avatar_url = user_update.avatar_url
    if user_update.password:
        user.hashed_password = get_password_hash(user_update.password)

    return user
@@ -0,0 +1,313 @@
"""Webhook endpoints for external integrations."""
from typing import Optional
from datetime import datetime
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks, Request, Header
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.database import get_db
from app.models.organization import Organization
from app.models.integration import Integration, IntegrationType, IntegrationStatus
from app.models.issue import Issue, IssueStatus, IssuePriority
import hmac
import hashlib

router = APIRouter()


def verify_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Verify webhook signature."""
    if not secret or not signature:
        return True  # Skip verification if no secret configured
    expected = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(f"sha256={expected}", signature)


async def process_webhook(
    org_id: int,
    integration_type: IntegrationType,
    payload: dict,
    background_tasks: BackgroundTasks,
    db: AsyncSession
) -> dict:
    """Process incoming webhook and create issue."""
    # Find integration
    result = await db.execute(
        select(Integration)
        .where(Integration.organization_id == org_id)
        .where(Integration.type == integration_type)
        .where(Integration.status == IntegrationStatus.ACTIVE)
    )
    integration = result.scalar_one_or_none()

    if not integration:
        return {"status": "ignored", "message": "No active integration found"}

    # Update integration stats
    integration.issues_processed = (integration.issues_processed or 0) + 1
    integration.last_sync_at = datetime.utcnow()

    # Normalize payload based on type
    issue_data = normalize_payload(integration_type, payload)
    if not issue_data:
        return {"status": "ignored", "message": "Event not processed"}

    # Create issue
    issue = Issue(
        organization_id=org_id,
        integration_id=integration.id,
        external_id=issue_data.get("external_id"),
        external_key=issue_data.get("external_key"),
        external_url=issue_data.get("external_url"),
        source=integration_type.value,
        title=issue_data.get("title"),
        description=issue_data.get("description"),
        priority=IssuePriority(issue_data.get("priority", "medium")),
        labels=issue_data.get("labels"),
        callback_url=issue_data.get("callback_url") or integration.callback_url,
        raw_payload=payload
    )
    db.add(issue)
    await db.flush()

    # Queue analysis if auto_analyze enabled
    if integration.auto_analyze:
        from app.api.issues import run_analysis
        from app.core.config import settings
        background_tasks.add_task(
            run_analysis,
            issue.id,
            settings.DATABASE_URL.replace("postgresql://", "postgresql+asyncpg://")
        )

    return {"status": "accepted", "issue_id": issue.id}


def normalize_payload(integration_type: IntegrationType, payload: dict) -> Optional[dict]:
    """Normalize webhook payload to common format."""

    if integration_type == IntegrationType.JIRA_CLOUD:
        event = payload.get("webhookEvent", "")
        if "issue_created" not in event:
            return None
        issue = payload.get("issue", {})
        fields = issue.get("fields", {})
        return {
            "external_id": str(issue.get("id")),
            "external_key": issue.get("key"),
            "external_url": f"{payload.get('issue', {}).get('self', '').split('/rest/')[0]}/browse/{issue.get('key')}",
            "title": fields.get("summary"),
            "description": fields.get("description"),
            "priority": normalize_priority(fields.get("priority", {}).get("name")),
            "labels": fields.get("labels", [])
        }

    elif integration_type == IntegrationType.SERVICENOW:
        return {
            "external_id": payload.get("sys_id"),
            "external_key": payload.get("number"),
            "external_url": payload.get("url"),
            "title": payload.get("short_description"),
            "description": payload.get("description"),
            "priority": normalize_priority(payload.get("priority")),
            "callback_url": payload.get("callback_url")
        }

    elif integration_type == IntegrationType.ZENDESK:
        ticket = payload.get("ticket", payload)
        return {
            "external_id": str(ticket.get("id")),
            "external_key": f"ZD-{ticket.get('id')}",
            "external_url": ticket.get("url"),
            "title": ticket.get("subject"),
            "description": ticket.get("description"),
            "priority": normalize_priority(ticket.get("priority")),
            "labels": ticket.get("tags", [])
        }

    elif integration_type == IntegrationType.GITHUB:
        action = payload.get("action")
        if action != "opened":
            return None
        issue = payload.get("issue", {})
        return {
            "external_id": str(issue.get("id")),
            "external_key": f"GH-{issue.get('number')}",
            "external_url": issue.get("html_url"),
            "title": issue.get("title"),
            "description": issue.get("body"),
            "priority": "medium",
            "labels": [l.get("name") for l in issue.get("labels", [])]
        }

    elif integration_type == IntegrationType.GITLAB:
        event = payload.get("object_kind")
        if event != "issue":
            return None
        attrs = payload.get("object_attributes", {})
        if attrs.get("action") != "open":
            return None
        return {
            "external_id": str(attrs.get("id")),
            "external_key": f"GL-{attrs.get('iid')}",
            "external_url": attrs.get("url"),
            "title": attrs.get("title"),
            "description": attrs.get("description"),
            "priority": "medium",
            "labels": payload.get("labels", [])
        }

    elif integration_type == IntegrationType.TICKETHUB:
        event = payload.get("event", "")
        if "created" not in event:
            return None
        data = payload.get("data", payload)
        return {
            "external_id": str(data.get("id")),
            "external_key": data.get("key"),
            "external_url": f"https://tickethub.startdata.com.br/tickets/{data.get('id')}",
            "title": data.get("title"),
            "description": data.get("description"),
            "priority": normalize_priority(data.get("priority")),
            "labels": data.get("labels", [])
        }

    elif integration_type == IntegrationType.GITEA:
        action = payload.get("action")
        if action != "opened":
            return None
        issue = payload.get("issue", {})
        repo = payload.get("repository", {})
        return {
            "external_id": str(issue.get("id")),
            "external_key": f"GITEA-{issue.get('number')}",
            "external_url": issue.get("html_url"),
            "title": issue.get("title"),
            "description": issue.get("body"),
            "priority": "medium",
            "labels": [l.get("name") for l in issue.get("labels", [])],
            "repo": repo.get("full_name")
        }

    return None


def normalize_priority(priority: Optional[str]) -> str:
    """Normalize priority to standard values."""
    if not priority:
        return "medium"
    priority = str(priority).lower()
    if priority in ("1", "critical", "highest", "urgent"):
        return "critical"
    elif priority in ("2", "high"):
        return "high"
    elif priority in ("3", "medium", "normal"):
        return "medium"
    else:
        return "low"


# Webhook endpoints for each integration type
@router.post("/{org_id}/jira")
async def webhook_jira(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.JIRA_CLOUD, payload, background_tasks, db)


@router.post("/{org_id}/servicenow")
async def webhook_servicenow(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.SERVICENOW, payload, background_tasks, db)


@router.post("/{org_id}/zendesk")
async def webhook_zendesk(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.ZENDESK, payload, background_tasks, db)


@router.post("/{org_id}/github")
async def webhook_github(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    x_github_event: Optional[str] = Header(None),
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    if x_github_event != "issues":
        return {"status": "ignored", "message": "Not an issues event"}
    return await process_webhook(org_id, IntegrationType.GITHUB, payload, background_tasks, db)


@router.post("/{org_id}/gitlab")
async def webhook_gitlab(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.GITLAB, payload, background_tasks, db)


@router.post("/{org_id}/tickethub")
async def webhook_tickethub(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.TICKETHUB, payload, background_tasks, db)


@router.post("/{org_id}/gitea")
async def webhook_gitea(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    payload = await request.json()
    return await process_webhook(org_id, IntegrationType.GITEA, payload, background_tasks, db)


@router.post("/{org_id}/generic")
async def webhook_generic(
    org_id: int,
    request: Request,
    background_tasks: BackgroundTasks,
    db: AsyncSession = Depends(get_db)
):
    """Generic webhook for custom integrations."""
    payload = await request.json()

    # Direct mapping
    issue = Issue(
        organization_id=org_id,
        external_id=str(payload.get("id")),
        external_key=payload.get("key"),
        external_url=payload.get("url"),
        source=payload.get("source", "generic"),
        title=payload.get("title"),
        description=payload.get("description"),
        priority=IssuePriority(normalize_priority(payload.get("priority"))),
        labels=payload.get("labels"),
        callback_url=payload.get("callback_url"),
        raw_payload=payload
    )
    db.add(issue)
    await db.flush()

    from app.api.issues import run_analysis
    from app.core.config import settings
    background_tasks.add_task(
        run_analysis,
        issue.id,
        settings.DATABASE_URL.replace("postgresql://", "postgresql+asyncpg://")
    )

    return {"status": "accepted", "issue_id": issue.id}
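For reference, the `sha256=` signature that `verify_signature` checks can be produced with the standard library alone. This is a minimal sketch of the sender's side of that scheme (the payload and secret here are made-up illustration values; the `X-Hub-Signature-256` header name is the GitHub convention and an assumption about how a caller would transport it):

```python
import hashlib
import hmac


def sign_payload(payload: bytes, secret: str) -> str:
    # Same HMAC-SHA256 scheme verify_signature expects: "sha256=<hex digest>"
    digest = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    return f"sha256={digest}"


body = b'{"action": "opened"}'
signature = sign_payload(body, "webhook-secret")  # e.g. sent as X-Hub-Signature-256

# Receiver side mirrors verify_signature from the module above
expected = hmac.new(b"webhook-secret", body, hashlib.sha256).hexdigest()
assert hmac.compare_digest(f"sha256={expected}", signature)
```

Note that signing happens over the raw request bytes, which is why `verify_signature` takes `payload: bytes` rather than the parsed JSON: re-serializing the dict would not reproduce the exact bytes that were signed.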
@@ -0,0 +1,47 @@
"""Application configuration."""
import os
from functools import lru_cache
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # App
    APP_NAME: str = "JIRA AI Fixer"
    APP_VERSION: str = "2.0.0"
    DEBUG: bool = False
    SECRET_KEY: str = os.getenv("SECRET_KEY", "change-me-in-production-use-openssl-rand-hex-32")

    # Database
    DATABASE_URL: str = os.getenv("DATABASE_URL", "postgresql://postgres:postgres@postgres_database:5432/jira_fixer_v2")

    # Redis
    REDIS_URL: str = os.getenv("REDIS_URL", "redis://redis_redis:6379/0")

    # JWT
    JWT_SECRET: str = os.getenv("JWT_SECRET", "jwt-secret-change-in-production")
    JWT_ALGORITHM: str = "HS256"
    JWT_EXPIRE_MINUTES: int = 60 * 24  # 24 hours
    JWT_REFRESH_DAYS: int = 7

    # Email (Resend)
    RESEND_API_KEY: str = os.getenv("RESEND_API_KEY", "")
    EMAIL_FROM: str = os.getenv("EMAIL_FROM", "JIRA AI Fixer <noreply@startdata.com.br>")

    # External APIs
    OPENROUTER_API_KEY: str = os.getenv("OPENROUTER_API_KEY", "")
    GITEA_URL: str = os.getenv("GITEA_URL", "https://gitea.startdata.com.br")
    GITEA_TOKEN: str = os.getenv("GITEA_TOKEN", "")

    # OAuth (for integrations)
    JIRA_CLIENT_ID: str = os.getenv("JIRA_CLIENT_ID", "")
    JIRA_CLIENT_SECRET: str = os.getenv("JIRA_CLIENT_SECRET", "")
    GITHUB_CLIENT_ID: str = os.getenv("GITHUB_CLIENT_ID", "")
    GITHUB_CLIENT_SECRET: str = os.getenv("GITHUB_CLIENT_SECRET", "")

    class Config:
        env_file = ".env"


@lru_cache()
def get_settings() -> Settings:
    return Settings()


settings = get_settings()
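The `@lru_cache()` on `get_settings` makes configuration a process-wide singleton: the environment is read once, and every later call returns the same object. A dependency-free sketch of that pattern (the `DemoSettings` class and `APP_NAME`/`DEBUG` variable names here are illustrative stand-ins, not the app's actual settings class):

```python
import os
from dataclasses import dataclass
from functools import lru_cache


@dataclass(frozen=True)
class DemoSettings:
    app_name: str
    debug: bool


@lru_cache()
def get_demo_settings() -> DemoSettings:
    # The environment is consulted only on the first call;
    # subsequent calls return the cached instance.
    return DemoSettings(
        app_name=os.getenv("APP_NAME", "JIRA AI Fixer"),
        debug=os.getenv("DEBUG", "0") == "1",
    )


assert get_demo_settings() is get_demo_settings()  # same cached object
```

One consequence of the cache: changing an environment variable after the first call has no effect until the process restarts (or `get_demo_settings.cache_clear()` is invoked), which is usually the desired behavior for server config.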
@@ -0,0 +1,37 @@
"""Database setup with SQLAlchemy async."""
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
from sqlalchemy.orm import declarative_base
from .config import settings

# Convert sync URL to async
DATABASE_URL = settings.DATABASE_URL.replace("postgresql://", "postgresql+asyncpg://")

engine = create_async_engine(
    DATABASE_URL,
    echo=settings.DEBUG,
    pool_size=10,
    max_overflow=20,
    pool_pre_ping=True,  # Test connection before using
    pool_recycle=3600  # Recycle connections after 1 hour
)
async_session = async_sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

Base = declarative_base()


async def get_db() -> AsyncSession:
    async with async_session() as session:
        try:
            yield session
            await session.commit()
        except Exception:
            await session.rollback()
            raise
        finally:
            await session.close()


async def init_db():
    # Import all models here to ensure they are registered with Base.metadata
    from app.models import User, Organization, OrganizationMember, Integration, Issue, AuditLog  # noqa

    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
@@ -0,0 +1,53 @@
"""Security utilities - JWT, password hashing, RBAC."""
from datetime import datetime, timedelta
from typing import Optional, Any
from jose import jwt, JWTError
from passlib.context import CryptContext
from .config import settings

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


# Roles hierarchy
class Role:
    VIEWER = "viewer"
    ANALYST = "analyst"
    MANAGER = "manager"
    ADMIN = "admin"
    OWNER = "owner"


ROLE_HIERARCHY = {
    Role.VIEWER: 1,
    Role.ANALYST: 2,
    Role.MANAGER: 3,
    Role.ADMIN: 4,
    Role.OWNER: 5,
}


def verify_password(plain_password: str, hashed_password: str) -> bool:
    return pwd_context.verify(plain_password, hashed_password)


def get_password_hash(password: str) -> str:
    return pwd_context.hash(password)


def create_access_token(data: dict, expires_delta: Optional[timedelta] = None) -> str:
    to_encode = data.copy()
    expire = datetime.utcnow() + (expires_delta or timedelta(minutes=settings.JWT_EXPIRE_MINUTES))
    to_encode.update({"exp": expire, "type": "access"})
    return jwt.encode(to_encode, settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM)


def create_refresh_token(data: dict) -> str:
    to_encode = data.copy()
    expire = datetime.utcnow() + timedelta(days=settings.JWT_REFRESH_DAYS)
    to_encode.update({"exp": expire, "type": "refresh"})
    return jwt.encode(to_encode, settings.JWT_SECRET, algorithm=settings.JWT_ALGORITHM)


def decode_token(token: str) -> Optional[dict]:
    try:
        payload = jwt.decode(token, settings.JWT_SECRET, algorithms=[settings.JWT_ALGORITHM])
        return payload
    except JWTError:
        return None


def has_permission(user_role: str, required_role: str) -> bool:
    """Check if user_role has at least the required_role level."""
    return ROLE_HIERARCHY.get(user_role, 0) >= ROLE_HIERARCHY.get(required_role, 0)
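Under the hood, `jwt.encode(..., algorithm="HS256")` produces `base64url(header).base64url(claims)` and appends an HMAC-SHA256 signature over that string. A dependency-free sketch of just the signing step, for intuition (illustrative only; it is not a substitute for `python-jose`, and it omits `exp` handling and the claim validation `decode_token` relies on):

```python
import base64
import hashlib
import hmac
import json


def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def hs256_encode(claims: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = f"{header}.{payload}".encode()
    sig = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    return f"{header}.{payload}.{sig}"


token = hs256_encode({"sub": "user-1", "type": "access"}, "jwt-secret")
assert token.count(".") == 2  # header.payload.signature
```

This also makes the failure mode in `decode_token` concrete: verification recomputes the HMAC with `JWT_SECRET`, so a token signed with a different secret (or tampered claims) fails the signature check and `jwt.decode` raises `JWTError`.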
@@ -0,0 +1,95 @@
"""JIRA AI Fixer - Enterprise Issue Analysis Platform."""
from contextlib import asynccontextmanager
from fastapi import FastAPI, Request, HTTPException
from fastapi.middleware.cors import CORSMiddleware
from fastapi.staticfiles import StaticFiles
from fastapi.responses import FileResponse
from starlette.middleware.base import BaseHTTPMiddleware
import os

from app.core.config import settings
from app.core.database import init_db
from app.api import api_router


class HTTPSRedirectMiddleware(BaseHTTPMiddleware):
    """Force HTTPS in redirects when behind reverse proxy."""
    async def dispatch(self, request: Request, call_next):
        response = await call_next(request)
        if response.status_code in (301, 302, 303, 307, 308):
            location = response.headers.get("location", "")
            if location.startswith("http://"):
                response.headers["location"] = location.replace("http://", "https://", 1)
        return response


@asynccontextmanager
async def lifespan(app: FastAPI):
    await init_db()
    yield


app = FastAPI(
    title=settings.APP_NAME,
    version=settings.APP_VERSION,
    description="Enterprise AI-powered issue analysis and automated fix generation",
    docs_url="/api/docs",
    redoc_url="/api/redoc",
    openapi_url="/api/openapi.json",
    lifespan=lifespan
)

app.add_middleware(HTTPSRedirectMiddleware)
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)

# FIRST: API routes (highest priority)
app.include_router(api_router, prefix="/api")


# Health check (explicit, not in router)
@app.get("/api/health")
async def health():
    return {
        "status": "healthy",
        "service": "jira-ai-fixer",
        "version": settings.APP_VERSION
    }


# SECOND: Static files
FRONTEND_DIR = "/app/frontend"
ASSETS_DIR = f"{FRONTEND_DIR}/assets"

if os.path.exists(ASSETS_DIR):
    app.mount("/assets", StaticFiles(directory=ASSETS_DIR), name="assets")


# THIRD: Frontend routes (AFTER API)
@app.get("/")
async def serve_root():
    if os.path.exists(f"{FRONTEND_DIR}/index.html"):
        return FileResponse(f"{FRONTEND_DIR}/index.html")
    return {
        "service": settings.APP_NAME,
        "version": settings.APP_VERSION,
        "docs": "/api/docs",
        "health": "/api/health"
    }


# LAST: SPA catch-all (exclude api/*)
@app.get("/{path:path}", include_in_schema=False)
async def serve_spa(path: str):
    # NEVER capture API routes
    if path.startswith("api"):
        raise HTTPException(status_code=404, detail="API route not found")

    # Try to serve static file
    file_path = f"{FRONTEND_DIR}/{path}"
    if os.path.exists(file_path) and os.path.isfile(file_path):
        return FileResponse(file_path)

    # Fallback to index.html for SPA routing
    if os.path.exists(f"{FRONTEND_DIR}/index.html"):
        return FileResponse(f"{FRONTEND_DIR}/index.html")

    raise HTTPException(status_code=404, detail="Not found")
@@ -0,0 +1,5 @@
from .user import User
from .organization import Organization, OrganizationMember
from .integration import Integration
from .issue import Issue
from .audit import AuditLog
@@ -0,0 +1,36 @@
"""Audit log for compliance and tracking."""
from datetime import datetime
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, JSON, Text
from sqlalchemy.orm import relationship
from app.core.database import Base


class AuditLog(Base):
    __tablename__ = "audit_logs"

    id = Column(Integer, primary_key=True, index=True)
    organization_id = Column(Integer, ForeignKey("organizations.id"))
    user_id = Column(Integer, ForeignKey("users.id"))

    # Action details
    action = Column(String(100), nullable=False, index=True)  # user.login, issue.created, integration.updated
    resource_type = Column(String(50))  # user, issue, integration, etc
    resource_id = Column(Integer)

    # Context
    ip_address = Column(String(45))
    user_agent = Column(String(500))

    # Changes
    old_values = Column(JSON)
    new_values = Column(JSON)
    description = Column(Text)

    # Status
    success = Column(String(10), default="success")  # success, failure
    error_message = Column(String(500))

    created_at = Column(DateTime, default=datetime.utcnow, index=True)

    # Relationships
    organization = relationship("Organization", back_populates="audit_logs")
    user = relationship("User", back_populates="audit_logs")
@@ -0,0 +1,59 @@
"""Integration model."""
from datetime import datetime
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Enum, Boolean, Text, JSON
from sqlalchemy.orm import relationship
from app.core.database import Base
import enum


class IntegrationType(str, enum.Enum):
    JIRA_CLOUD = "jira_cloud"
    JIRA_SERVER = "jira_server"
    SERVICENOW = "servicenow"
    ZENDESK = "zendesk"
    GITHUB = "github"
    GITLAB = "gitlab"
    AZURE_DEVOPS = "azure_devops"
    TICKETHUB = "tickethub"
    GITEA = "gitea"
    CUSTOM_WEBHOOK = "custom_webhook"


class IntegrationStatus(str, enum.Enum):
    ACTIVE = "active"
    INACTIVE = "inactive"
    ERROR = "error"


class Integration(Base):
    __tablename__ = "integrations"

    id = Column(Integer, primary_key=True, index=True)
    organization_id = Column(Integer, ForeignKey("organizations.id"), nullable=False)

    name = Column(String(255), nullable=False)
    type = Column(Enum(IntegrationType), nullable=False)
    status = Column(Enum(IntegrationStatus), default=IntegrationStatus.ACTIVE)

    # Config
    base_url = Column(String(1024))
    api_key = Column(Text)  # Encrypted
    oauth_token = Column(Text)
    webhook_secret = Column(String(255))
    callback_url = Column(String(1024))
    config = Column(JSON, default=dict)  # Additional config as JSON

    # Stats
    issues_processed = Column(Integer, default=0)
    last_sync_at = Column(DateTime)
    last_error = Column(Text)

    # Settings
    auto_analyze = Column(Boolean, default=True)

    created_at = Column(DateTime, default=datetime.utcnow)

    # Relations
    organization = relationship("Organization", back_populates="integrations")
    issues = relationship("Issue", back_populates="integration")

    @property
    def webhook_url(self) -> str:
        return f"https://jira-fixer.startdata.com.br/api/webhook/{self.organization_id}/{self.type.value}"
@@ -0,0 +1,79 @@
"""Issue model."""
from datetime import datetime
from typing import Optional, List
from sqlalchemy import Column, Integer, String, Text, DateTime, Float, ForeignKey, Enum, JSON
from sqlalchemy.orm import relationship
from app.core.database import Base
import enum


class IssueStatus(str, enum.Enum):
    PENDING = "pending"
    ANALYZING = "analyzing"
    ANALYZED = "analyzed"
    PR_CREATED = "pr_created"
    COMPLETED = "completed"
    ERROR = "error"


class IssuePriority(str, enum.Enum):
    CRITICAL = "critical"
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"


class Issue(Base):
    __tablename__ = "issues"

    id = Column(Integer, primary_key=True, index=True)
    organization_id = Column(Integer, ForeignKey("organizations.id"), nullable=False)
    integration_id = Column(Integer, ForeignKey("integrations.id"), nullable=True)

    # External reference
    external_id = Column(String(255), index=True)
    external_key = Column(String(100), index=True)  # JIRA-123, INC0001234
    external_url = Column(String(1024))
    source = Column(String(50))  # jira_cloud, servicenow, etc

    # Issue data
    title = Column(String(500), nullable=False)
    description = Column(Text)
    priority = Column(Enum(IssuePriority), default=IssuePriority.MEDIUM)
    labels = Column(JSON)

    # Analysis
    status = Column(Enum(IssueStatus), default=IssueStatus.PENDING)
    root_cause = Column(Text)
    suggested_fix = Column(Text)
    affected_files = Column(JSON)
    confidence = Column(Float)
    analysis_completed_at = Column(DateTime)
    error_message = Column(Text)

    # PR
    pr_url = Column(String(1024))
    pr_branch = Column(String(255))

    # Callback
    callback_url = Column(String(1024))
    callback_sent = Column(DateTime)

    # Meta
    raw_payload = Column(JSON)
    created_at = Column(DateTime, default=datetime.utcnow)
    updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)

    # Relations
    organization = relationship("Organization", back_populates="issues")
    integration = relationship("Integration", back_populates="issues")
    comments = relationship("IssueComment", back_populates="issue", cascade="all, delete-orphan")


class IssueComment(Base):
    __tablename__ = "issue_comments"

    id = Column(Integer, primary_key=True, index=True)
    issue_id = Column(Integer, ForeignKey("issues.id"), nullable=False)
    user_id = Column(Integer, ForeignKey("users.id"), nullable=True)
    content = Column(Text, nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)

    # Relations
    issue = relationship("Issue", back_populates="comments")
@@ -0,0 +1,51 @@
"""Organization model."""
from datetime import datetime
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Enum, Text, Boolean, JSON
from sqlalchemy.orm import relationship
from app.core.database import Base
import enum


class MemberRole(str, enum.Enum):
    VIEWER = "viewer"
    ANALYST = "analyst"
    MANAGER = "manager"
    ADMIN = "admin"
    OWNER = "owner"


class Organization(Base):
    __tablename__ = "organizations"

    id = Column(Integer, primary_key=True, index=True)
    name = Column(String(255), nullable=False)
    slug = Column(String(100), unique=True, nullable=False, index=True)
    created_at = Column(DateTime, default=datetime.utcnow)

    # AI Configuration
    ai_provider = Column(String(50), default="openrouter")
    ai_api_key_encrypted = Column(Text, nullable=True)
    ai_model = Column(String(100), default="meta-llama/llama-3.3-70b-instruct")
    ai_auto_analyze = Column(Boolean, default=True)
    ai_auto_create_pr = Column(Boolean, default=True)
    ai_confidence_threshold = Column(Integer, default=70)

    # Settings JSON for extensibility
    settings = Column(JSON, default=dict)

    # Relations
    members = relationship("OrganizationMember", back_populates="organization", cascade="all, delete-orphan")
    integrations = relationship("Integration", back_populates="organization", cascade="all, delete-orphan")
    issues = relationship("Issue", back_populates="organization", cascade="all, delete-orphan")
    audit_logs = relationship("AuditLog", back_populates="organization")


class OrganizationMember(Base):
    __tablename__ = "organization_members"

    id = Column(Integer, primary_key=True, index=True)
    organization_id = Column(Integer, ForeignKey("organizations.id"), nullable=False)
    user_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    role = Column(Enum(MemberRole), default=MemberRole.VIEWER)
    joined_at = Column(DateTime, default=datetime.utcnow)

    # Relations
    organization = relationship("Organization", back_populates="members")
    user = relationship("User", back_populates="memberships")
@@ -0,0 +1,23 @@
"""User model."""
from datetime import datetime
from sqlalchemy import Column, Integer, String, Boolean, DateTime
from sqlalchemy.orm import relationship
from app.core.database import Base


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    email = Column(String(255), unique=True, index=True, nullable=False)
    hashed_password = Column(String(255), nullable=False)
    full_name = Column(String(255))
    avatar_url = Column(String(500))
    is_active = Column(Boolean, default=True)
    is_superuser = Column(Boolean, default=False)
    email_verified = Column(Boolean, default=False)
    created_at = Column(DateTime, default=datetime.utcnow)
    last_login = Column(DateTime)

    # Relations
    memberships = relationship("OrganizationMember", back_populates="user", cascade="all, delete-orphan")
    audit_logs = relationship("AuditLog", back_populates="user")
@@ -0,0 +1,4 @@
from .user import UserCreate, UserRead, UserUpdate, Token, TokenData
from .organization import OrganizationCreate, OrganizationRead, OrganizationUpdate, MemberCreate, MemberRead
from .integration import IntegrationCreate, IntegrationRead, IntegrationUpdate
from .issue import IssueCreate, IssueRead, IssueUpdate, IssueStats
@@ -0,0 +1,55 @@
"""Integration schemas."""
from datetime import datetime
from typing import Optional, List, Dict, Any
from pydantic import BaseModel
from app.models.integration import IntegrationType, IntegrationStatus


class IntegrationBase(BaseModel):
    name: str
    type: IntegrationType


class IntegrationCreate(IntegrationBase):
    base_url: Optional[str] = None
    auth_type: str = "api_key"
    api_key: Optional[str] = None
    api_secret: Optional[str] = None
    webhook_url: Optional[str] = None
    callback_url: Optional[str] = None
    auto_analyze: bool = True
    sync_comments: bool = True
    create_prs: bool = True
    repositories: Optional[List[Dict[str, str]]] = None
    config: Optional[Dict[str, Any]] = None


class IntegrationUpdate(BaseModel):
    name: Optional[str] = None
    base_url: Optional[str] = None
    api_key: Optional[str] = None
    api_secret: Optional[str] = None
    callback_url: Optional[str] = None
    auto_analyze: Optional[bool] = None
    sync_comments: Optional[bool] = None
    create_prs: Optional[bool] = None
    repositories: Optional[List[Dict[str, str]]] = None
    status: Optional[IntegrationStatus] = None
    config: Optional[Dict[str, Any]] = None


class IntegrationRead(IntegrationBase):
    id: int
    organization_id: int
    status: IntegrationStatus
    base_url: Optional[str] = None
    webhook_url: Optional[str] = None
    auto_analyze: bool
    issues_processed: Optional[int] = 0
    last_sync_at: Optional[datetime] = None
    last_error: Optional[str] = None
    config: Optional[Dict[str, Any]] = None
    created_at: datetime

    class Config:
        from_attributes = True


class OAuthCallback(BaseModel):
    code: str
    state: str
@@ -0,0 +1,74 @@
"""Issue schemas."""
from datetime import datetime
from typing import Optional, List, Dict, Any
from pydantic import BaseModel
from app.models.issue import IssueStatus, IssuePriority


class IssueBase(BaseModel):
    title: str
    description: Optional[str] = None
    priority: IssuePriority = IssuePriority.MEDIUM


class IssueCreate(IssueBase):
    external_id: Optional[str] = None
    external_key: Optional[str] = None
    external_url: Optional[str] = None
    source: Optional[str] = None
    labels: Optional[List[str]] = None
    callback_url: Optional[str] = None
    raw_payload: Optional[Dict[str, Any]] = None


class IssueUpdate(BaseModel):
    title: Optional[str] = None
    description: Optional[str] = None
    priority: Optional[IssuePriority] = None
    status: Optional[IssueStatus] = None
    labels: Optional[List[str]] = None


class IssueRead(IssueBase):
    id: int
    organization_id: int
    integration_id: Optional[int] = None
    external_id: Optional[str] = None
    external_key: Optional[str] = None
    external_url: Optional[str] = None
    source: Optional[str] = None
    labels: Optional[List[str]] = None

    status: IssueStatus
    root_cause: Optional[str] = None
    affected_files: Optional[List[str]] = None
    suggested_fix: Optional[str] = None
    confidence: Optional[float] = None

    pr_url: Optional[str] = None
    pr_branch: Optional[str] = None
    pr_status: Optional[str] = None

    sla_deadline: Optional[datetime] = None
    sla_breached: bool = False

    created_at: datetime
    analysis_completed_at: Optional[datetime] = None

    class Config:
        from_attributes = True


class IssueStats(BaseModel):
    total: int
    pending: int
    analyzing: int
    analyzed: int
    pr_created: int
    completed: int
    error: int
    avg_confidence: float
    avg_analysis_time_seconds: Optional[float] = None
    by_source: Dict[str, int]
    by_priority: Dict[str, int]
    sla_breached: int


class IssueComment(BaseModel):
    author: str
    content: str
    author_type: str = "user"
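`IssueStats.by_source` and `by_priority` are plain count maps. A stdlib-only sketch of how such a stats payload might be aggregated from issue rows (the field names mirror the schema above; the aggregation logic itself is an assumption, not code from this diff):

```python
from collections import Counter

def build_stats(issues: list[dict]) -> dict:
    # Count issues per source/priority; missing sources fall into "unknown".
    by_source = Counter(i.get("source") or "unknown" for i in issues)
    by_priority = Counter(i.get("priority", "medium") for i in issues)
    # Average confidence over issues that were actually analyzed.
    confidences = [i["confidence"] for i in issues if i.get("confidence") is not None]
    return {
        "total": len(issues),
        "avg_confidence": sum(confidences) / len(confidences) if confidences else 0.0,
        "by_source": dict(by_source),
        "by_priority": dict(by_priority),
    }

issues = [
    {"source": "jira", "priority": "high", "confidence": 0.8},
    {"source": "jira", "priority": "medium", "confidence": 0.6},
    {"source": None, "priority": "medium"},
]
stats = build_stats(issues)
```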
@@ -0,0 +1,46 @@
"""Organization schemas."""
from datetime import datetime
from typing import Optional, List
from pydantic import BaseModel, EmailStr
from app.models.organization import MemberRole


class OrganizationBase(BaseModel):
    name: str
    description: Optional[str] = None


class OrganizationCreate(OrganizationBase):
    slug: Optional[str] = None  # Auto-generated if not provided


class OrganizationUpdate(BaseModel):
    name: Optional[str] = None
    description: Optional[str] = None
    logo_url: Optional[str] = None
    slack_webhook_url: Optional[str] = None
    teams_webhook_url: Optional[str] = None


class OrganizationRead(OrganizationBase):
    id: int
    slug: str
    logo_url: Optional[str] = None
    plan: str = "free"
    is_active: bool = True
    created_at: datetime
    member_count: Optional[int] = None

    class Config:
        from_attributes = True


class MemberCreate(BaseModel):
    email: EmailStr
    role: MemberRole = MemberRole.ANALYST


class MemberRead(BaseModel):
    id: int
    user_id: int
    role: MemberRole
    joined_at: datetime
    user_email: Optional[str] = None
    user_name: Optional[str] = None

    class Config:
        from_attributes = True
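`OrganizationCreate.slug` is auto-generated when not provided, but the generation rule is not part of this diff. One plausible stdlib-only sketch, assuming a conventional lowercase-hyphen slug capped at the 100-character limit of `Organization.slug`:

```python
import re

def slugify(name: str) -> str:
    # Lowercase, collapse runs of non-alphanumerics into single hyphens, trim edges.
    slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
    # Respect the String(100) column limit on Organization.slug.
    return slug[:100]
```

Uniqueness (the column is `unique=True`) would still need a suffix-retry loop at insert time.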
@@ -0,0 +1,41 @@
"""User schemas."""
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, EmailStr


class UserBase(BaseModel):
    email: EmailStr
    full_name: Optional[str] = None


class UserCreate(UserBase):
    password: str


class LoginRequest(BaseModel):
    email: EmailStr
    password: str


class UserUpdate(BaseModel):
    email: Optional[EmailStr] = None
    full_name: Optional[str] = None
    avatar_url: Optional[str] = None
    password: Optional[str] = None


class UserRead(UserBase):
    id: int
    avatar_url: Optional[str] = None
    is_active: bool
    email_verified: bool
    created_at: datetime
    last_login: Optional[datetime] = None

    class Config:
        from_attributes = True


class Token(BaseModel):
    access_token: str
    refresh_token: str
    token_type: str = "bearer"


class TokenData(BaseModel):
    user_id: int
    email: str
@@ -0,0 +1,3 @@
from .email import EmailService
from .analysis import AnalysisService
from .audit import AuditService
@@ -0,0 +1,348 @@
"""Analysis service - AI-powered issue analysis."""
import httpx
import json
import base64
from datetime import datetime
from typing import Optional, Dict, Any, List
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select
from app.core.config import settings
from app.models.organization import Organization


class AnalysisService:

    @classmethod
    def decrypt_key(cls, encrypted: str) -> str:
        """Simple deobfuscation."""
        try:
            return base64.b64decode(encrypted.encode()).decode()
        except Exception:
            return ""

    @classmethod
    async def get_org_ai_config(cls, db: AsyncSession, org_id: int) -> Dict[str, Any]:
        """Get AI configuration from organization settings."""
        result = await db.execute(select(Organization).where(Organization.id == org_id))
        org = result.scalar_one_or_none()

        if org and org.ai_api_key_encrypted:
            return {
                "provider": org.ai_provider or "openrouter",
                "api_key": cls.decrypt_key(org.ai_api_key_encrypted),
                "model": org.ai_model or "meta-llama/llama-3.3-70b-instruct",
                "auto_analyze": org.ai_auto_analyze if org.ai_auto_analyze is not None else True,
                "auto_create_pr": org.ai_auto_create_pr if org.ai_auto_create_pr is not None else True,
                "confidence_threshold": org.ai_confidence_threshold or 70,
            }

        # Fallback to env config
        return {
            "provider": "openrouter",
            "api_key": settings.OPENROUTER_API_KEY or "",
            "model": "meta-llama/llama-3.3-70b-instruct",
            "auto_analyze": True,
            "auto_create_pr": True,
            "confidence_threshold": 70,
        }

    @classmethod
    async def fetch_repository_files(cls, repo: str, path: str = "") -> List[Dict[str, str]]:
        """Fetch files from Gitea repository."""
        files = []
        async with httpx.AsyncClient() as client:
            try:
                url = f"{settings.GITEA_URL}/api/v1/repos/{repo}/contents/{path}"
                headers = {}
                if settings.GITEA_TOKEN:
                    headers["Authorization"] = f"token {settings.GITEA_TOKEN}"

                response = await client.get(url, headers=headers, timeout=30)
                if response.status_code != 200:
                    return files

                items = response.json()
                for item in items:
                    if item["type"] == "file" and item["name"].endswith((".cbl", ".cob", ".py", ".java", ".js", ".ts", ".tsx", ".jsx")):
                        content_resp = await client.get(item["download_url"], headers=headers, timeout=30)
                        if content_resp.status_code == 200:
                            files.append({
                                "path": item["path"],
                                "content": content_resp.text[:10000]  # Limit size
                            })
                    elif item["type"] == "dir":
                        sub_files = await cls.fetch_repository_files(repo, item["path"])
                        files.extend(sub_files)
            except Exception as e:
                print(f"Error fetching repo: {e}")

        return files[:20]  # Limit to 20 files

    @classmethod
    def build_prompt(cls, issue: Dict[str, Any], files: List[Dict[str, str]]) -> str:
        """Build analysis prompt for LLM."""
        files_context = "\n\n".join([
            f"### {f['path']}\n```\n{f['content']}\n```"
            for f in files
        ]) if files else "No source code files available."

        return f"""You are an expert software engineer analyzing a support issue.

## Issue Details
**Title:** {issue.get('title', 'N/A')}
**Description:** {issue.get('description', 'N/A')}
**Priority:** {issue.get('priority', 'N/A')}

## Source Code Files
{files_context}

## Your Task
Analyze the issue and identify:
1. Root cause of the problem
2. Which files are affected
3. Suggested code fix

## Response Format (JSON)
{{
    "root_cause": "Detailed explanation of what's causing the issue",
    "affected_files": ["file1.py", "file2.py"],
    "suggested_fix": "Code changes needed to fix the issue",
    "confidence": 0.85,
    "explanation": "Step-by-step explanation of the fix"
}}

Respond ONLY with valid JSON."""

    @classmethod
    async def call_llm(cls, prompt: str, ai_config: Dict[str, Any]) -> Dict[str, Any]:
        """Call the configured LLM provider."""
        provider = ai_config.get("provider", "openrouter")
        api_key = ai_config.get("api_key", "")
        model = ai_config.get("model", "meta-llama/llama-3.3-70b-instruct")

        if not api_key:
            return {
                "root_cause": "No API key configured. Go to Settings > AI Configuration.",
                "affected_files": [],
                "suggested_fix": "",
                "confidence": 0,
                "explanation": "Please configure an LLM API key in Settings."
            }

        async with httpx.AsyncClient() as client:
            try:
                if provider == "openrouter":
                    response = await client.post(
                        "https://openrouter.ai/api/v1/chat/completions",
                        headers={
                            "Authorization": f"Bearer {api_key}",
                            "Content-Type": "application/json",
                            "HTTP-Referer": "https://jira-fixer.startdata.com.br",
                            "X-Title": "JIRA AI Fixer"
                        },
                        json={
                            "model": model,
                            "messages": [{"role": "user", "content": prompt}],
                            "temperature": 0.2,
                            "max_tokens": 2000
                        },
                        timeout=120
                    )
                elif provider == "anthropic":
                    response = await client.post(
                        "https://api.anthropic.com/v1/messages",
                        headers={
                            "x-api-key": api_key,
                            "Content-Type": "application/json",
                            "anthropic-version": "2023-06-01"
                        },
                        json={
                            "model": model,
                            "max_tokens": 2000,
                            "messages": [{"role": "user", "content": prompt}]
                        },
                        timeout=120
                    )
                elif provider == "openai":
                    response = await client.post(
                        "https://api.openai.com/v1/chat/completions",
                        headers={
                            "Authorization": f"Bearer {api_key}",
                            "Content-Type": "application/json"
                        },
                        json={
                            "model": model,
                            "messages": [{"role": "user", "content": prompt}],
                            "temperature": 0.2,
                            "max_tokens": 2000
                        },
                        timeout=120
                    )
                elif provider == "groq":
                    response = await client.post(
                        "https://api.groq.com/openai/v1/chat/completions",
                        headers={
                            "Authorization": f"Bearer {api_key}",
                            "Content-Type": "application/json"
                        },
                        json={
                            "model": model,
                            "messages": [{"role": "user", "content": prompt}],
                            "temperature": 0.2,
                            "max_tokens": 2000
                        },
                        timeout=120
                    )
                else:
                    return {
                        "root_cause": f"Unsupported provider: {provider}",
                        "affected_files": [],
                        "suggested_fix": "",
                        "confidence": 0,
                        "explanation": "Please select a supported AI provider."
                    }

                if response.status_code == 200:
                    data = response.json()

                    # Extract content based on provider
                    if provider == "anthropic":
                        content = data["content"][0]["text"]
                    else:
                        content = data["choices"][0]["message"]["content"]

                    # Parse JSON from response
                    try:
                        if "```json" in content:
                            content = content.split("```json")[1].split("```")[0]
                        elif "```" in content:
                            content = content.split("```")[1].split("```")[0]

                        return json.loads(content.strip())
                    except json.JSONDecodeError:
                        return {
                            "root_cause": content[:500],
                            "affected_files": [],
                            "suggested_fix": "",
                            "confidence": 0.3,
                            "explanation": "Could not parse structured response"
                        }
                else:
                    error_msg = response.text[:200]
                    try:
                        error_data = response.json()
                        error_msg = error_data.get("error", {}).get("message", error_msg)
                    except Exception:
                        pass
                    return {
                        "root_cause": f"API error: {response.status_code}",
                        "affected_files": [],
                        "suggested_fix": "",
                        "confidence": 0,
                        "explanation": error_msg
                    }

            except httpx.TimeoutException:
                return {
                    "root_cause": "Analysis timeout",
                    "affected_files": [],
                    "suggested_fix": "",
                    "confidence": 0,
                    "explanation": "The AI request timed out. Try again."
                }
            except Exception as e:
                return {
                    "root_cause": f"Analysis error: {str(e)}",
                    "affected_files": [],
                    "suggested_fix": "",
                    "confidence": 0,
                    "explanation": str(e)
                }

    @classmethod
    async def analyze(cls, issue: Dict[str, Any], repo: Optional[str] = None, ai_config: Optional[Dict[str, Any]] = None) -> Dict[str, Any]:
        """Run AI analysis on an issue."""
        # Use provided config or default
        if ai_config is None:
            ai_config = {
                "provider": "openrouter",
                "api_key": settings.OPENROUTER_API_KEY or "",
                "model": "meta-llama/llama-3.3-70b-instruct",
            }

        # Fetch code context
        files = []
        if repo:
            files = await cls.fetch_repository_files(repo)

        # Build prompt
        prompt = cls.build_prompt(issue, files)

        # Call LLM
        return await cls.call_llm(prompt, ai_config)

    @classmethod
    async def create_pull_request(
        cls,
        repo: str,
        branch: str,
        title: str,
        description: str,
        file_changes: List[Dict[str, str]]
    ) -> Optional[str]:
        """Create a pull request with suggested fix."""
        if not settings.GITEA_TOKEN:
            return None

        async with httpx.AsyncClient() as client:
            headers = {"Authorization": f"token {settings.GITEA_TOKEN}"}

            try:
                # 1. Get default branch
                repo_resp = await client.get(
                    f"{settings.GITEA_URL}/api/v1/repos/{repo}",
                    headers=headers,
                    timeout=30
                )
                if repo_resp.status_code != 200:
                    return None
                default_branch = repo_resp.json().get("default_branch", "main")

                # 2. Get latest commit SHA
                ref_resp = await client.get(
                    f"{settings.GITEA_URL}/api/v1/repos/{repo}/git/refs/heads/{default_branch}",
                    headers=headers,
                    timeout=30
                )
                if ref_resp.status_code != 200:
                    return None
                sha = ref_resp.json()["object"]["sha"]

                # 3. Create branch
                await client.post(
                    f"{settings.GITEA_URL}/api/v1/repos/{repo}/git/refs",
                    headers=headers,
                    json={"ref": f"refs/heads/{branch}", "sha": sha},
                    timeout=30
                )

                # 4. Create PR
                pr_resp = await client.post(
                    f"{settings.GITEA_URL}/api/v1/repos/{repo}/pulls",
                    headers=headers,
                    json={
                        "title": title,
                        "body": description,
                        "head": branch,
                        "base": default_branch
                    },
                    timeout=30
                )

                if pr_resp.status_code in (200, 201):
                    pr_data = pr_resp.json()
                    return pr_data.get("html_url")

            except Exception as e:
                print(f"PR creation error: {e}")

        return None
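`call_llm` strips an optional Markdown code fence from the model reply before `json.loads`. A self-contained, stdlib-only sketch of that extraction step (the sample `reply` text is invented for illustration):

```python
import json

def extract_json(content: str) -> dict:
    # Mirror call_llm's fence handling: prefer a ```json fence, then any fence.
    if "```json" in content:
        content = content.split("```json")[1].split("```")[0]
    elif "```" in content:
        content = content.split("```")[1].split("```")[0]
    return json.loads(content.strip())

reply = 'Here is my analysis:\n```json\n{"confidence": 0.85, "affected_files": []}\n```'
result = extract_json(reply)
```

Note this raises `json.JSONDecodeError` on non-JSON replies; `call_llm` catches that and falls back to a low-confidence result carrying the raw text.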
@@ -0,0 +1,42 @@
"""Audit logging service."""
from datetime import datetime
from typing import Optional, Dict, Any
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.audit import AuditLog


class AuditService:
    @classmethod
    async def log(
        cls,
        db: AsyncSession,
        action: str,
        user_id: Optional[int] = None,
        organization_id: Optional[int] = None,
        resource_type: Optional[str] = None,
        resource_id: Optional[int] = None,
        old_values: Optional[Dict[str, Any]] = None,
        new_values: Optional[Dict[str, Any]] = None,
        description: Optional[str] = None,
        ip_address: Optional[str] = None,
        user_agent: Optional[str] = None,
        success: str = "success",
        error_message: Optional[str] = None
    ):
        """Create an audit log entry."""
        log = AuditLog(
            action=action,
            user_id=user_id,
            organization_id=organization_id,
            resource_type=resource_type,
            resource_id=resource_id,
            old_values=old_values,
            new_values=new_values,
            description=description,
            ip_address=ip_address,
            user_agent=user_agent,
            success=success,
            error_message=error_message
        )
        db.add(log)
        await db.flush()
        return log
@@ -0,0 +1,94 @@
"""Email service using Resend."""
import httpx
from typing import Optional, List
from app.core.config import settings


class EmailService:
    RESEND_API = "https://api.resend.com/emails"

    @classmethod
    async def send(
        cls,
        to: List[str],
        subject: str,
        html: str,
        text: Optional[str] = None
    ) -> bool:
        if not settings.RESEND_API_KEY:
            return False

        async with httpx.AsyncClient() as client:
            try:
                response = await client.post(
                    cls.RESEND_API,
                    headers={
                        "Authorization": f"Bearer {settings.RESEND_API_KEY}",
                        "Content-Type": "application/json"
                    },
                    json={
                        "from": settings.EMAIL_FROM,
                        "to": to,
                        "subject": subject,
                        "html": html,
                        "text": text
                    }
                )
                return response.status_code == 200
            except Exception:
                return False

    @classmethod
    async def send_welcome(cls, email: str, name: str, org_name: str):
        html = f"""
        <div style="font-family: Arial, sans-serif; max-width: 600px; margin: 0 auto;">
            <h1 style="color: #4F46E5;">Welcome to JIRA AI Fixer! 🤖</h1>
            <p>Hi {name},</p>
            <p>You've been added to <strong>{org_name}</strong>.</p>
            <p>JIRA AI Fixer automatically analyzes support issues and suggests code fixes using AI.</p>
            <div style="margin: 30px 0;">
                <a href="https://jira-fixer.startdata.com.br"
                   style="background: #4F46E5; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px;">
                    Get Started
                </a>
            </div>
            <p style="color: #666;">— The JIRA AI Fixer Team</p>
        </div>
        """
        await cls.send([email], f"Welcome to {org_name} on JIRA AI Fixer", html)

    @classmethod
    async def send_analysis_complete(cls, email: str, issue_key: str, confidence: float, pr_url: Optional[str]):
        html = f"""
        <div style="font-family: Arial, sans-serif; max-width: 600px; margin: 0 auto;">
            <h1 style="color: #10B981;">Analysis Complete ✅</h1>
            <p>Issue <strong>{issue_key}</strong> has been analyzed.</p>
            <div style="background: #F3F4F6; padding: 20px; border-radius: 8px; margin: 20px 0;">
                <p><strong>Confidence:</strong> {confidence:.0%}</p>
                {f'<p><strong>Pull Request:</strong> <a href="{pr_url}">{pr_url}</a></p>' if pr_url else ''}
            </div>
            <a href="https://jira-fixer.startdata.com.br"
               style="background: #4F46E5; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px;">
                View Details
            </a>
        </div>
        """
        await cls.send([email], f"Analysis Complete: {issue_key}", html)

    @classmethod
    async def send_weekly_digest(cls, email: str, org_name: str, stats: dict):
        html = f"""
        <div style="font-family: Arial, sans-serif; max-width: 600px; margin: 0 auto;">
            <h1 style="color: #4F46E5;">Weekly Digest 📊</h1>
            <p>Here's what happened in <strong>{org_name}</strong> this week:</p>
            <div style="background: #F3F4F6; padding: 20px; border-radius: 8px; margin: 20px 0;">
                <p><strong>Issues Analyzed:</strong> {stats.get('analyzed', 0)}</p>
                <p><strong>PRs Created:</strong> {stats.get('prs', 0)}</p>
                <p><strong>Avg Confidence:</strong> {stats.get('confidence', 0):.0%}</p>
            </div>
            <a href="https://jira-fixer.startdata.com.br/reports"
               style="background: #4F46E5; color: white; padding: 12px 24px; text-decoration: none; border-radius: 6px;">
                View Full Report
            </a>
        </div>
        """
        await cls.send([email], f"Weekly Digest: {org_name}", html)
@ -0,0 +1,119 @@
"""Gitea integration service."""

import base64
from typing import Optional, Dict, Any, List

import httpx


class GiteaService:
    def __init__(self, base_url: str, token: str):
        self.base_url = base_url.rstrip('/')
        self.token = token
        self.headers = {
            "Authorization": f"token {token}",
            "Content-Type": "application/json"
        }

    async def get_repo(self, owner: str, repo: str) -> Optional[Dict[str, Any]]:
        """Get repository details."""
        async with httpx.AsyncClient() as client:
            try:
                response = await client.get(
                    f"{self.base_url}/api/v1/repos/{owner}/{repo}",
                    headers=self.headers,
                    timeout=10.0
                )
                response.raise_for_status()
                return response.json()
            except Exception:
                return None

    async def get_file(self, owner: str, repo: str, path: str, ref: str = "main") -> Optional[str]:
        """Get file content from repository."""
        async with httpx.AsyncClient() as client:
            try:
                response = await client.get(
                    f"{self.base_url}/api/v1/repos/{owner}/{repo}/contents/{path}?ref={ref}",
                    headers=self.headers,
                    timeout=10.0
                )
                response.raise_for_status()
                data = response.json()
                # Gitea returns base64-encoded content
                return base64.b64decode(data.get("content", "")).decode("utf-8")
            except Exception:
                return None

    async def create_branch(self, owner: str, repo: str, branch: str, from_branch: str = "main") -> bool:
        """Create a new branch."""
        async with httpx.AsyncClient() as client:
            try:
                response = await client.post(
                    f"{self.base_url}/api/v1/repos/{owner}/{repo}/branches",
                    headers=self.headers,
                    json={"new_branch_name": branch, "old_branch_name": from_branch},
                    timeout=10.0
                )
                response.raise_for_status()
                return True
            except Exception:
                return False

    async def update_file(self, owner: str, repo: str, path: str, content: str,
                          message: str, branch: str, sha: Optional[str] = None) -> bool:
        """Update file in repository."""
        async with httpx.AsyncClient() as client:
            try:
                payload = {
                    "content": base64.b64encode(content.encode()).decode(),
                    "message": message,
                    "branch": branch
                }
                if sha:
                    payload["sha"] = sha

                response = await client.put(
                    f"{self.base_url}/api/v1/repos/{owner}/{repo}/contents/{path}",
                    headers=self.headers,
                    json=payload,
                    timeout=10.0
                )
                response.raise_for_status()
                return True
            except Exception:
                return False

    async def create_pull_request(self, owner: str, repo: str, title: str,
                                  body: str, head: str, base: str = "main") -> Optional[str]:
        """Create a pull request."""
        async with httpx.AsyncClient() as client:
            try:
                response = await client.post(
                    f"{self.base_url}/api/v1/repos/{owner}/{repo}/pulls",
                    headers=self.headers,
                    json={
                        "title": title,
                        "body": body,
                        "head": head,
                        "base": base
                    },
                    timeout=10.0
                )
                response.raise_for_status()
                pr_data = response.json()
                return pr_data.get("html_url")
            except Exception:
                return None

    async def list_repositories(self, owner: str) -> List[Dict[str, Any]]:
        """List repositories for owner."""
        async with httpx.AsyncClient() as client:
            try:
                response = await client.get(
                    f"{self.base_url}/api/v1/users/{owner}/repos",
                    headers=self.headers,
                    timeout=10.0
                )
                response.raise_for_status()
                return response.json()
            except Exception:
                return []
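As a quick illustration of the base64 round-trip the contents API relies on (the encoding `update_file` sends and `get_file` decodes), here is a self-contained sketch; the helper names are for illustration only:

```python
import base64

# What update_file puts in the "content" field of its payload.
def encode_content(text: str) -> str:
    return base64.b64encode(text.encode()).decode()

# What get_file does with the "content" field Gitea returns.
def decode_content(b64: str) -> str:
    return base64.b64decode(b64).decode("utf-8")

payload = encode_content("05 WS-AVAILABLE-BALANCE PIC 9(11)V99.")
assert decode_content(payload) == "05 WS-AVAILABLE-BALANCE PIC 9(11)V99."
```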
@@ -1,54 +0,0 @@
version: "3.8"

services:
  api:
    image: python:3.11-slim
    working_dir: /app
    entrypoint: ["/bin/sh", "-c"]
    command:
      - pip install --quiet --no-cache-dir fastapi uvicorn && echo "ZnJvbSBmYXN0YXBpIGltcG9ydCBGYXN0QVBJCmZyb20gZmFzdGFwaS5taWRkbGV3YXJlLmNvcnMgaW1wb3J0IENPUlNNaWRkbGV3YXJlCgphcHAgPSBGYXN0QVBJKHRpdGxlPSJKSVJBIEFJIEZpeGVyIiwgdmVyc2lvbj0iMC4xLjAiKQphcHAuYWRkX21pZGRsZXdhcmUoQ09SU01pZGRsZXdhcmUsIGFsbG93X29yaWdpbnM9WyIqIl0sIGFsbG93X2NyZWRlbnRpYWxzPVRydWUsIGFsbG93X21ldGhvZHM9WyIqIl0sIGFsbG93X2hlYWRlcnM9WyIqIl0pCgpAYXBwLmdldCgiLyIpCmFzeW5jIGRlZiByb290KCk6CiAgICByZXR1cm4geyJzdGF0dXMiOiAib2siLCAic2VydmljZSI6ICJKSVJBIEFJIEZpeGVyIiwgInZlcnNpb24iOiAiMC4xLjAifQoKQGFwcC5nZXQoIi9oZWFsdGgiKQphc3luYyBkZWYgaGVhbHRoKCk6CiAgICByZXR1cm4geyJzdGF0dXMiOiAiaGVhbHRoeSJ9CgpAYXBwLmdldCgiL2FwaS9pc3N1ZXMvc3RhdHMvc3VtbWFyeSIpCmFzeW5jIGRlZiBzdGF0cygpOgogICAgcmV0dXJuIHsidG90YWxfaXNzdWVzIjogMCwgInBlbmRpbmciOiAwLCAiYW5hbHl6ZWQiOiAwLCAiYWNjZXB0ZWQiOiAwLCAicmVqZWN0ZWQiOiAwLCAic3VjY2Vzc19yYXRlIjogMC4wfQoKQGFwcC5nZXQoIi9hcGkvaXNzdWVzIikKYXN5bmMgZGVmIGxpc3RfaXNzdWVzKCk6CiAgICByZXR1cm4geyJ0b3RhbCI6IDAsICJpdGVtcyI6IFtdfQoKQGFwcC5nZXQoIi9hcGkvY29uZmlnL2ludGVncmF0aW9ucyIpCmFzeW5jIGRlZiBnZXRfaW50ZWdyYXRpb25zKCk6CiAgICByZXR1cm4geyJqaXJhX3VybCI6ICIiLCAiamlyYV9wcm9qZWN0cyI6IFtdLCAibGxtX3Byb3ZpZGVyIjogIm9wZW5yb3V0ZXIiLCAib3BlbnJvdXRlcl9tb2RlbCI6ICJtZXRhLWxsYW1hL2xsYW1hLTMuMy03MGItaW5zdHJ1Y3Q6ZnJlZSIsICJlbWJlZGRpbmdfcHJvdmlkZXIiOiAibG9jYWwifQoKQGFwcC5nZXQoIi9hcGkvY29uZmlnL3JlcG9zaXRvcmllcyIpCmFzeW5jIGRlZiBsaXN0X3JlcG9zKCk6CiAgICByZXR1cm4gW10KCkBhcHAuZ2V0KCIvYXBpL2NvbmZpZy9tb2R1bGVzIikKYXN5bmMgZGVmIGxpc3RfbW9kdWxlcygpOgogICAgcmV0dXJuIFtdCgpAYXBwLnBvc3QoIi9hcGkvd2ViaG9vay9qaXJhIikKYXN5bmMgZGVmIGppcmFfd2ViaG9vaygpOgogICAgcmV0dXJuIHsic3RhdHVzIjogImFjY2VwdGVkIn0K" | base64 -d > main.py && exec uvicorn main:app --host 0.0.0.0 --port 8000
    networks:
      - internal
      - traefik_public
    deploy:
      labels:
        - traefik.enable=true
        - traefik.http.routers.jira-fixer.rule=Host(`jira-fixer.startdata.com.br`)
        - traefik.http.routers.jira-fixer.entrypoints=websecure
        - traefik.http.routers.jira-fixer.tls.certresolver=le
        - traefik.http.services.jira-fixer.loadbalancer.server.port=8000

  postgres:
    image: postgres:15-alpine
    environment:
      - POSTGRES_USER=jira
      - POSTGRES_PASSWORD=jira_secret_2026
      - POSTGRES_DB=jira_fixer
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - internal

  redis:
    image: redis:7-alpine
    volumes:
      - redis_data:/data
    networks:
      - internal

  qdrant:
    image: qdrant/qdrant:v1.7.4
    volumes:
      - qdrant_data:/qdrant/storage
    networks:
      - internal

volumes:
  postgres_data:
  redis_data:
  qdrant_data:

networks:
  internal:
  traefik_public:
    external: true
@@ -1,63 +1,17 @@
-version: "3.8"
+version: '3.8'

 services:
   api:
-    build:
-      context: ./api
-      dockerfile: Dockerfile
+    build: .
     ports:
       - "8000:8000"
     environment:
-      - DATABASE_URL=postgresql://jira:jira@postgres:5432/jira_fixer
-      - REDIS_URL=redis://redis:6379
-      - QDRANT_URL=http://qdrant:6333
-    env_file:
-      - .env
-    depends_on:
-      - postgres
-      - redis
-      - qdrant
+      - DATABASE_URL=postgresql://postgres:postgres@host.docker.internal:5433/jira_fixer_v2
+      - REDIS_URL=redis://host.docker.internal:6379
+      - JWT_SECRET=dev-secret-change-in-production
+      - RESEND_API_KEY=${RESEND_API_KEY}
+      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}
+      - GITEA_URL=https://gitea.startdata.com.br
+      - GITEA_TOKEN=${GITEA_TOKEN}
     volumes:
-      - ./api:/app
-    command: uvicorn main:app --host 0.0.0.0 --port 8000 --reload
-
-  portal:
-    build:
-      context: ./portal
-      dockerfile: Dockerfile
-    ports:
-      - "3000:3000"
-    volumes:
-      - ./portal:/app
-      - /app/node_modules
-    command: npm run dev
-
-  postgres:
-    image: postgres:15-alpine
-    environment:
-      - POSTGRES_USER=jira
-      - POSTGRES_PASSWORD=jira
-      - POSTGRES_DB=jira_fixer
-    volumes:
-      - postgres_data:/var/lib/postgresql/data
-    ports:
-      - "5432:5432"
-
-  redis:
-    image: redis:7-alpine
-    ports:
-      - "6379:6379"
-    volumes:
-      - redis_data:/data
-
-  qdrant:
-    image: qdrant/qdrant:v1.7.4
-    ports:
-      - "6333:6333"
-    volumes:
-      - qdrant_data:/qdrant/storage
-
-volumes:
-  postgres_data:
-  redis_data:
-  qdrant_data:
+      - ./app:/app/app:ro
@@ -1,288 +0,0 @@
# JIRA AI Fixer - Architecture Document

## System Overview

JIRA AI Fixer is a microservice that provides AI-powered issue analysis and automated fix generation for enterprise issue tracking systems.

## High-Level Architecture

```
┌─────────────────────────────────────────────────────────────────────────────────┐
│                                External Systems                                 │
├─────────────────────────────────────────────────────────────────────────────────┤
│                                                                                 │
│  ┌─────────┐ ┌──────────┐ ┌───────────┐ ┌─────────┐ ┌───────────┐ ┌─────────┐   │
│  │  JIRA   │ │ServiceNow│ │  Zendesk  │ │Azure DO │ │  GitHub   │ │ GitLab  │   │
│  └────┬────┘ └────┬─────┘ └─────┬─────┘ └────┬────┘ └─────┬─────┘ └────┬────┘   │
│       │           │             │            │            │            │        │
│       └───────────┴─────────────┴─────┬──────┴────────────┴────────────┘        │
│                                       │                                         │
│                                HTTPS Webhooks                                   │
│                                       │                                         │
└───────────────────────────────────────┼─────────────────────────────────────────┘
                                        │
┌───────────────────────────────────────┼─────────────────────────────────────────┐
│                                       ▼                                         │
│  ┌─────────────────────────────────────────────────────────────────────────┐    │
│  │                            JIRA AI Fixer API                            │    │
│  │                         (FastAPI + Python 3.11)                         │    │
│  ├─────────────────────────────────────────────────────────────────────────┤    │
│  │                                                                         │    │
│  │  ┌──────────────────────────────────────────────────────────────────┐   │    │
│  │  │                          Webhook Layer                           │   │    │
│  │  │   /api/webhook/jira  /api/webhook/servicenow  /api/webhook/...   │   │    │
│  │  └───────────────────────────────┬──────────────────────────────────┘   │    │
│  │                                  │                                      │    │
│  │  ┌───────────────────────────────▼──────────────────────────────────┐   │    │
│  │  │                          Adapter Layer                           │   │    │
│  │  │  normalize_jira()  normalize_servicenow()  normalize_zendesk()   │   │    │
│  │  └───────────────────────────────┬──────────────────────────────────┘   │    │
│  │                                  │                                      │    │
│  │                          NormalizedIssue                                │    │
│  │                                  │                                      │    │
│  │  ┌───────────────────────────────▼──────────────────────────────────┐   │    │
│  │  │                       Core Analysis Engine                       │   │    │
│  │  │       save_and_queue_issue() → analyze_issue() (background)      │   │    │
│  │  └───────────────────────────────┬──────────────────────────────────┘   │    │
│  │                                  │                                      │    │
│  │          ┌───────────────────────┼───────────────────────┐              │    │
│  │          │                       │                       │              │    │
│  │          ▼                       ▼                       ▼              │    │
│  │  ┌──────────────┐        ┌──────────────┐        ┌──────────────┐       │    │
│  │  │ Code Fetcher │        │  LLM Client  │        │  PR Creator  │       │    │
│  │  │   (Gitea)    │        │ (OpenRouter) │        │   (Gitea)    │       │    │
│  │  └──────────────┘        └──────────────┘        └──────────────┘       │    │
│  │                                                                         │    │
│  │  ┌──────────────────────────────────────────────────────────────────┐   │    │
│  │  │                          Callback Layer                          │   │    │
│  │  │     post_analysis_to_source() - posts back to original system    │   │    │
│  │  └──────────────────────────────────────────────────────────────────┘   │    │
│  │                                                                         │    │
│  └─────────────────────────────────────────────────────────────────────────┘    │
│                                                                                 │
│  ┌──────────────┐        ┌──────────────┐        ┌──────────────┐               │
│  │  PostgreSQL  │        │    Gitea     │        │  OpenRouter  │               │
│  │   Database   │        │ (Code Host)  │        │  (LLM API)   │               │
│  └──────────────┘        └──────────────┘        └──────────────┘               │
│                                                                                 │
│                            JIRA AI Fixer Stack                                  │
└─────────────────────────────────────────────────────────────────────────────────┘
```

## Component Details

### 1. Webhook Layer

Receives HTTP POST requests from external systems. Each endpoint is tailored to the specific payload format of the source system.

**Responsibilities:**
- Receive webhooks
- Basic validation
- Route to appropriate adapter

### 2. Adapter Layer (Normalizer)

Transforms vendor-specific payloads into a normalized internal format.

**NormalizedIssue Schema:**
```python
class NormalizedIssue(BaseModel):
    external_id: str      # Original ID in source system
    external_key: str     # Human-readable key (e.g., "JIRA-123")
    source: str           # Source system identifier
    source_url: str       # Link back to original issue
    title: str            # Issue title/summary
    description: str      # Full description
    priority: str         # Priority level
    labels: List[str]     # Tags/labels
    callback_url: str     # URL to post results back
    metadata: Dict        # System-specific extra data
```
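A minimal, self-contained sketch of this schema, using a plain dataclass instead of the pydantic `BaseModel` (the sample values are invented, for illustration only):

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Stand-in for NormalizedIssue: same fields, dataclass instead of pydantic.
@dataclass
class NormalizedIssue:
    external_id: str
    external_key: str
    source: str
    source_url: str = ""
    title: str = ""
    description: str = ""
    priority: str = "medium"
    labels: List[str] = field(default_factory=list)
    callback_url: str = ""
    metadata: Dict = field(default_factory=dict)

issue = NormalizedIssue(
    external_id="10001",
    external_key="JIRA-123",
    source="jira",
    title="Balance truncated for large amounts",
    description="WS-AVAILABLE-BALANCE overflows above 9,999,999.99",
)
assert issue.source == "jira" and issue.labels == []
```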

### 3. Core Analysis Engine

The heart of the system. Runs as a background task.

**Pipeline:**
1. `fetch_cobol_files()` - Get source code from repositories
2. `build_analysis_prompt()` - Construct LLM prompt
3. `call_llm()` - Send to OpenRouter API
4. `parse_analysis()` - Extract structured data from response
5. `create_fix_branch_and_pr()` - Generate fix PR
6. `post_analysis_to_source()` - Report results

### 4. Database Layer

PostgreSQL stores all issues and their analysis results.

**Tables:**
```sql
issues (
    id SERIAL PRIMARY KEY,
    external_id TEXT,
    external_key TEXT,
    source TEXT,
    source_url TEXT,
    title TEXT,
    description TEXT,
    status TEXT,          -- pending, analyzed, error
    analysis TEXT,
    confidence FLOAT,
    affected_files TEXT,  -- JSON array
    suggested_fix TEXT,
    pr_url TEXT,
    pr_branch TEXT,
    callback_url TEXT,
    metadata JSONB,
    created_at TIMESTAMP,
    analyzed_at TIMESTAMP
)

integrations (
    id SERIAL PRIMARY KEY,
    name TEXT UNIQUE,
    type TEXT,
    config JSONB,
    enabled BOOLEAN,
    last_event_at TIMESTAMP,
    event_count INT
)
```
### 5. Git Integration Layer

Interfaces with Gitea for:
- Fetching source code
- Creating branches
- Committing fixes
- Opening pull requests

### 6. Callback Layer

Posts analysis results back to the source system. Handles different API formats:

| System | Format |
|--------|--------|
| JIRA | REST API v2 |
| ServiceNow | Table API |
| Zendesk | Tickets API |
| Azure DevOps | Work Items API |
| GitHub | Issues API |
| GitLab | Notes API |

## Data Flow

```
1. Webhook Received
   └─► POST /api/webhook/{source}
        └─► normalize_{source}(payload)
             └─► NormalizedIssue
                  └─► save_to_database()
                       └─► queue_background_task()

2. Background Analysis
   └─► analyze_issue()
        ├─► fetch_cobol_files()          ←── Gitea API
        ├─► build_analysis_prompt()
        ├─► call_llm()                   ←── OpenRouter API
        ├─► parse_analysis()
        ├─► create_fix_branch_and_pr()   ──► Gitea API
        ├─► update_database()
        └─► post_analysis_to_source()    ──► Source System API

3. User Views Dashboard
   └─► GET /api/issues
        └─► query_database()
             └─► return JSON
```

## Deployment Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                    Docker Swarm Cluster                     │
├─────────────────────────────────────────────────────────────┤
│                                                             │
│  ┌─────────────────┐          ┌─────────────────┐           │
│  │  Traefik Proxy  │◄────────►│  Let's Encrypt  │           │
│  │  (Edge Router)  │          │   (TLS Certs)   │           │
│  └────────┬────────┘          └─────────────────┘           │
│           │                                                 │
│           │  jira-fixer.startdata.com.br                    │
│           │                                                 │
│  ┌────────▼────────┐          ┌─────────────────┐           │
│  │  JIRA AI Fixer  │          │   PostgreSQL    │           │
│  │   API (8000)    │◄────────►│   (internal)    │           │
│  │   Python 3.11   │          │                 │           │
│  └─────────────────┘          └─────────────────┘           │
│                                                             │
└─────────────────────────────────────────────────────────────┘

External Services:
- Gitea (gitea.startdata.com.br) - Code repository
- OpenRouter (openrouter.ai) - LLM API
```

## Security Considerations

### Network Security
- All external traffic through HTTPS (TLS 1.3)
- Internal services on isolated Docker network
- Database not exposed externally

### Authentication
- Webhook secrets (optional) for validation
- Gitea token for repository access
- OpenRouter API key for LLM

### Data Privacy
- Issue descriptions may contain sensitive data
- LLM calls go to external service (OpenRouter)
- Consider self-hosted LLM for sensitive environments

## Scalability

### Current Limits
- Single API instance
- ~50 concurrent analyses
- ~1000 issues/day throughput

### Scaling Options
1. **Horizontal**: Add more API replicas
2. **Queue**: Add Redis for job queue
3. **Database**: PostgreSQL read replicas
4. **LLM**: Multiple OpenRouter API keys

## Monitoring

### Health Check
```bash
GET /api/health
→ {"status": "healthy", "service": "jira-ai-fixer", "version": "2.0.0"}
```

### Metrics Endpoint
```bash
GET /api/stats
→ {
    "total": 150,
    "analyzed": 142,
    "prs_created": 98,
    "avg_confidence": 85,
    "by_source": {"jira": 80, "servicenow": 50, "tickethub": 20}
  }
```
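The stats summary can be derived from the `issues` table with simple aggregates. A self-contained sketch using SQLite in place of PostgreSQL, with invented sample rows (the column subset and values are for illustration only):

```python
import sqlite3

# In-memory stand-in for the issues table defined in the Database Layer.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE issues (source TEXT, status TEXT, confidence REAL, pr_url TEXT)")
conn.executemany(
    "INSERT INTO issues VALUES (?, ?, ?, ?)",
    [("jira", "analyzed", 90, "https://gitea.example/pr/1"),
     ("servicenow", "analyzed", 80, None),
     ("jira", "pending", None, None)],
)

# The same aggregates /api/stats would report.
total = conn.execute("SELECT COUNT(*) FROM issues").fetchone()[0]
analyzed = conn.execute("SELECT COUNT(*) FROM issues WHERE status = 'analyzed'").fetchone()[0]
prs_created = conn.execute("SELECT COUNT(*) FROM issues WHERE pr_url IS NOT NULL").fetchone()[0]
avg_confidence = conn.execute(
    "SELECT AVG(confidence) FROM issues WHERE confidence IS NOT NULL"
).fetchone()[0]

assert (total, analyzed, prs_created, avg_confidence) == (3, 2, 1, 85.0)
```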

## Future Enhancements

1. **Multi-language Support**: Java, Python, JavaScript analysis
2. **Custom LLM Models**: Support for local/private models
3. **Repository Indexing**: Full codebase embeddings for better context
4. **Automated Testing**: Run tests on fix branches
5. **Approval Workflow**: Require human approval before PR

---

*Document Version: 2.0*
*Last Updated: February 2026*
*StartData Engineering*
@@ -1,229 +0,0 @@
# JIRA AI Fixer - Developer Guide

## Overview

JIRA AI Fixer is a universal AI-powered issue analysis engine that integrates with multiple issue tracking systems to automatically analyze support cases and suggest code fixes.

## Architecture

```
┌─────────────────────────────────────────────────────────────────┐
│                         JIRA AI Fixer                           │
├─────────────────────────────────────────────────────────────────┤
│  ┌──────────┐  ┌──────────┐  ┌──────────┐  ┌──────────┐         │
│  │TicketHub │  │   JIRA   │  │ServiceNow│  │ Zendesk  │  ...    │
│  │ Webhook  │  │ Webhook  │  │ Webhook  │  │ Webhook  │         │
│  └────┬─────┘  └────┬─────┘  └────┬─────┘  └────┬─────┘         │
│       │             │             │             │               │
│       └─────────────┴──────┬──────┴─────────────┘               │
│                            │                                    │
│                    ┌───────▼───────┐                            │
│                    │  Normalizer   │  (Adapter Pattern)         │
│                    └───────┬───────┘                            │
│                            │                                    │
│                    ┌───────▼───────┐                            │
│                    │   Analyzer    │  (LLM + Code Analysis)     │
│                    └───────┬───────┘                            │
│                            │                                    │
│              ┌─────────────┼─────────────┐                      │
│              │             │             │                      │
│       ┌──────▼─────┐ ┌─────▼─────┐ ┌────▼────┐                  │
│       │  Database  │ │  PR Gen   │ │Callback │                  │
│       │ PostgreSQL │ │   Gitea   │ │ to Src  │                  │
│       └────────────┘ └───────────┘ └─────────┘                  │
└─────────────────────────────────────────────────────────────────┘
```

## Tech Stack

- **Language**: Python 3.11
- **Framework**: FastAPI (async)
- **Database**: PostgreSQL 15
- **LLM**: OpenRouter API (Llama 3.3 70B free tier)
- **Code Hosting**: Gitea (self-hosted)

## Project Structure

```
jira-ai-fixer/
├── api/
│   └── main_v3.py          # Main application (monolith)
├── docs/
│   ├── DEVELOPER_GUIDE.md
│   ├── USER_GUIDE.md
│   └── ARCHITECTURE.md
└── README.md
```

## Key Components

### 1. Webhook Adapters

Each supported system has a dedicated adapter that normalizes payloads:

```python
def normalize_jira(payload: dict) -> Optional[NormalizedIssue]:
    """Normalize JIRA webhook payload"""
    issue = payload.get("issue", {})
    fields = issue.get("fields", {})
    # base_url is the JIRA instance URL from the integration's configuration
    return NormalizedIssue(
        external_id=str(issue.get("id")),
        external_key=issue.get("key"),
        source="jira",
        title=fields.get("summary"),
        description=fields.get("description"),
        callback_url=f"{base_url}/rest/api/2/issue/{issue.get('key')}/comment"
    )
```
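For illustration, here is a self-contained version of the same normalization returning a plain dict instead of a `NormalizedIssue` model; the `base_url` default and the sample payload are hypothetical:

```python
def normalize_jira(payload: dict, base_url: str = "https://jira.example.com") -> dict:
    """Flatten a JIRA issue-created webhook payload (dict sketch)."""
    issue = payload.get("issue", {})
    fields = issue.get("fields", {})
    return {
        "external_id": str(issue.get("id")),
        "external_key": issue.get("key"),
        "source": "jira",
        "title": fields.get("summary"),
        "description": fields.get("description"),
        "callback_url": f"{base_url}/rest/api/2/issue/{issue.get('key')}/comment",
    }

payload = {"issue": {"id": 10001, "key": "JIRA-123",
                     "fields": {"summary": "Login fails",
                                "description": "500 on POST /login"}}}
norm = normalize_jira(payload)
assert norm["external_key"] == "JIRA-123"
assert norm["callback_url"].endswith("/rest/api/2/issue/JIRA-123/comment")
```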

### 2. Analysis Pipeline

```python
async def analyze_issue(issue_id: int, issue: NormalizedIssue):
    # 1. Fetch source code from repositories
    cobol_files = await fetch_cobol_files()

    # 2. Build LLM prompt
    prompt = build_analysis_prompt(issue, cobol_files)

    # 3. Call LLM API
    analysis = await call_llm(prompt)

    # 4. Parse response
    result = parse_analysis(analysis)

    # 5. Create fix branch and PR
    pr_info = await create_fix_branch_and_pr(issue, result)

    # 6. Post back to source system
    await post_analysis_to_source(issue, result, pr_info)
```
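`parse_analysis()` itself is not shown in this guide. One plausible sketch, under the assumption that the LLM is asked to embed a JSON object in its free-form reply, would be (hypothetical, not the actual implementation):

```python
import json
import re

def parse_analysis(text: str) -> dict:
    """Extract a JSON object from an LLM reply; fall back to raw text."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if not match:
        return {"root_cause": text.strip(), "confidence": 0}
    return json.loads(match.group(0))

resp = 'Here is my analysis:\n{"root_cause": "field overflow", "confidence": 85}'
result = parse_analysis(resp)
assert result["confidence"] == 85
```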

### 3. Callback System

Results are posted back to the source system in their native format:

| System | Method | Format |
|--------|--------|--------|
| TicketHub | POST /tickets/{id}/comments | `{"author": "...", "content": "..."}` |
| JIRA | POST /rest/api/2/issue/{key}/comment | `{"body": "..."}` |
| ServiceNow | PATCH /api/now/table/incident/{sys_id} | `{"work_notes": "..."}` |
| Zendesk | PUT /api/v2/tickets/{id}.json | `{"ticket": {"comment": {...}}}` |
| Azure DevOps | POST /workitems/{id}/comments | `{"text": "..."}` |
| GitHub | POST /repos/.../issues/{n}/comments | `{"body": "..."}` |
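The per-system formats above can be sketched as a small dispatch helper. The function name, the TicketHub author value, and the inner Zendesk comment shape are assumptions for illustration:

```python
def build_callback_payload(source: str, text: str) -> dict:
    """Map a source system to its native comment payload (sketch)."""
    if source == "tickethub":
        # Author value is a placeholder; TicketHub expects author + content.
        return {"author": "jira-ai-fixer", "content": text}
    if source in ("jira", "github"):
        return {"body": text}
    if source == "servicenow":
        return {"work_notes": text}
    if source == "zendesk":
        # Inner comment shape assumed to carry the text as "body".
        return {"ticket": {"comment": {"body": text}}}
    if source == "azure-devops":
        return {"text": text}
    raise ValueError(f"unsupported source: {source}")

assert build_callback_payload("servicenow", "done") == {"work_notes": "done"}
```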
## Adding a New Integration

1. Create normalizer function:

```python
def normalize_newsystem(payload: dict) -> Optional[NormalizedIssue]:
    # Extract fields from payload
    return NormalizedIssue(
        external_id=...,
        external_key=...,
        source="newsystem",
        title=...,
        description=...,
        callback_url=...
    )
```

2. Add webhook endpoint:

```python
@app.post("/api/webhook/newsystem")
async def webhook_newsystem(payload: dict, background_tasks: BackgroundTasks):
    issue = normalize_newsystem(payload)
    if not issue:
        return WebhookResponse(status="ignored", message="Event not handled")
    issue_id = await save_and_queue_issue(issue, background_tasks)
    return WebhookResponse(status="accepted", issue_id=issue_id)
```

3. Add callback format in `post_analysis_to_source()`:

```python
elif issue.source == "newsystem":
    await client.post(issue.callback_url, json={...})
```

## Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `DATABASE_URL` | PostgreSQL connection string | `postgresql://jira:jira_secret_2026@postgres:5432/jira_fixer` |
| `OPENROUTER_API_KEY` | OpenRouter API key for LLM | (empty = mock mode) |
| `GITEA_URL` | Gitea instance URL | `https://gitea.startdata.com.br` |
| `GITEA_TOKEN` | Gitea API token | (empty) |
| `COBOL_REPO` | Default repository to analyze | `startdata/cobol-sample-app` |

## API Endpoints

### Webhooks

| Endpoint | Description |
|----------|-------------|
| `POST /api/webhook/tickethub` | TicketHub webhooks |
| `POST /api/webhook/jira` | JIRA webhooks |
| `POST /api/webhook/servicenow` | ServiceNow webhooks |
| `POST /api/webhook/zendesk` | Zendesk webhooks |
| `POST /api/webhook/azure-devops` | Azure DevOps webhooks |
| `POST /api/webhook/github` | GitHub Issues webhooks |
| `POST /api/webhook/gitlab` | GitLab Issues webhooks |
| `POST /api/webhook/generic` | Generic webhook format |

### Management

| Endpoint | Description |
|----------|-------------|
| `GET /api/health` | Health check |
| `GET /api/issues` | List issues |
| `GET /api/issues/{id}` | Get issue details |
| `GET /api/integrations` | List integrations |
| `GET /api/stats` | Dashboard statistics |

|
|
||||||
|
|
||||||
```bash
|
|
||||||
# Install dependencies
|
|
||||||
pip install fastapi uvicorn httpx asyncpg pydantic
|
|
||||||
|
|
||||||
# Run with PostgreSQL
|
|
||||||
export DATABASE_URL="postgresql://user:pass@localhost:5432/jira_fixer"
|
|
||||||
uvicorn main:app --reload --port 8000
|
|
||||||
```
|
|
||||||
|
|
||||||
## Testing Webhooks

```bash
# Test TicketHub webhook
curl -X POST http://localhost:8000/api/webhook/tickethub \
  -H "Content-Type: application/json" \
  -d '{
    "event": "ticket.created",
    "timestamp": "2026-02-18T18:00:00Z",
    "data": {
      "id": 1,
      "key": "SUPP-1",
      "title": "Test issue",
      "description": "Test description"
    }
  }'

# Test generic webhook
curl -X POST http://localhost:8000/api/webhook/generic \
  -H "Content-Type: application/json" \
  -d '{
    "id": "123",
    "key": "CUSTOM-1",
    "title": "Custom issue",
    "description": "From custom system",
    "source": "my-system",
    "callback_url": "https://my-system.com/api/issues/123/comments"
  }'
```

## License

MIT License - StartData 2026

@@ -1,223 +0,0 @@
# JIRA AI Fixer - User Guide

## What is JIRA AI Fixer?

JIRA AI Fixer is an AI-powered system that automatically analyzes support tickets from your issue tracking system, identifies the root cause in your codebase, and creates pull requests with suggested fixes.

## Supported Platforms

| Platform | Status | Webhook Endpoint |
|----------|--------|------------------|
| TicketHub | ✅ Active | `/api/webhook/tickethub` |
| JIRA | ✅ Ready | `/api/webhook/jira` |
| ServiceNow | ✅ Ready | `/api/webhook/servicenow` |
| Zendesk | ✅ Ready | `/api/webhook/zendesk` |
| Azure DevOps | ✅ Ready | `/api/webhook/azure-devops` |
| GitHub Issues | ✅ Ready | `/api/webhook/github` |
| GitLab Issues | ✅ Ready | `/api/webhook/gitlab` |
| Custom Systems | ✅ Ready | `/api/webhook/generic` |

## How It Works

```
1. Ticket Created → 2. Webhook Sent → 3. AI Analyzes → 4. PR Created → 5. Result Posted
      (JIRA)           (automatic)      (30 seconds)       (Gitea)       (back to JIRA)
```

### Step-by-Step Flow

1. **Ticket Created**: A support ticket is created in your issue tracker (JIRA, ServiceNow, etc.)

2. **Webhook Triggered**: Your issue tracker sends a webhook to JIRA AI Fixer

3. **AI Analysis**: The system:
   - Fetches relevant source code from your repositories
   - Sends the ticket description + code to an AI model
   - Identifies root cause and affected files
   - Generates a fix suggestion

4. **PR Created**: If a fix is found:
   - Creates a new branch (`fix/TICKET-123-auto-fix`)
   - Applies the code change
   - Opens a Pull Request with full explanation

5. **Result Posted**: A comment is added to your original ticket with:
   - Root cause analysis
   - Affected files
   - Suggested fix
   - Link to the Pull Request
   - Confidence score

## Setting Up Webhooks
|
|
||||||
|
|
||||||
### JIRA
|
|
||||||
|
|
||||||
1. Go to **Settings → System → Webhooks**
|
|
||||||
2. Click **Create a Webhook**
|
|
||||||
3. Set URL: `https://jira-fixer.startdata.com.br/api/webhook/jira`
|
|
||||||
4. Events: Select **Issue → created**
|
|
||||||
5. Save

### ServiceNow

1. Go to **System Web Services → Outbound → REST Message**
2. Create a new REST Message pointing to: `https://jira-fixer.startdata.com.br/api/webhook/servicenow`
3. Create a Business Rule on the Incident table to trigger on Insert

### Zendesk

1. Go to **Admin Center → Apps and integrations → Webhooks**
2. Create a webhook with endpoint: `https://jira-fixer.startdata.com.br/api/webhook/zendesk`
3. Create a Trigger: **When ticket is created → Notify webhook**

### Azure DevOps

1. Go to **Project Settings → Service hooks**
2. Create a subscription for **Work item created**
3. Set URL: `https://jira-fixer.startdata.com.br/api/webhook/azure-devops`

### GitHub

1. Go to **Repository → Settings → Webhooks**
2. Add webhook: `https://jira-fixer.startdata.com.br/api/webhook/github`
3. Select events: **Issues**
4. Content type: `application/json`

### GitLab

1. Go to **Settings → Webhooks**
2. URL: `https://jira-fixer.startdata.com.br/api/webhook/gitlab`
3. Trigger: **Issues events**

### Custom System (Generic)

Send a POST request with this format:

```json
{
  "id": "your-ticket-id",
  "key": "PROJ-123",
  "title": "Issue title",
  "description": "Detailed description of the problem",
  "source": "your-system-name",
  "priority": "high",
  "labels": ["bug", "production"],
  "callback_url": "https://your-system.com/api/tickets/123/comments"
}
```
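
As a sketch, a custom system could assemble and send that payload with a few lines of Python (`urllib` here; any HTTP client works). `WEBHOOK_URL` is a placeholder, since the generic endpoint URL depends on your deployment, and `build_ticket` is an illustrative helper.

```python
import json
import urllib.request

WEBHOOK_URL = "https://jira-fixer.example.com/api/webhook/custom"  # placeholder

def build_ticket(ticket_id, key, title, description, source,
                 priority="medium", labels=None, callback_url=None):
    """Assemble a payload in the documented format."""
    ticket = {
        "id": ticket_id,
        "key": key,
        "title": title,
        "description": description,
        "source": source,
        "priority": priority,
        "labels": labels or [],
    }
    if callback_url:
        ticket["callback_url"] = callback_url
    return ticket

ticket = build_ticket("your-ticket-id", "PROJ-123", "Issue title",
                      "Detailed description of the problem",
                      "your-system-name", priority="high",
                      labels=["bug", "production"])

req = urllib.request.Request(
    WEBHOOK_URL,
    data=json.dumps(ticket).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # uncomment to actually send
```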

## Dashboard

Access the dashboard at: **https://jira-fixer.startdata.com.br**

### Features

- **Real-time Statistics**: Total issues, analyzed, PRs created, average confidence
- **Issue List**: View all processed issues with status
- **Issue Detail**: Click any issue to see the full analysis, suggested fix, and PR link
- **Filter by Source**: Filter issues by origin system (JIRA, ServiceNow, etc.)
- **Filter by Status**: Filter by pending, analyzed, or error

## Understanding Results

### Analysis Comment Format

When analysis completes, you'll see a comment like this:

```
🤖 AI ANALYSIS COMPLETE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

📋 ROOT CAUSE:
The WS-AVAILABLE-BALANCE field is declared as PIC 9(9)V99 which can only
hold values up to 999,999,999.99. Values above this are truncated.

📁 AFFECTED FILES: AUTH.CBL

🔧 SUGGESTED FIX:
────────────────────────────────────────
Change line 15 from:
  05 WS-AVAILABLE-BALANCE PIC 9(9)V99.
To:
  05 WS-AVAILABLE-BALANCE PIC 9(11)V99.
────────────────────────────────────────

🔀 PULL REQUEST CREATED:
────────────────────────────────────────
Branch: fix/supp-1-auto-fix
PR: #5
URL: https://gitea.startdata.com.br/startdata/cobol-sample-app/pulls/5
────────────────────────────────────────

📊 CONFIDENCE: 92%

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
Analyzed by JIRA AI Fixer
```

### Confidence Score

| Score | Meaning |
|-------|---------|
| 90-100% | Very likely correct - review and merge |
| 70-89% | Probably correct - review carefully |
| 50-69% | Uncertain - manual investigation recommended |
| <50% | Low confidence - use as a starting point only |
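
If you script against the analysis results, the table above can be encoded directly. A minimal sketch (thresholds taken from the table; the function name is illustrative, not part of the product API):

```python
def confidence_recommendation(score: float) -> str:
    """Map a confidence score (0-100) to the recommended action
    from the table above."""
    if score >= 90:
        return "Very likely correct - review and merge"
    if score >= 70:
        return "Probably correct - review carefully"
    if score >= 50:
        return "Uncertain - manual investigation recommended"
    return "Low confidence - use as a starting point only"
```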

## Best Practices

### Writing Good Ticket Descriptions

The AI works best with detailed descriptions:

**Good:**
```
Transaction auth failing for amounts over $10 million.
- Error: "Insufficient funds" even when balance is adequate
- Affected accounts: Corporate accounts with high limits
- Started after last month's release
- Error code: AUTH-5012
```

**Poor:**
```
auth broken
```

### Reviewing PRs

1. Always review AI-generated PRs before merging
2. Run your test suite on the fix branch
3. Check whether the analysis matches your understanding
4. Look for edge cases the AI might have missed

## Troubleshooting

### Issue Not Analyzed

1. Check webhook delivery in your issue tracker
2. Verify the endpoint URL is correct
3. Check the JIRA AI Fixer dashboard for errors

### Low Confidence Scores

1. Provide more detail in ticket descriptions
2. Ensure the relevant code is in an indexed repository
3. Check whether the issue type is supported

### PR Not Created

1. The repository must be connected to Gitea
2. The code must be in an indexed directory
3. The fix must be auto-applicable (simple changes work best)

## Contact

- **Dashboard**: https://jira-fixer.startdata.com.br
- **Portal**: https://aifixerportal.startdata.com.br
- **Support**: support@startdata.com.br

---

*JIRA AI Fixer - Intelligent Support Case Resolution*
*Created by StartData*
# JIRA AI Fixer

## Executive Proposal

**Date:** February 2026
**Version:** 1.1
**Classification:** Product Documentation

---

## Executive Summary

### The Problem

Support teams face growing challenges in resolving Support Cases:

| Challenge | Impact |
|-----------|--------|
| **Response time** | Initial analysis consumes hours of senior developer time |
| **Growing backlog** | Issues accumulate while the team focuses on urgent demands |
| **Variable quality** | Resolution depends on individual knowledge of the code |
| **Concentrated knowledge** | Few specialists know all modules |

### The Solution

An **Artificial Intelligence** system that:

1. **Monitors** new Support Cases in JIRA automatically
2. **Analyzes** the problem and identifies the affected source code
3. **Proposes** specific fixes in COBOL, SQL, and JCL
4. **Documents** the analysis directly in JIRA
5. **Creates** branches with fixes for human review

### Expected Result

```
┌─────────────────────────────────────────────────────────────────────────────┐
│                              BEFORE vs AFTER                                │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│   BEFORE                              AFTER                                 │
│   ──────                              ─────                                 │
│   Issue created                       Issue created                         │
│        ↓                                   ↓                                │
│   Dev analyzes (2-4h)                 AI analyzes (5min)                    │
│        ↓                                   ↓                                │
│   Search code (1-2h)                  Code identified                       │
│        ↓                                   ↓                                │
│   Investigate cause (2-4h)            Cause + suggested fix                 │
│        ↓                                   ↓                                │
│   Develop fix (2-4h)                  Dev reviews and approves              │
│        ↓                                   ↓                                │
│   Review + deploy                     Review + deploy                       │
│                                                                             │
│   TOTAL: 8-14 hours                   TOTAL: 2-4 hours                      │
│                                                                             │
│   ✅ 60-70% reduction in resolution time                                    │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

## Why Now?

### 1. Mature Technology
Language models (GPT-4, Claude, Llama) have reached sufficient quality for code analysis and generation, including legacy languages like COBOL.

### 2. Competitive Advantage
Leading companies are adopting AI to accelerate development. Those who don't will fall behind in productivity.

### 3. Manageable Volume
With typical support volumes, the risk is low and the environment is ideal for validating the solution before scaling.

### 4. Accessible Cost
Operational cost is minimal, especially with free and low-cost LLM options available.

---

## How It Works

### Simplified Flow

```
┌──────────────┐     ┌──────────────┐     ┌──────────────┐
│   Support    │     │      AI      │     │     Dev      │
│    Case      │────▶│   Analyzes   │────▶│   Reviews    │
│   (JIRA)     │     │  + Suggests  │     │  + Approves  │
└──────────────┘     └──────────────┘     └──────────────┘
      5min                 5min               30min-2h

                ┌─────────────────────┐
                │  JIRA Comment:      │
                │  - Root cause       │
                │  - Affected files   │
                │  - Proposed fix     │
                │  - Link to PR       │
                └─────────────────────┘
```

### Real Example

**Issue:** "Transaction being declined with code 51 even with available balance"

**AI Response (in 5 minutes):**

```
📋 AUTOMATIC ANALYSIS

🔍 Identified Cause:
The AUTH.CBL program is comparing the WS-AVAILABLE-BALANCE field
with format PIC 9(9)V99, but the value returned from the HOST uses
PIC 9(11)V99, causing truncation.

📁 Affected File:
- src/cobol/AUTH.CBL (lines 1234-1256)

💡 Proposed Fix:
Change the WS-AVAILABLE-BALANCE declaration to PIC 9(11)V99
and adjust the comparison in SECTION 3000-VALIDATE.

📊 Confidence: 87%

🔗 PR with fix: bitbucket.example.com/projects/PRODUCT/repos/...
```

### Security: AI Does Not Alter Production Code

```
┌─────────────────────────────────────────────────────────────────────────────┐
│                      SEPARATION OF RESPONSIBILITIES                         │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│   CLIENT Repository (production)                                            │
│   Product-Client-Fork                                                       │
│   ├── AI has access: READ ONLY                                              │
│   └── Changes: ONLY by developers                                           │
│                                                                             │
│   AI Repository (isolated)                                                  │
│   Product-Client-AI                                                         │
│   ├── AI has access: READ AND WRITE                                         │
│   └── Purpose: Branches with fix suggestions                                │
│                                                                             │
│   Approval Flow:                                                            │
│   1. AI creates branch in isolated repository                               │
│   2. AI opens Pull Request to client repository                             │
│   3. HUMAN developer reviews                                                │
│   4. HUMAN developer approves or rejects                                    │
│   5. Only then code goes to production                                      │
│                                                                             │
│   ✅ 100% of changes go through human review                                │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

## Investment

### Pricing Models

| Model | Description | Price |
|-------|-------------|-------|
| **SaaS** | Hosted, managed by vendor | $2,000 - $5,000/month |
| **On-Premise License** | Self-hosted, perpetual | $50,000 - $100,000 one-time |
| **Enterprise** | Custom deployment + support | Contact for quote |

### ROI Calculation

```
Senior developer hourly cost:   ~$40-80
Average time saved per issue:   6-10 hours
Monthly savings (10 issues):    $2,400 - $8,000

SaaS payback:                   Immediate positive ROI
Enterprise license payback:     12-24 months
```
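
The arithmetic behind those ranges is straightforward; as a quick sanity check of the figures above:

```python
# Reproduce the savings range from the ROI figures above.
hourly_cost = (40, 80)       # senior developer, $/hour
hours_saved = (6, 10)        # per issue
issues_per_month = 10

low = hourly_cost[0] * hours_saved[0] * issues_per_month    # 2400
high = hourly_cost[1] * hours_saved[1] * issues_per_month   # 8000
print(f"Monthly savings: ${low:,} - ${high:,}")
```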

### Intangible Benefits

| Benefit | Impact |
|---------|--------|
| **Standardization** | All issues analyzed with the same rigor |
| **Documentation** | Complete analysis history in JIRA |
| **Knowledge** | The AI learns patterns; analysis no longer depends on individuals |
| **Speed** | Initial response in minutes, not hours |
| **Team morale** | Devs focus on complex problems, not repetitive ones |

---

## Deployment Options

### Option 1: SaaS (Recommended for Quick Start)

```
✅ Fastest time-to-value (days, not months)
✅ No infrastructure to manage
✅ Automatic updates
✅ Included support
```

### Option 2: On-Premise (For Compliance Requirements)

```
✅ 100% of data stays in your infrastructure
✅ Air-gapped option (no internet required)
✅ Full control over updates
✅ One-time license cost
```

### Option 3: Hybrid

```
✅ You host, we manage
✅ Balance of control and convenience
✅ Flexible pricing
```

---

## Security and Compliance

### LLM Provider Options

| Provider | Data Location | Compliance Level |
|----------|---------------|------------------|
| **Azure OpenAI** | Your Azure tenant | Enterprise |
| **Local (Ollama)** | Your servers | Air-gapped |
| **OpenAI API** | OpenAI cloud | Standard |
| **OpenRouter** | Various | Development |

### Compliance Features

- ✅ Data segregation by client/product
- ✅ Complete audit trail
- ✅ Configurable log retention
- ✅ 100% on-premise deployment option
- ✅ Air-gapped deployment available
- ✅ No code sent to public training datasets

---

## Risks and Mitigations

| Risk | Probability | Mitigation |
|------|-------------|------------|
| **AI suggests incorrect fix** | Medium | Mandatory human review in 100% of cases |
| **Team resistance** | Low | Position as an assistant, not a replacement |
| **Code security** | Configurable | Choose Azure/local providers for compliance |
| **LLM cost increases** | Low | Multiple provider options |

### Conservative Approach

The system is designed for phased adoption:

```
Phase 1: Analysis and suggestion only
         AI comments in JIRA, doesn't create code

Phase 2: Code generation in an isolated repository
         A human decides whether to use it

Phase 3: Automatic Pull Requests
         A human still approves

Phase 4: Auto-merge (only for high-confidence fixes)
         Only after months of validation
```

---

## Implementation Timeline

```
┌─────────────────────────────────────────────────────────────────────────────┐
│                          IMPLEMENTATION ROADMAP                             │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│   Week 1-2        Week 3-4        Week 5-6        Week 7+                   │
│   ────────        ────────        ────────        ────────                  │
│   Setup +         Code            Business        Go-Live +                 │
│   Integrations    Indexing        Rules           Refinement                │
│                                                                             │
│   ✓ JIRA          ✓ COBOL         ✓ Modules       ✓ Production              │
│   ✓ Bitbucket     ✓ SQL           ✓ Validation    ✓ Adjustments             │
│   ✓ Portal        ✓ JCL           ✓ Testing       ✓ Support                 │
│                                                                             │
│                                      │                                      │
│                                      ▼                                      │
│                                    LIVE                                     │
│                                 ~5-7 weeks                                  │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

## Solution Differentiators

### Why JIRA AI Fixer?

| Aspect | Generic Tools | JIRA AI Fixer |
|--------|---------------|---------------|
| **JIRA Integration** | ❌ Manual | ✅ Automatic |
| **Domain knowledge** | ❌ Generic | ✅ Configurable business rules |
| **COBOL expertise** | ⚠️ Limited | ✅ Optimized for mainframe |
| **Support Case flow** | ❌ Doesn't exist | ✅ Native |
| **Deployment options** | ❌ Cloud only | ✅ SaaS, on-prem, or air-gapped |
| **Customization** | ❌ Generic | ✅ Fully configurable |

---

## Next Steps

### To Get Started

1. **Schedule a Demo** - See JIRA AI Fixer in action with your data
2. **Pilot Program** - 30-day trial with limited scope
3. **Full Deployment** - Production rollout with support

### Contact

- **Email:** sales@yourcompany.com
- **Demo Request:** https://jira-ai-fixer.yourcompany.com/demo

---

## Conclusion

**JIRA AI Fixer** represents an opportunity to:

✅ **Increase productivity** of the support team by 60%+
✅ **Reduce response time** from hours to minutes
✅ **Standardize quality** of analyses
✅ **Retain knowledge** independently of individuals
✅ **Choose your deployment** - SaaS, on-prem, or air-gapped

The timing is ideal: mature technology, flexible deployment options, and proven ROI.

---

**JIRA AI Fixer - Intelligent Support Case Resolution**

*Ready to transform your support workflow?*
# JIRA AI Fixer - Admin Portal

**Version:** 1.0
**Date:** February 2026
**Classification:** Product Documentation

---

## 1. Overview

The JIRA AI Fixer Admin Portal is a modern, intuitive web interface for managing all system configuration without manually editing code or configuration files.

### 1.1 Objectives

- **Zero code** for configuration
- **Intuitive interface** for medium and large enterprises
- **Multi-tenant** to support multiple teams
- **Complete auditing** of all actions
- **SSO integration** with corporate providers

---

## 2. Portal Screens

### 2.1 Main Dashboard

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  🤖 JIRA AI Fixer                                 admin@company.com ⚙️ 🔔   │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐                │
│  │Dashboard│ │ Issues  │ │  Repos  │ │ Modules │ │Settings │                │
│  └─────────┘ └─────────┘ └─────────┘ └─────────┘ └─────────┘                │
│                                                                             │
│  ┌─────────────────────────────────────────────────────────────────────┐    │
│  │  📊 DASHBOARD                                                       │    │
│  │                                                                     │    │
│  │  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐               │    │
│  │  │      12      │  │     78%      │  │    2.3min    │               │    │
│  │  │ Issues/month │  │ Success Rate │  │   Avg Time   │               │    │
│  │  └──────────────┘  └──────────────┘  └──────────────┘               │    │
│  │                                                                     │    │
│  │  📈 Last 30 days                                                    │    │
│  │  ████████████████████░░░░░░ 78% fixes accepted                      │    │
│  │                                                                     │    │
│  │  ┌─ Recent Activity ───────────────────────────────────────────┐    │    │
│  │  │ ✅ SUPPORT-4521 - Fix accepted          2 hours ago         │    │    │
│  │  │ ⏳ SUPPORT-4519 - Awaiting review       5 hours ago         │    │    │
│  │  │ ❌ SUPPORT-4515 - Fix rejected          1 day ago           │    │    │
│  │  └─────────────────────────────────────────────────────────────┘    │    │
│  │                                                                     │    │
│  └─────────────────────────────────────────────────────────────────────┘    │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

**Displayed metrics:**
- Issues processed (day/week/month)
- Success rate (accepted vs. rejected fixes)
- Average analysis time
- Trend chart
- Recent activity

---

### 2.2 Settings - Integrations

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  ⚙️ SETTINGS > Integrations                                                 │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  ┌─ JIRA ────────────────────────────────────────────────────────────────┐  │
│  │ ✅ Connected                                                          │  │
│  │                                                                       │  │
│  │ Server URL                                                            │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ https://jira.yourcompany.com                                    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ API Token                                                             │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ ••••••••••••••••••••••••••••••••                          👁️    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ Webhook URL (copy and configure in JIRA)                              │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ https://jira-fixer.yourcompany.com/api/webhook/jira        📋   │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ Monitored Projects                                                    │  │
│  │ ☑️ PROJECT-A   ☑️ PROJECT-B   ☐ PROJECT-C                             │  │
│  │                                                                       │  │
│  │ [ 🔄 Test Connection ]                                                │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Bitbucket ───────────────────────────────────────────────────────────┐  │
│  │ ✅ Connected                                                          │  │
│  │                                                                       │  │
│  │ Server URL                                                            │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ https://bitbucket.yourcompany.com                               │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ Access Token                                                          │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ ••••••••••••••••••••••••••••••••                          👁️    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ [ 🔄 Test Connection ]                                                │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ LLM Provider ────────────────────────────────────────────────────────┐  │
│  │                                                                       │  │
│  │ Provider                                                              │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ Azure OpenAI                                               ▼    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │ Options: Azure OpenAI | OpenAI | OpenRouter | Self-hosted (Ollama)    │  │
│  │                                                                       │  │
│  │ Endpoint (for Azure/Self-hosted)                                      │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ https://your-resource.openai.azure.com                          │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ API Key                                                               │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ ••••••••••••••••••••••••••••••••                          👁️    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ Model                                                                 │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ gpt-4o                                                     ▼    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ [ 🔄 Test Connection ]                                                │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Embeddings ──────────────────────────────────────────────────────────┐  │
│  │                                                                       │  │
│  │ Provider                                                              │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ Self-hosted (MiniLM-L6)                                    ▼    │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │ Options: Self-hosted | Azure OpenAI | OpenAI                          │  │
│  │                                                                       │  │
│  │ ℹ️ Self-hosted embeddings are free and keep data on your servers.     │  │
│  │                                                                       │  │
│  │ [ 🔄 Test Connection ]                                                │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  [ 💾 Save All ]                                                            │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

### 2.3 Repository Management

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  📁 REPOSITORIES                                              [ + Add New ] │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  ┌───────────────────────────────────────────────────────────────────────┐  │
│  │ 📦 Product-Client-Fork                                 [ ⚙️ ] [ 🗑️ ]  │  │
│  │                                                                       │  │
│  │ URL: bitbucket.company.com/projects/PROD/repos/Product-Client-Fork    │  │
│  │                                                                       │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ Status     │ ✅ Indexed                                         │  │  │
│  │ │ Files      │ 2,847 files indexed                                │  │  │
│  │ │ Last Sync  │ 02/18/2026 11:30                                   │  │  │
│  │ │ AI Fork    │ Product-Client-AI ✅                               │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ Detected languages:                                                   │  │
│  │ ████████████████████░░░░░░░░░░ COBOL 68%                              │  │
│  │ ██████░░░░░░░░░░░░░░░░░░░░░░░░ SQL   22%                              │  │
│  │ ███░░░░░░░░░░░░░░░░░░░░░░░░░░░ JCL   10%                              │  │
│  │                                                                       │  │
│  │ [ 🔄 Re-index Now ]  [ 📊 View Details ]  [ ⏰ Schedule Sync ]        │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

### 2.4 Business Rules Editor

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  🧠 BUSINESS RULES                                          [ + New Module ]│
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  Configured modules:                                                        │
│  ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐                            │
│  │ Author. │ │Clearing │ │HostComm │ │  Batch  │                            │
│  │    ●    │ │         │ │         │ │         │                            │
│  └─────────┘ └─────────┘ └─────────┘ └─────────┘                            │
│                                                                             │
│  ═══════════════════════════════════════════════════════════════════════   │
│                                                                             │
│  📌 Module: Authorization                                                   │
│                                                                             │
│  ┌─ Description ─────────────────────────────────────────────────────────┐  │
│  │ Card transaction authorization module. Responsible for validation,   │  │
│  │ HOST communication, and response generation.                          │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Related Programs ────────────────────────────────────────────────────┐  │
│  │                                                                       │  │
│  │ ┌──────────────┐ ┌──────────────┐ ┌──────────────┐                    │  │
│  │ │   AUTH*    ✕ │ │   VALID*   ✕ │ │   HOST*    ✕ │  [ + Add ]         │  │
│  │ └──────────────┘ └──────────────┘ └──────────────┘                    │  │
│  │                                                                       │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Detection Keywords ──────────────────────────────────────────────────┐  │
│  │                                                                       │  │
│  │ ┌──────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────┐         │  │
│  │ │authorization✕│ │ decline  ✕ │ │ code 51  ✕ │ │ timeout  ✕ │  [+]    │  │
│  │ └──────────────┘ └────────────┘ └────────────┘ └────────────┘         │  │
│  │                                                                       │  │
│  │ ℹ️ When an issue contains these words, the system automatically       │  │
│  │    associates it with the Authorization module.                       │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Context Rules (instructions for AI) ─────────────────────────────────┐  │
│  │                                                                       │  │
│  │ 📋 Rule 1                                              [ ✏️ ] [ 🗑️ ]  │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ Transactions above $10,000 require additional validation in     │  │  │
│  │ │ program VALIDATE through SECTION 5000-VALIDATE-HIGH-VALUE       │  │  │
│  │ │ before sending to HOST.                                         │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ 📋 Rule 2                                              [ ✏️ ] [ 🗑️ ]  │  │
│  │ ┌─────────────────────────────────────────────────────────────────┐  │  │
│  │ │ Response codes follow ISO 8583 standard:                        │  │  │
│  │ │ - 00: Approved                                                  │  │  │
│  │ │ - 51: Insufficient funds (check WS-AVAILABLE-BALANCE)           │  │  │
│  │ │ - 14: Invalid card                                              │  │  │
│  │ │ - 91: Issuer unavailable                                        │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ [ + Add New Rule ]                                                    │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  ┌─ Restrictions (files AI cannot modify) ───────────────────────────────┐  │
│  │                                                                       │  │
│  │ ┌────────────────┐ ┌───────────────────┐                              │  │
│  │ │ /interfaces/* ✕│ │ /copybooks/HOST* ✕│  [ + Add ]                   │  │
│  │ └────────────────┘ └───────────────────┘                              │  │
│  │                                                                       │  │
│  │ ⚠️ Files in these folders will only be analyzed, never modified.      │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  [ Cancel ]                                           [ 💾 Save Module ]    │
└─────────────────────────────────────────────────────────────────────────────┘
```
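
Conceptually, the keyword detection shown in this screen amounts to a simple lookup. A minimal sketch (module name and keywords taken from the Authorization example above; the data layout and function name are illustrative, not the product's internals):

```python
# Sketch of keyword-based module detection as described in the mockup.
MODULES = {
    "Authorization": {
        "keywords": ["authorization", "decline", "code 51", "timeout"],
        "programs": ["AUTH*", "VALID*", "HOST*"],
    },
}

def detect_module(issue_text: str):
    """Return the first module whose keywords appear in the issue text."""
    text = issue_text.lower()
    for name, cfg in MODULES.items():
        if any(kw in text for kw in cfg["keywords"]):
            return name
    return None

print(detect_module("Transaction declined code 51 with available balance"))
```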

---

### 2.5 Issues View

```
┌─────────────────────────────────────────────────────────────────────────────┐
│  📋 ANALYZED ISSUES                                                         │
├─────────────────────────────────────────────────────────────────────────────┤
│                                                                             │
│  🔍 Search...                         Status: [ All ▼ ]  [ 📅 Period ]      │
│                                                                             │
│  ┌───────────────────────────────────────────────────────────────────────┐  │
│  │                                                                       │  │
│  │ ┌─ SUPPORT-4521 ──────────────────────────────────────────────────┐  │  │
│  │ │                                                                 │  │  │
│  │ │ Transaction declined code 51 with available balance             │  │  │
│  │ │                                                                 │  │  │
│  │ │ ┌────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────┐     │  │  │
│  │ │ │ ✅ Accepted│ │  🎯 87%    │ │ ⏱️ 2m 34s  │ │ 📁 1 file  │     │  │  │
│  │ │ └────────────┘ └────────────┘ └────────────┘ └────────────┘     │  │  │
│  │ │                                                                 │  │  │
│  │ │ Module: Authorization          Created: 02/18/2026 09:15        │  │  │
│  │ │                                                                 │  │  │
│  │ │ [ 👁️ View Full Analysis ]  [ 📝 View PR ]  [ 🔗 Open JIRA ]     │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  │ ┌─ SUPPORT-4519 ──────────────────────────────────────────────────┐  │  │
│  │ │                                                                 │  │  │
│  │ │ Formatting error in clearing file                               │  │  │
│  │ │                                                                 │  │  │
│  │ │ ┌────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────┐     │  │  │
│  │ │ │ ⏳ Review  │ │  🎯 72%    │ │ ⏱️ 3m 12s  │ │ 📁 2 files │     │  │  │
│  │ │ └────────────┘ └────────────┘ └────────────┘ └────────────┘     │  │  │
│  │ │                                                                 │  │  │
│  │ │ Module: Clearing               Created: 02/18/2026 06:45        │  │  │
│  │ │                                                                 │  │  │
│  │ │ [ 👁️ View Full Analysis ]  [ 📝 View PR ]  [ 🔗 Open JIRA ]     │  │  │
│  │ └─────────────────────────────────────────────────────────────────┘  │  │
│  │                                                                       │  │
│  └───────────────────────────────────────────────────────────────────────┘  │
│                                                                             │
│  Showing 1-10 of 47 issues                     [ ← Previous ]  [ Next → ]   │
│                                                                             │
└─────────────────────────────────────────────────────────────────────────────┘
```
|
|
||||||
|
|
||||||
---

### 2.6 Analysis Details

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ 👁️ ANALYSIS: SUPPORT-4521 [ ← Back ]│
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌─ Issue Information ───────────────────────────────────────────────────┐ │
│ │ │ │
│ │ Title: Transaction declined code 51 with available balance │ │
│ │ Status: ✅ Fix Accepted │ │
│ │ Confidence: 87% │ │
│ │ Analysis time: 2 minutes 34 seconds │ │
│ │ Analyzed at: 02/18/2026 09:17:34 │ │
│ │ │ │
│ └───────────────────────────────────────────────────────────────────────┘ │
│ │
│ ┌─ Original Description ────────────────────────────────────────────────┐ │
│ │ │ │
│ │ Client reports that transactions are being declined with code 51 │ │
│ │ (insufficient funds) even when the customer has available balance. │ │
│ │ Occurs on transactions above $100,000.00. │ │
│ │ │ │
│ │ Stack trace: │ │
│ │ AUTH - SECTION 3000-VALIDATE - EVALUATE WS-RESPONSE-CODE │ │
│ │ │ │
│ └───────────────────────────────────────────────────────────────────────┘ │
│ │
│ ┌─ AI Analysis ─────────────────────────────────────────────────────────┐ │
│ │ │ │
│ │ 🔍 IDENTIFIED ROOT CAUSE │ │
│ │ │ │
│ │ The AUTH.CBL program is comparing the WS-AVAILABLE-BALANCE field │ │
│ │ with format PIC 9(9)V99 (maximum 999,999,999.99), but the value │ │
│ │ returned from HOST uses PIC 9(11)V99 (max 99,999,999,999.99). │ │
│ │ │ │
│ │ This truncates the high-order digits of balances that exceed │ │
│ │ the smaller field, making the balance appear as insufficient. │ │
│ │ │ │
|
|
||||||
│ │ 📁 AFFECTED FILES │ │
|
|
||||||
│ │ • src/cobol/AUTH.CBL (lines 1234-1256) │ │
|
|
||||||
│ │ │ │
|
|
||||||
│ └───────────────────────────────────────────────────────────────────────┘ │
|
|
||||||
│ │
|
|
||||||
│ ┌─ Proposed Fix ────────────────────────────────────────────────────────┐ │
|
|
||||||
│ │ │ │
|
|
||||||
│ │ ```cobol │ │
|
|
||||||
│ │ * BEFORE (line 1234) │ │
|
|
||||||
│ │ 05 WS-AVAILABLE-BALANCE PIC 9(9)V99. │ │
|
|
||||||
│ │ │ │
|
|
||||||
│ │ * AFTER │ │
|
|
||||||
│ │ 05 WS-AVAILABLE-BALANCE PIC 9(11)V99. │ │
|
|
||||||
│ │ ``` │ │
|
|
||||||
│ │ │ │
|
|
||||||
│ │ Also adjust SECTION 3000-VALIDATE to use the new size. │ │
|
|
||||||
│ │ │ │
|
|
||||||
│ └───────────────────────────────────────────────────────────────────────┘ │
|
|
||||||
│ │
|
|
||||||
│ ┌─ Links ───────────────────────────────────────────────────────────────┐ │
|
|
||||||
│ │ │ │
|
|
||||||
│ │ 🔗 Issue in JIRA: jira.company.com/browse/SUPPORT-4521 │ │
|
|
||||||
│ │ 📝 Pull Request: bitbucket.company.com/.../pull-requests/142 │ │
|
|
||||||
│ │ 💬 AI Comment: View in JIRA │ │
|
|
||||||
│ │ │ │
|
|
||||||
│ └───────────────────────────────────────────────────────────────────────┘ │
|
|
||||||
│ │
|
|
||||||
│ [ 🔄 Re-analyze ] [ 📥 Export PDF ] │
|
|
||||||
└─────────────────────────────────────────────────────────────────────────────┘
|
|
||||||
```
|
|
||||||
|
|
||||||
---

## 3. Portal Technology Stack

### 3.1 Frontend
```yaml
Framework: React 18 + TypeScript
Styling: Tailwind CSS + shadcn/ui
Components:
  - Tables with sorting and filters
  - Forms with validation
  - Charts (Recharts)
  - Code editor (Monaco Editor)
  - Toast notifications
State Management: React Query + Zustand
Routing: React Router v6
Build: Vite
```

### 3.2 Backend (API)
```yaml
Framework: FastAPI (Python 3.11+)
Documentation: Automatic OpenAPI/Swagger
Authentication: JWT + OAuth2/OIDC
Rate Limiting: slowapi
Validation: Pydantic v2
```

### 3.3 Database
```yaml
Primary: PostgreSQL 15+
Cache: Redis 7+
Vector DB: Qdrant (embeddings)
Migrations: Alembic
```

### 3.4 Authentication
```yaml
Supported options:
  - Azure AD (SAML/OIDC)
  - Okta
  - Google Workspace
  - Email/Password with MFA (TOTP)

Permissions (RBAC):
  - Admin: Full access
  - Editor: Configure rules, view everything
  - Viewer: View only
  - API: Programmatic access only
```
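
The role matrix above maps directly to a permission check. A minimal sketch; the permission names (`configure`, `view`, `api`) are illustrative assumptions, not the portal's actual API:

```python
# Hypothetical RBAC lookup mirroring the four roles above.
ROLE_PERMISSIONS = {
    "Admin":  {"configure", "view", "api"},
    "Editor": {"configure", "view"},
    "Viewer": {"view"},
    "API":    {"api"},
}

def is_allowed(role: str, permission: str) -> bool:
    # unknown roles get no permissions at all
    return permission in ROLE_PERMISSIONS.get(role, set())
```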
---

## 4. Configuration Simplicity

| Action | Method | Time |
|--------|--------|------|
| Connect JIRA | Paste URL + Token | 2 minutes |
| Connect Bitbucket | Paste URL + Token | 2 minutes |
| Change LLM provider | Select from dropdown | 30 seconds |
| Add repository | Paste URL + Configure AI fork | 5 minutes |
| Create business rule | Visual editor | 5-10 minutes |
| Add restriction | Type path | 30 seconds |
| View logs | Click on tab | Immediate |
| Export report | "Export" button | Immediate |

**Principle: Zero code for any configuration.**

---

## 5. Multi-Tenant (Multiple Teams/Products)

The portal supports multiple isolated tenants:

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ MULTI-TENANT ARCHITECTURE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ Tenant: Team Alpha Tenant: Team Beta │
│ ┌─────────────────────────┐ ┌─────────────────────────┐ │
│ │ - Alpha Repos │ │ - Beta Repos │ │
│ │ - Alpha Rules │ │ - Beta Rules │ │
│ │ - Alpha Users │ │ - Beta Users │ │
│ │ - Isolated Logs │ │ - Isolated Logs │ │
│ └─────────────────────────┘ └─────────────────────────┘ │
│ │ │ │
│ └────────────────┬───────────────────┘ │
│ │ │
│ ┌──────────▼──────────┐ │
│ │ Shared │ │
│ │ Infrastructure │ │
│ │ (LLM, Embeddings) │ │
│ └─────────────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```

**Guaranteed isolation:**
- Data from one tenant never visible to another
- Independent configurations
- Separate billing (if applicable)
- Audit logs per tenant
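
The first isolation guarantee amounts to a mandatory tenant filter on every read. A minimal in-memory illustration; the `tenant_id` field name is an assumption about the schema:

```python
# Sketch of tenant scoping: every stored record carries a tenant_id,
# and reads are always filtered by it, so one tenant's rows are never
# visible to another.
def records_for_tenant(records: list[dict], tenant_id: str) -> list[dict]:
    return [r for r in records if r["tenant_id"] == tenant_id]
```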
---

## 6. Responsiveness

The portal is responsive and works on:

| Device | Support |
|--------|---------|
| Desktop (1920px+) | ✅ Optimized |
| Laptop (1366px) | ✅ Optimized |
| Tablet (768px) | ✅ Adapted |
| Mobile (375px) | ⚠️ View only |

---

## 7. Accessibility

- Full keyboard navigation
- Screen reader compatible (ARIA)
- Adequate contrast (WCAG 2.1 AA)
- Resizable text

---

## 8. Roadmap

### Version 1.0 (MVP)
- ✅ Dashboard with metrics
- ✅ JIRA/Bitbucket integration
- ✅ LLM configuration
- ✅ Repository management
- ✅ Business rules editor
- ✅ Issue list view

### Version 1.1
- [ ] Email notifications
- [ ] Slack/Teams integration
- [ ] Scheduled reports
- [ ] API rate limiting dashboard

### Version 2.0
- [ ] Custom AI prompts editor
- [ ] A/B testing for prompts
- [ ] Advanced analytics
- [ ] Workflow automation

---

**JIRA AI Fixer - Intelligent Support Case Resolution**

*For questions, contact: support@yourcompany.com*

@ -1,594 +0,0 @@
# JIRA AI Fixer - Technical Document

**Version:** 1.1
**Date:** February 2026
**Classification:** Product Documentation

---

## 1. Overview

### 1.1 Objective
JIRA AI Fixer is an artificial intelligence system that integrates with JIRA and Bitbucket to automate Support Case analysis, identify affected modules in source code (COBOL/SQL/JCL), propose fixes, and automatically document solutions.

### 1.2 Scope
- **Languages:** COBOL, SQL, JCL (mainframe-focused)
- **Issues:** Support Cases in JIRA
- **Repositories:** Any Bitbucket Server repositories
- **Flexibility:** Configurable per client/product

### 1.3 High-Level Architecture

```
┌─────────────────────────────────────────────────────────────────────────────┐
│ JIRA AI FIXER - ARCHITECTURE │
├─────────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌───────────────┐ │
│ │ JIRA │ │
│ │ Server │ │
│ │ │ │
│ └───────┬───────┘ │
│ │ Webhook (issue_created, issue_updated) │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────────┐ │
│ │ EVENT PROCESSOR │ │
│ │ ┌─────────────┐ ┌─────────────┐ ┌─────────────────────────┐ │ │
│ │ │ Queue │ │ Filter │ │ Issue Classifier │ │ │
│ │ │ (Redis) │──▶ (Support │──▶ (Product, Module, │ │ │
│ │ │ │ │ Cases) │ │ Severity) │ │ │
│ │ └─────────────┘ └─────────────┘ └─────────────────────────┘ │ │
│ └───────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────────┐ │
│ │ CODE INTELLIGENCE ENGINE │ │
│ │ │ │
│ │ ┌─────────────────┐ ┌─────────────────┐ ┌──────────────┐ │ │
│ │ │ Bitbucket │ │ Code Index │ │ Context │ │ │
│ │ │ Connector │ │ (Embeddings) │ │ Builder │ │ │
│ │ │ │ │ │ │ │ │ │
│ │ │ │ │ - COBOL procs │ │ - CALLs │ │ │
│ │ │ │ │ - SQL tables │ │ - COPYBOOKs │ │ │
│ │ │ │ │ - JCL jobs │ │ - Includes │ │ │
│ │ └─────────────────┘ └─────────────────┘ └──────────────┘ │ │
│ │ │ │
│ │ Repositories: │ │
│ │ ├── Product-Base │ │
│ │ │ └── Product-Client-Fork │ │
│ │ │ └── Product-Client-AI (AI workspace) ← NEW │ │
│ └───────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ▼ │
│ ┌───────────────────────────────────────────────────────────────────┐ │
│ │ FIX GENERATION ENGINE │ │
│ │ │ │
│ │ ┌─────────────────┐ ┌─────────────────┐ ┌──────────────┐ │ │
│ │ │ LLM Engine │ │ Fix Validator │ │ Output │ │ │
│ │ │ │ │ │ │ Generator │ │ │
│ │ │ - GPT-4o │ │ - Syntax check │ │ │ │ │
│ │ │ - Claude │ │ - COBOL rules │ │ - JIRA │ │ │
│ │ │ - Llama │ │ - SQL lint │ │ comment │ │ │
│ │ │ │ │ - JCL validate │ │ - PR/Branch │ │ │
│ │ └─────────────────┘ └─────────────────┘ └──────────────┘ │ │
│ └───────────────────────────────────────────────────────────────────┘ │
│ │ │
│ ┌───────────────┴───────────────┐ │
│ ▼ ▼ │
│ ┌──────────────┐ ┌──────────────┐ │
│ │ JIRA │ │ Bitbucket │ │
│ │ Comment │ │ Pull Request│ │
│ │ (Analysis + │ │ (AI Fork) │ │
│ │ Suggestion)│ │ │ │
│ └──────────────┘ └──────────────┘ │
│ │
└─────────────────────────────────────────────────────────────────────────────┘
```

---

## 2. Detailed Components

### 2.1 Event Processor

#### 2.1.1 JIRA Webhook Receiver
```yaml
Endpoint: POST /api/webhook/jira
Events:
  - jira:issue_created
  - jira:issue_updated
Filters:
  - issueType: "Support Case" (configurable)
  - project: Configurable per installation
Authentication: Webhook Secret (HMAC-SHA256)
```
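
The HMAC-SHA256 validation named above can be sketched with the standard library. How the signature is transported (header name, encoding) is deployment-specific and assumed here:

```python
import hashlib
import hmac

# Sketch of webhook authentication: the sender signs the raw payload
# with the shared webhook secret; the receiver recomputes and compares.
def sign_payload(secret: bytes, payload: bytes) -> str:
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify_webhook(secret: bytes, payload: bytes, signature_hex: str) -> bool:
    # constant-time comparison avoids timing side channels
    return hmac.compare_digest(sign_payload(secret, payload), signature_hex)
```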
#### 2.1.2 Queue System
```yaml
Technology: Redis + Bull Queue
Queues:
  - jira-events: Raw JIRA events
  - analysis-jobs: Pending analysis jobs
  - fix-generation: Fix generation tasks
Retry Policy:
  - Max attempts: 3
  - Backoff: exponential (1min, 5min, 15min)
Dead Letter Queue: jira-events-dlq
```
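
The retry policy above (3 attempts, then the dead-letter queue) can be sketched as:

```python
# Delays before retries 1..3, matching the configured 1min/5min/15min
# schedule above; exhausting them sends the job to jira-events-dlq.
BACKOFF_MINUTES = [1, 5, 15]

def retry_delay_seconds(attempt: int):
    """Delay before retry `attempt` (1-based); None means give up
    and route the job to the dead-letter queue."""
    if 1 <= attempt <= len(BACKOFF_MINUTES):
        return BACKOFF_MINUTES[attempt - 1] * 60
    return None
```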
#### 2.1.3 Issue Classifier
Responsible for extracting metadata from issues:

```python
class IssueClassifier:
    def classify(self, issue: JiraIssue) -> ClassifiedIssue:
        return ClassifiedIssue(
            product=self._detect_product(issue),        # Configurable
            module=self._detect_module(issue),          # Authorization, Clearing, etc.
            severity=self._detect_severity(issue),      # P1, P2, P3
            keywords=self._extract_keywords(issue),     # Technical terms
            stack_trace=self._parse_stack_trace(issue),
            affected_programs=self._detect_programs(issue)
        )
```

### 2.2 Code Intelligence Engine

#### 2.2.1 Bitbucket Connector
```yaml
Supported: Bitbucket Server (REST API 1.0)
Authentication: Personal Access Token or OAuth

Operations:
  - Clone/Pull: Sparse checkout (relevant directories only)
  - Read: Specific file contents
  - Branches: Create/list branches in AI fork
  - Pull Requests: Create PR from AI fork → client fork
```

**Access Structure per Repository:**

| Repository | AI Permission | Usage |
|------------|---------------|-------|
| Product-Base | READ | Reference, standards |
| Product-Client-Fork | READ | Current client code |
| Product-Client-AI | WRITE | AI branches and commits |

#### 2.2.2 Code Index (Embeddings)

**Embedding Providers (Configurable):**

| Provider | Use Case | Compliance |
|----------|----------|------------|
| Azure OpenAI | Enterprise (data stays in Azure) | High |
| OpenAI API | Standard deployments | Medium |
| Local (MiniLM) | Air-gapped / cost-sensitive | Maximum |

```yaml
Models:
  - Azure: text-embedding-3-large (3072 dims)
  - OpenAI: text-embedding-3-large (3072 dims)
  - Local: all-MiniLM-L6-v2 (384 dims)

Vector DB: Qdrant (self-hosted)
Index separated by: product + client
```

**COBOL Code Indexing:**
```yaml
Granularity: By PROGRAM-ID / SECTION / PARAGRAPH
Extracted metadata:
  - PROGRAM-ID
  - COPY statements (dependencies)
  - CALL statements (called programs)
  - FILE-CONTROL (accessed files)
  - SQL EXEC (tables/queries)
  - Working Storage (main variables)
```
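
A rough regex-based sketch of this extraction. A production indexer would need a real COBOL parser; the patterns below are simplifying assumptions (uppercase names, literal CALL targets):

```python
import re

# Extract a few of the metadata fields listed above from COBOL source.
def index_cobol(source: str) -> dict:
    m = re.search(r"PROGRAM-ID\.\s+([A-Z0-9-]+)", source)
    return {
        "program_id": m.group(1) if m else None,
        "copybooks": re.findall(r"\bCOPY\s+([A-Z0-9-]+)", source),
        "calls": re.findall(r"\bCALL\s+'([A-Z0-9-]+)'", source),
    }
```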
**SQL Indexing:**
```yaml
Granularity: By table/view/procedure
Extracted metadata:
  - Object name
  - Columns and types
  - Foreign keys
  - Referencing procedures
```

**JCL Indexing:**
```yaml
Granularity: By JOB / STEP
Extracted metadata:
  - JOB name
  - Executed PGMs
  - DD statements (datasets)
  - Passed PARMs
  - Dependencies (JCL INCLUDEs)
```
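
The same regex approach sketches JCL metadata extraction (again a simplification; real JCL has continuations and in-stream data these patterns ignore):

```python
import re

# Pull the JOB name, executed PGMs, and DSN dataset names listed above.
def index_jcl(source: str) -> dict:
    job = re.search(r"^//(\S+)\s+JOB\b", source, re.MULTILINE)
    return {
        "job": job.group(1) if job else None,
        "programs": re.findall(r"\bEXEC\s+PGM=([A-Z0-9]+)", source),
        "datasets": re.findall(r"\bDSN=([A-Z0-9.]+)", source),
    }
```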
#### 2.2.3 Context Builder

Assembles relevant context for LLM analysis:

```python
class ContextBuilder:
    def build_context(self, issue: ClassifiedIssue) -> AnalysisContext:
        # 1. Search programs mentioned in the issue
        mentioned_programs = self._search_by_keywords(issue.keywords)

        # 2. Search similar programs from past issues
        similar_issues = self._find_similar_issues(issue)

        # 3. Expand dependencies (COPYBOOKs, CALLs)
        dependencies = self._expand_dependencies(mentioned_programs)

        # 4. Get configured business rules
        business_rules = self._get_business_rules(issue.product)

        # 5. Build final context (respecting token limit)
        return AnalysisContext(
            primary_code=mentioned_programs[:5],   # Max 5 main programs
            dependencies=dependencies[:10],        # Max 10 dependencies
            similar_fixes=similar_issues[:3],      # Max 3 examples
            business_rules=business_rules,
            total_tokens=self._count_tokens()
        )
```

### 2.3 Fix Generation Engine

#### 2.3.1 LLM Engine

**Supported Providers:**

| Provider | Models | Use Case |
|----------|--------|----------|
| Azure OpenAI | GPT-4o, GPT-4 Turbo | Enterprise compliance |
| OpenAI | GPT-4o, GPT-4 Turbo | Standard deployment |
| OpenRouter | Llama 3.3, Claude, Mixtral | Cost-effective / free |
| Local | Ollama (Llama, CodeLlama) | Air-gapped |

```yaml
Configuration:
  temperature: 0.2   # Low for code
  max_tokens: 4096
  top_p: 0.95
Gateway: LiteLLM (unified interface)
```

**COBOL Prompt Template:**
```
You are an expert in mainframe payment systems and COBOL programming.

## System Context
{business_rules}

## Reported Issue
{issue_description}

## Current Code
{code_context}

## Similar Fix History
{similar_fixes}

## Task
Analyze the issue and:
1. Identify the probable root cause
2. Locate the affected program(s)
3. Propose a specific fix
4. Explain the impact of the change

## Rules
- Maintain COBOL-85 compatibility
- Preserve existing copybook structure
- Do not change interfaces with other systems without explicit mention
- Document all proposed changes

## Response Format
{response_format}
```
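
The template is plain text with `{placeholder}` slots, so rendering can be as simple as `str.format`. A minimal sketch over a shortened excerpt of the template (assumed rendering mechanism, not necessarily the system's actual one):

```python
# Shortened excerpt of the COBOL prompt template above.
PROMPT_TEMPLATE = (
    "You are an expert in mainframe payment systems and COBOL programming.\n"
    "\n"
    "## Reported Issue\n"
    "{issue_description}\n"
    "\n"
    "## Current Code\n"
    "{code_context}\n"
)

def render_prompt(template: str, **fields: str) -> str:
    # str.format fills each {placeholder} slot with the analysis context
    return template.format(**fields)
```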
#### 2.3.2 Fix Validator

**COBOL Validations:**
```yaml
Syntax:
  - Compilation with GnuCOBOL (syntax check)
  - Verification of referenced copybooks

Semantics:
  - CALLs to existing programs
  - Variables declared before use
  - Compatible PIC clauses

Style:
  - Standard indentation (Area A/B)
  - Naming conventions
  - Mandatory comments
```
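
The "Compatible PIC clauses" check can be sketched by comparing the integer-digit capacity of two numeric PIC clauses, which is exactly the truncation pattern seen in SUPPORT-4521. Simplified assumption: unsigned, unedited PICs only:

```python
import re

def pic_int_digits(pic: str) -> int:
    """Integer-digit capacity of a simple numeric PIC, e.g. 9(9)V99 -> 9."""
    int_part = pic.split("V")[0]
    m = re.fullmatch(r"9\((\d+)\)", int_part)
    return int(m.group(1)) if m else int_part.count("9")

def may_truncate(source_pic: str, target_pic: str) -> bool:
    # moving a wider field into a narrower one drops high-order digits
    return pic_int_digits(source_pic) > pic_int_digits(target_pic)
```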
**SQL Validations:**
```yaml
- Syntax check with SQL parser
- Verification of existing tables/columns
- Performance analysis (EXPLAIN)
```

**JCL Validations:**
```yaml
- JCL syntax check
- Referenced datasets exist
- Referenced PGMs exist
```

---

## 3. Repository Structure (AI Fork)

### 3.1 AI Fork Creation

```bash
# Proposed structure in Bitbucket
projects/
├── PRODUCT/
│   ├── Product-Base           # Base product (existing)
│   ├── Product-Client-Fork    # Client fork (existing)
│   └── Product-Client-AI      # AI fork (NEW)
```

### 3.2 Branch Flow

```
Product-Client-Fork (client)
    │
    │ fork
    ▼
Product-Client-AI (AI workspace)
    │
    ├── main (sync with client)
    │
    └── ai-fix/JIRA-1234-description
            │
            │ Pull Request
            ▼
Product-Client-Fork (client)
    │
    │ Review + Approve
    ▼
merge
```
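
Deriving the `ai-fix/JIRA-1234-description` branch name used in this flow can be sketched as follows; the exact slug rules (lowercase, hyphens, length cap) are assumptions:

```python
import re

# Build an AI-fix branch name from an issue key and title.
def branch_name(issue_key: str, title: str) -> str:
    # collapse anything that is not a lowercase letter/digit into hyphens
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"ai-fix/{issue_key}-{slug[:40]}"
```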
### 3.3 Commit Convention

```
[AI-FIX] JIRA-1234: Short fix description

Problem:
- Original problem description

Solution:
- What was changed and why

Modified files:
- src/cobol/PROGRAM.CBL (lines 1234-1256)

Confidence: 85%
Generated by: JIRA AI Fixer v1.0

Co-authored-by: ai-fixer@company.com
```
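
Assembling a message in this convention can be sketched as:

```python
# Build a commit message following the [AI-FIX] convention above; the
# confidence value and file list come from the analysis result.
def build_commit_message(issue_key, summary, problem, solution, files, confidence):
    lines = [
        f"[AI-FIX] {issue_key}: {summary}",
        "",
        "Problem:",
        f"- {problem}",
        "",
        "Solution:",
        f"- {solution}",
        "",
        "Modified files:",
        *[f"- {path}" for path in files],
        "",
        f"Confidence: {confidence}%",
        "Generated by: JIRA AI Fixer v1.0",
        "",
        "Co-authored-by: ai-fixer@company.com",
    ]
    return "\n".join(lines)
```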
### 3.4 Recommended Permissions

| User/Group | Base Product | Client Fork | AI Fork |
|------------|--------------|-------------|---------|
| ai-fixer-svc | READ | READ | WRITE |
| developers | WRITE | WRITE | READ |
| tech-leads | ADMIN | ADMIN | ADMIN |

---

## 4. Technology Stack

### 4.1 Backend
```yaml
Runtime: Python 3.11+
Framework: FastAPI
Async: asyncio + httpx
Queue: Redis 7+ with Bull Queue
Database: PostgreSQL 15+ (metadata, configurations, logs)
Vector DB: Qdrant 1.7+ (self-hosted)
Cache: Redis
```

### 4.2 Frontend (Admin Portal)
```yaml
Framework: React 18+
UI Kit: Tailwind CSS + shadcn/ui
State: React Query
Build: Vite
```

### 4.3 Infrastructure
```yaml
Container: Docker + Docker Compose
Orchestration: Docker Swarm or Kubernetes
CI/CD: Configurable (Bitbucket Pipelines, GitHub Actions, etc.)
Reverse Proxy: Traefik
SSL: Let's Encrypt
Monitoring: Prometheus + Grafana
Logs: ELK Stack or Loki
```

### 4.4 External Integrations
```yaml
LLM (Configurable):
  - Azure OpenAI (enterprise)
  - OpenAI API (standard)
  - OpenRouter (cost-effective)
  - Local Ollama (air-gapped)

Embeddings (Configurable):
  - Azure OpenAI text-embedding-3-large
  - OpenAI text-embedding-3-large
  - Local MiniLM-L6-v2

JIRA:
  API: REST API v2 (Server)
  Auth: Personal Access Token

Bitbucket:
  API: REST API 1.0 (Server)
  Auth: Personal Access Token
```

---

## 5. Security

### 5.1 Sensitive Data
```yaml
Source code:
  - Processed in memory, not persisted to disk
  - Embeddings stored in Qdrant (encrypted at-rest)
  - Sanitized logs (no complete code)

Credentials:
  - Environment variables or secrets manager
  - Automatic token rotation supported
  - Access audit log

LLM and Embeddings:
  - Configurable: Azure (compliance) or local (air-gapped)
  - No data used for training when using Azure OpenAI
```

### 5.2 Network
```yaml
Deployment:
  - Can be internal network only (not exposed to internet)
  - HTTPS/TLS 1.3 communication
  - Firewall: only JIRA and Bitbucket can access webhooks

Authentication:
  - Admin Portal: Token-based or SSO integration
  - API: JWT tokens with configurable expiration
  - Webhooks: HMAC-SHA256 signature verification
```

### 5.3 Compliance Options
```yaml
Options:
  - [ ] Data segregation by client/product
  - [ ] Complete audit trail (who, when, what)
  - [ ] Configurable log retention
  - [ ] 100% on-premise deployment option
  - [ ] Air-gapped deployment (local LLM + embeddings)
```

---

## 6. Deployment Options

### 6.1 SaaS (Hosted)
```yaml
Infrastructure: Managed by vendor
Updates: Automatic
Cost: Monthly subscription
Best for: Quick start, low maintenance
```

### 6.2 On-Premise
```yaml
Infrastructure: Customer's data center
Updates: Customer-controlled
Cost: License + internal infra
Best for: Compliance requirements, air-gapped
```

### 6.3 Hybrid
```yaml
Infrastructure: Customer hosts, vendor manages
Updates: Coordinated
Cost: License + reduced support
Best for: Balance of control and convenience
```

---

## 7. Estimates

### 7.1 Implementation Timeline

| Phase | Duration | Deliverables |
|-------|----------|--------------|
| **1. Initial Setup** | 1-2 weeks | Infra, repos, basic configuration |
| **2. Integration** | 1 week | JIRA webhook, Bitbucket connector |
| **3. Code Indexing** | 1-2 weeks | Repository indexing, embeddings |
| **4. Business Rules** | 1 week | Module configuration |
| **5. Testing** | 1 week | Validation with real issues |
| **Total** | **5-7 weeks** | |

### 7.2 Monthly Operational Costs (Estimate)

| Deployment | Infra | LLM APIs | Total |
|------------|-------|----------|-------|
| SaaS | Included | Included | $2,000-5,000/mo |
| On-Premise | Customer | ~$50-200/mo | License + infra |
| Air-gapped | Customer | $0 | License + infra |

---

## 8. Success Metrics

### 8.1 Technical KPIs

| Metric | MVP Target | 6-Month Target |
|--------|------------|----------------|
| Successful analysis rate | 80% | 95% |
| Accepted fixes (no modification) | 30% | 50% |
| Accepted fixes (with adjustments) | 50% | 70% |
| Average analysis time | < 5 min | < 2 min |
| System uptime | 95% | 99% |

### 8.2 Business KPIs

| Metric | Target |
|--------|--------|
| Initial analysis time reduction | 50% |
| Issues with useful suggestion | 70% |
| Team satisfaction | > 4/5 |

---

## 9. Getting Started

### 9.1 Prerequisites
- JIRA Server with webhook capability
- Bitbucket Server with API access
- Docker environment (SaaS) or Kubernetes (on-premise)

### 9.2 Quick Start
```bash
# Clone repository
git clone https://github.com/your-org/jira-ai-fixer.git
cd jira-ai-fixer

# Configure
cp .env.example .env
# Edit .env with your credentials

# Run
docker compose up -d

# Access portal
open https://localhost:8080
```

---

**JIRA AI Fixer - Intelligent Support Case Resolution**

*Contact: sales@yourcompany.com*

@ -0,0 +1,14 @@
<!DOCTYPE html>
<html lang="en" class="dark">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>JIRA AI Fixer</title>
  <link rel="icon" type="image/svg+xml" href="/favicon.svg">
  <script type="module" crossorigin src="/assets/index-CfAFg710.js"></script>
  <link rel="stylesheet" crossorigin href="/assets/index-4u66p920.css">
</head>
<body class="bg-gray-900 text-white">
  <div id="root"></div>
</body>
</html>

@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en" class="dark">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>JIRA AI Fixer</title>
  <link rel="icon" type="image/svg+xml" href="/favicon.svg">
</head>
<body class="bg-gray-900 text-white">
  <div id="root"></div>
  <script type="module" src="/src/main.jsx"></script>
</body>
</html>

@ -0,0 +1 @@
../autoprefixer/bin/autoprefixer
@ -0,0 +1 @@
../baseline-browser-mapping/dist/cli.js
@ -0,0 +1 @@
../browserslist/cli.js
@ -0,0 +1 @@
../cssesc/bin/cssesc
@ -0,0 +1 @@
../esbuild/bin/esbuild
@ -0,0 +1 @@
../jiti/bin/jiti.js
@ -0,0 +1 @@
../jsesc/bin/jsesc
@ -0,0 +1 @@
../json5/lib/cli.js
@ -0,0 +1 @@
../loose-envify/cli.js
@ -0,0 +1 @@
../nanoid/bin/nanoid.cjs
@ -0,0 +1 @@
../@babel/parser/bin/babel-parser.js
@ -0,0 +1 @@
../resolve/bin/resolve
@ -0,0 +1 @@
../rollup/dist/bin/rollup
@ -0,0 +1 @@
../semver/bin/semver.js
@ -0,0 +1 @@
../sucrase/bin/sucrase
@ -0,0 +1 @@
../sucrase/bin/sucrase-node
@ -0,0 +1 @@
../tailwindcss/lib/cli.js
@ -0,0 +1 @@
../tailwindcss/lib/cli.js
@ -0,0 +1 @@
../update-browserslist-db/cli.js
@ -0,0 +1 @@
../vite/bin/vite.js
File diff suppressed because it is too large
@@ -0,0 +1,128 @@
declare namespace QuickLRU {
	interface Options<KeyType, ValueType> {
		/**
		The maximum number of milliseconds an item should remain in the cache.

		@default Infinity

		By default, `maxAge` will be `Infinity`, which means that items will never expire.
		Lazy expiration upon the next write or read call.

		Individual expiration of an item can be specified by the `set(key, value, maxAge)` method.
		*/
		readonly maxAge?: number;

		/**
		The maximum number of items before evicting the least recently used items.
		*/
		readonly maxSize: number;

		/**
		Called right before an item is evicted from the cache.

		Useful for side effects or for items like object URLs that need explicit cleanup (`revokeObjectURL`).
		*/
		onEviction?: (key: KeyType, value: ValueType) => void;
	}
}

declare class QuickLRU<KeyType, ValueType>
	implements Iterable<[KeyType, ValueType]> {
	/**
	The stored item count.
	*/
	readonly size: number;

	/**
	Simple ["Least Recently Used" (LRU) cache](https://en.m.wikipedia.org/wiki/Cache_replacement_policies#Least_Recently_Used_.28LRU.29).

	The instance is [`iterable`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Iteration_protocols) so you can use it directly in a [`for…of`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Statements/for...of) loop.

	@example
	```
	import QuickLRU = require('quick-lru');

	const lru = new QuickLRU({maxSize: 1000});

	lru.set('🦄', '🌈');

	lru.has('🦄');
	//=> true

	lru.get('🦄');
	//=> '🌈'
	```
	*/
	constructor(options: QuickLRU.Options<KeyType, ValueType>);

	[Symbol.iterator](): IterableIterator<[KeyType, ValueType]>;

	/**
	Set an item. Returns the instance.

	Individual expiration of an item can be specified with the `maxAge` option. If not specified, the global `maxAge` value will be used in case it is specified in the constructor, otherwise the item will never expire.

	@returns The list instance.
	*/
	set(key: KeyType, value: ValueType, options?: {maxAge?: number}): this;

	/**
	Get an item.

	@returns The stored item or `undefined`.
	*/
	get(key: KeyType): ValueType | undefined;

	/**
	Check if an item exists.
	*/
	has(key: KeyType): boolean;

	/**
	Get an item without marking it as recently used.

	@returns The stored item or `undefined`.
	*/
	peek(key: KeyType): ValueType | undefined;

	/**
	Delete an item.

	@returns `true` if the item is removed or `false` if the item doesn't exist.
	*/
	delete(key: KeyType): boolean;

	/**
	Delete all items.
	*/
	clear(): void;

	/**
	Update the `maxSize` in-place, discarding items as necessary. Insertion order is mostly preserved, though this is not a strong guarantee.

	Useful for on-the-fly tuning of cache sizes in live systems.
	*/
	resize(maxSize: number): void;

	/**
	Iterable for all the keys.
	*/
	keys(): IterableIterator<KeyType>;

	/**
	Iterable for all the values.
	*/
	values(): IterableIterator<ValueType>;

	/**
	Iterable for all entries, starting with the oldest (ascending in recency).
	*/
	entriesAscending(): IterableIterator<[KeyType, ValueType]>;

	/**
	Iterable for all entries, starting with the newest (descending in recency).
	*/
	entriesDescending(): IterableIterator<[KeyType, ValueType]>;
}

export = QuickLRU;
@@ -0,0 +1,263 @@
'use strict';

class QuickLRU {
	constructor(options = {}) {
		if (!(options.maxSize && options.maxSize > 0)) {
			throw new TypeError('`maxSize` must be a number greater than 0');
		}

		if (typeof options.maxAge === 'number' && options.maxAge === 0) {
			throw new TypeError('`maxAge` must be a number greater than 0');
		}

		this.maxSize = options.maxSize;
		this.maxAge = options.maxAge || Infinity;
		this.onEviction = options.onEviction;
		this.cache = new Map();
		this.oldCache = new Map();
		this._size = 0;
	}

	_emitEvictions(cache) {
		if (typeof this.onEviction !== 'function') {
			return;
		}

		for (const [key, item] of cache) {
			this.onEviction(key, item.value);
		}
	}

	_deleteIfExpired(key, item) {
		if (typeof item.expiry === 'number' && item.expiry <= Date.now()) {
			if (typeof this.onEviction === 'function') {
				this.onEviction(key, item.value);
			}

			return this.delete(key);
		}

		return false;
	}

	_getOrDeleteIfExpired(key, item) {
		const deleted = this._deleteIfExpired(key, item);
		if (deleted === false) {
			return item.value;
		}
	}

	_getItemValue(key, item) {
		return item.expiry ? this._getOrDeleteIfExpired(key, item) : item.value;
	}

	_peek(key, cache) {
		const item = cache.get(key);

		return this._getItemValue(key, item);
	}

	_set(key, value) {
		this.cache.set(key, value);
		this._size++;

		if (this._size >= this.maxSize) {
			this._size = 0;
			this._emitEvictions(this.oldCache);
			this.oldCache = this.cache;
			this.cache = new Map();
		}
	}

	_moveToRecent(key, item) {
		this.oldCache.delete(key);
		this._set(key, item);
	}

	* _entriesAscending() {
		for (const item of this.oldCache) {
			const [key, value] = item;
			if (!this.cache.has(key)) {
				const deleted = this._deleteIfExpired(key, value);
				if (deleted === false) {
					yield item;
				}
			}
		}

		for (const item of this.cache) {
			const [key, value] = item;
			const deleted = this._deleteIfExpired(key, value);
			if (deleted === false) {
				yield item;
			}
		}
	}

	get(key) {
		if (this.cache.has(key)) {
			const item = this.cache.get(key);

			return this._getItemValue(key, item);
		}

		if (this.oldCache.has(key)) {
			const item = this.oldCache.get(key);
			if (this._deleteIfExpired(key, item) === false) {
				this._moveToRecent(key, item);
				return item.value;
			}
		}
	}

	set(key, value, {maxAge = this.maxAge === Infinity ? undefined : Date.now() + this.maxAge} = {}) {
		if (this.cache.has(key)) {
			this.cache.set(key, {
				value,
				maxAge
			});
		} else {
			this._set(key, {value, expiry: maxAge});
		}
	}

	has(key) {
		if (this.cache.has(key)) {
			return !this._deleteIfExpired(key, this.cache.get(key));
		}

		if (this.oldCache.has(key)) {
			return !this._deleteIfExpired(key, this.oldCache.get(key));
		}

		return false;
	}

	peek(key) {
		if (this.cache.has(key)) {
			return this._peek(key, this.cache);
		}

		if (this.oldCache.has(key)) {
			return this._peek(key, this.oldCache);
		}
	}

	delete(key) {
		const deleted = this.cache.delete(key);
		if (deleted) {
			this._size--;
		}

		return this.oldCache.delete(key) || deleted;
	}

	clear() {
		this.cache.clear();
		this.oldCache.clear();
		this._size = 0;
	}

	resize(newSize) {
		if (!(newSize && newSize > 0)) {
			throw new TypeError('`maxSize` must be a number greater than 0');
		}

		const items = [...this._entriesAscending()];
		const removeCount = items.length - newSize;
		if (removeCount < 0) {
			this.cache = new Map(items);
			this.oldCache = new Map();
			this._size = items.length;
		} else {
			if (removeCount > 0) {
				this._emitEvictions(items.slice(0, removeCount));
			}

			this.oldCache = new Map(items.slice(removeCount));
			this.cache = new Map();
			this._size = 0;
		}

		this.maxSize = newSize;
	}

	* keys() {
		for (const [key] of this) {
			yield key;
		}
	}

	* values() {
		for (const [, value] of this) {
			yield value;
		}
	}

	* [Symbol.iterator]() {
		for (const item of this.cache) {
			const [key, value] = item;
			const deleted = this._deleteIfExpired(key, value);
			if (deleted === false) {
				yield [key, value.value];
			}
		}

		for (const item of this.oldCache) {
			const [key, value] = item;
			if (!this.cache.has(key)) {
				const deleted = this._deleteIfExpired(key, value);
				if (deleted === false) {
					yield [key, value.value];
				}
			}
		}
	}

	* entriesDescending() {
		let items = [...this.cache];
		for (let i = items.length - 1; i >= 0; --i) {
			const item = items[i];
			const [key, value] = item;
			const deleted = this._deleteIfExpired(key, value);
			if (deleted === false) {
				yield [key, value.value];
			}
		}

		items = [...this.oldCache];
		for (let i = items.length - 1; i >= 0; --i) {
			const item = items[i];
			const [key, value] = item;
			if (!this.cache.has(key)) {
				const deleted = this._deleteIfExpired(key, value);
				if (deleted === false) {
					yield [key, value.value];
				}
			}
		}
	}

	* entriesAscending() {
		for (const [key, value] of this._entriesAscending()) {
			yield [key, value.value];
		}
	}

	get size() {
		if (!this._size) {
			return this.oldCache.size;
		}

		let oldCacheSize = 0;
		for (const key of this.oldCache.keys()) {
			if (!this.cache.has(key)) {
				oldCacheSize++;
			}
		}

		return Math.min(this._size + oldCacheSize, this.maxSize);
	}
}

module.exports = QuickLRU;
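The two-Map generation scheme used above (`cache` rotating into `oldCache` inside `_set`) can be sketched in isolation. `TwoMapLRU` below is a hypothetical, deliberately simplified stand-in with no `maxAge` or `onEviction` support, not the package's API:

```javascript
// Hypothetical, simplified sketch of the two-Map strategy used above:
// when `cache` reaches maxSize it becomes `oldCache` and a fresh Map
// takes its place, so eviction drops an entire old generation instead
// of tracking per-item recency.
class TwoMapLRU {
	constructor(maxSize) {
		this.maxSize = maxSize;
		this.cache = new Map();
		this.oldCache = new Map();
	}

	set(key, value) {
		this.cache.set(key, value);
		if (this.cache.size >= this.maxSize) {
			this.oldCache = this.cache; // old generation: next eviction candidates
			this.cache = new Map();
		}
	}

	get(key) {
		if (this.cache.has(key)) {
			return this.cache.get(key);
		}

		if (this.oldCache.has(key)) {
			const value = this.oldCache.get(key);
			this.oldCache.delete(key);
			this.set(key, value); // promote a recently used item
			return value;
		}
	}
}

const lru = new TwoMapLRU(2);
lru.set('a', 1);
lru.set('b', 2); // cache is now "full": the generation rotates
lru.set('c', 3);
console.log(lru.get('a')); // 'a' survived in the old generation
```

Reads out of `oldCache` re-insert into `cache`, which is why recently used items survive the next rotation while untouched ones are discarded wholesale.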
@@ -0,0 +1,9 @@
MIT License

Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,43 @@
{
	"name": "@alloc/quick-lru",
	"version": "5.2.0",
	"description": "Simple “Least Recently Used” (LRU) cache",
	"license": "MIT",
	"repository": "sindresorhus/quick-lru",
	"funding": "https://github.com/sponsors/sindresorhus",
	"author": {
		"name": "Sindre Sorhus",
		"email": "sindresorhus@gmail.com",
		"url": "https://sindresorhus.com"
	},
	"engines": {
		"node": ">=10"
	},
	"scripts": {
		"test": "xo && nyc ava && tsd"
	},
	"files": [
		"index.js",
		"index.d.ts"
	],
	"keywords": [
		"lru",
		"quick",
		"cache",
		"caching",
		"least",
		"recently",
		"used",
		"fast",
		"map",
		"hash",
		"buffer"
	],
	"devDependencies": {
		"ava": "^2.0.0",
		"coveralls": "^3.0.3",
		"nyc": "^15.0.0",
		"tsd": "^0.11.0",
		"xo": "^0.26.0"
	}
}
@@ -0,0 +1,139 @@
# quick-lru [](https://travis-ci.org/sindresorhus/quick-lru) [](https://coveralls.io/github/sindresorhus/quick-lru?branch=master)

> Simple [“Least Recently Used” (LRU) cache](https://en.m.wikipedia.org/wiki/Cache_replacement_policies#Least_Recently_Used_.28LRU.29)

Useful when you need to cache something and limit memory usage.

Inspired by the [`hashlru` algorithm](https://github.com/dominictarr/hashlru#algorithm), but instead uses [`Map`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Map) to support keys of any type, not just strings, and values can be `undefined`.

## Install

```
$ npm install quick-lru
```

## Usage

```js
const QuickLRU = require('quick-lru');

const lru = new QuickLRU({maxSize: 1000});

lru.set('🦄', '🌈');

lru.has('🦄');
//=> true

lru.get('🦄');
//=> '🌈'
```

## API

### new QuickLRU(options?)

Returns a new instance.

### options

Type: `object`

#### maxSize

*Required*\
Type: `number`

The maximum number of items before evicting the least recently used items.

#### maxAge

Type: `number`\
Default: `Infinity`

The maximum number of milliseconds an item should remain in the cache.
By default, `maxAge` will be `Infinity`, which means that items will never expire.

Lazy expiration happens upon the next `write` or `read` call.

Individual expiration of an item can be specified by the `set(key, value, options)` method.

#### onEviction

*Optional*\
Type: `(key, value) => void`

Called right before an item is evicted from the cache.

Useful for side effects or for items like object URLs that need explicit cleanup (`revokeObjectURL`).

### Instance

The instance is [`iterable`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Iteration_protocols) so you can use it directly in a [`for…of`](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Statements/for...of) loop.

Both `key` and `value` can be of any type.

#### .set(key, value, options?)

Set an item. Returns the instance.

Individual expiration of an item can be specified with the `maxAge` option. If not specified, the global `maxAge` value will be used in case it is specified on the constructor, otherwise the item will never expire.

#### .get(key)

Get an item.

#### .has(key)

Check if an item exists.

#### .peek(key)

Get an item without marking it as recently used.

#### .delete(key)

Delete an item.

Returns `true` if the item is removed or `false` if the item doesn't exist.

#### .clear()

Delete all items.

#### .resize(maxSize)

Update the `maxSize`, discarding items as necessary. Insertion order is mostly preserved, though this is not a strong guarantee.

Useful for on-the-fly tuning of cache sizes in live systems.

#### .keys()

Iterable for all the keys.

#### .values()

Iterable for all the values.

#### .entriesAscending()

Iterable for all entries, starting with the oldest (ascending in recency).

#### .entriesDescending()

Iterable for all entries, starting with the newest (descending in recency).

#### .size

The stored item count.

---

<div align="center">
	<b>
		<a href="https://tidelift.com/subscription/pkg/npm-quick-lru?utm_source=npm-quick-lru&utm_medium=referral&utm_campaign=readme">Get professional support for this package with a Tidelift subscription</a>
	</b>
	<br>
	<sub>
		Tidelift helps make open source sustainable for maintainers while giving companies<br>assurances about security, maintenance, and licensing for their dependencies.
	</sub>
</div>
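The lazy `maxAge` expiration described in the README above can be illustrated with a small, self-contained sketch. This is an assumption-laden stand-in, not the package's implementation: each entry carries an absolute expiry timestamp, and an expired entry is removed only when it is next read, with no timers involved.

```javascript
// Sketch of lazy expiration: entries store an absolute expiry
// timestamp; expired entries are evicted on the next read.
const store = new Map();

function set(key, value, maxAge = Infinity) {
	const expiry = maxAge === Infinity ? undefined : Date.now() + maxAge;
	store.set(key, {value, expiry});
}

function get(key) {
	const item = store.get(key);
	if (item === undefined) {
		return undefined;
	}

	if (typeof item.expiry === 'number' && item.expiry <= Date.now()) {
		store.delete(key); // lazily evict on read
		return undefined;
	}

	return item.value;
}

set('session', 'abc', 1000); // expires one second from now
set('id', 42); // never expires
console.log(get('id')); // → 42
```

Because nothing runs in the background, a cache full of expired entries still occupies memory until those keys are touched again; that is the trade-off lazy expiration makes for zero timer overhead.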
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,19 @@
# @babel/code-frame

> Generate errors that contain a code frame that point to source locations.

See our website [@babel/code-frame](https://babeljs.io/docs/babel-code-frame) for more information.

## Install

Using npm:

```sh
npm install --save-dev @babel/code-frame
```

or using yarn:

```sh
yarn add @babel/code-frame --dev
```
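The frames this package prints can be pictured with a tiny stand-in. `miniCodeFrame` below is hypothetical and not part of `@babel/code-frame`; it only mimics the plain (uncolored) gutter-and-caret layout that `codeFrameColumns` produces:

```javascript
// Hypothetical miniature of a "code frame": a gutter of line numbers,
// a `>` on the flagged line, and a `^` caret under the flagged column.
function miniCodeFrame(source, line, column) {
	const lines = source.split('\n');
	const width = String(lines.length).length; // gutter width for line numbers
	return lines
		.map((text, i) => {
			const n = i + 1;
			const gutter = `${String(n).padStart(width)} |`;
			if (n !== line) {
				return `  ${gutter} ${text}`;
			}
			// caret row points at the reported column (1-based)
			const marker = `  ${' '.repeat(width)} | ${' '.repeat(column - 1)}^`;
			return `> ${gutter} ${text}\n${marker}`;
		})
		.join('\n');
}

console.log(miniCodeFrame('const a = 1;\nconst b = ;', 2, 11));
```

The real implementation additionally handles multi-line spans, context windows, messages, and syntax highlighting, as the `lib/index.js` source in this diff shows.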
@@ -0,0 +1,217 @@
'use strict';

Object.defineProperty(exports, '__esModule', { value: true });

var picocolors = require('picocolors');
var jsTokens = require('js-tokens');
var helperValidatorIdentifier = require('@babel/helper-validator-identifier');

function isColorSupported() {
  return (
    typeof process === "object" && (process.env.FORCE_COLOR === "0" || process.env.FORCE_COLOR === "false") ? false : picocolors.isColorSupported
  );
}
const compose = (f, g) => v => f(g(v));
function buildDefs(colors) {
  return {
    keyword: colors.cyan,
    capitalized: colors.yellow,
    jsxIdentifier: colors.yellow,
    punctuator: colors.yellow,
    number: colors.magenta,
    string: colors.green,
    regex: colors.magenta,
    comment: colors.gray,
    invalid: compose(compose(colors.white, colors.bgRed), colors.bold),
    gutter: colors.gray,
    marker: compose(colors.red, colors.bold),
    message: compose(colors.red, colors.bold),
    reset: colors.reset
  };
}
const defsOn = buildDefs(picocolors.createColors(true));
const defsOff = buildDefs(picocolors.createColors(false));
function getDefs(enabled) {
  return enabled ? defsOn : defsOff;
}

const sometimesKeywords = new Set(["as", "async", "from", "get", "of", "set"]);
const NEWLINE$1 = /\r\n|[\n\r\u2028\u2029]/;
const BRACKET = /^[()[\]{}]$/;
let tokenize;
const JSX_TAG = /^[a-z][\w-]*$/i;
const getTokenType = function (token, offset, text) {
  if (token.type === "name") {
    const tokenValue = token.value;
    if (helperValidatorIdentifier.isKeyword(tokenValue) || helperValidatorIdentifier.isStrictReservedWord(tokenValue, true) || sometimesKeywords.has(tokenValue)) {
      return "keyword";
    }
    if (JSX_TAG.test(tokenValue) && (text[offset - 1] === "<" || text.slice(offset - 2, offset) === "</")) {
      return "jsxIdentifier";
    }
    const firstChar = String.fromCodePoint(tokenValue.codePointAt(0));
    if (firstChar !== firstChar.toLowerCase()) {
      return "capitalized";
    }
  }
  if (token.type === "punctuator" && BRACKET.test(token.value)) {
    return "bracket";
  }
  if (token.type === "invalid" && (token.value === "@" || token.value === "#")) {
    return "punctuator";
  }
  return token.type;
};
tokenize = function* (text) {
  let match;
  while (match = jsTokens.default.exec(text)) {
    const token = jsTokens.matchToToken(match);
    yield {
      type: getTokenType(token, match.index, text),
      value: token.value
    };
  }
};
function highlight(text) {
  if (text === "") return "";
  const defs = getDefs(true);
  let highlighted = "";
  for (const {
    type,
    value
  } of tokenize(text)) {
    if (type in defs) {
      highlighted += value.split(NEWLINE$1).map(str => defs[type](str)).join("\n");
    } else {
      highlighted += value;
    }
  }
  return highlighted;
}

let deprecationWarningShown = false;
const NEWLINE = /\r\n|[\n\r\u2028\u2029]/;
function getMarkerLines(loc, source, opts, startLineBaseZero) {
  const startLoc = Object.assign({
    column: 0,
    line: -1
  }, loc.start);
  const endLoc = Object.assign({}, startLoc, loc.end);
  const {
    linesAbove = 2,
    linesBelow = 3
  } = opts || {};
  const startLine = startLoc.line - startLineBaseZero;
  const startColumn = startLoc.column;
  const endLine = endLoc.line - startLineBaseZero;
  const endColumn = endLoc.column;
  let start = Math.max(startLine - (linesAbove + 1), 0);
  let end = Math.min(source.length, endLine + linesBelow);
  if (startLine === -1) {
    start = 0;
  }
  if (endLine === -1) {
    end = source.length;
  }
  const lineDiff = endLine - startLine;
  const markerLines = {};
  if (lineDiff) {
    for (let i = 0; i <= lineDiff; i++) {
      const lineNumber = i + startLine;
      if (!startColumn) {
        markerLines[lineNumber] = true;
      } else if (i === 0) {
        const sourceLength = source[lineNumber - 1].length;
        markerLines[lineNumber] = [startColumn, sourceLength - startColumn + 1];
      } else if (i === lineDiff) {
        markerLines[lineNumber] = [0, endColumn];
      } else {
        const sourceLength = source[lineNumber - i].length;
        markerLines[lineNumber] = [0, sourceLength];
      }
    }
  } else {
    if (startColumn === endColumn) {
      if (startColumn) {
        markerLines[startLine] = [startColumn, 0];
      } else {
        markerLines[startLine] = true;
      }
    } else {
      markerLines[startLine] = [startColumn, endColumn - startColumn];
    }
  }
  return {
    start,
    end,
    markerLines
  };
}
function codeFrameColumns(rawLines, loc, opts = {}) {
  const shouldHighlight = opts.forceColor || isColorSupported() && opts.highlightCode;
  const startLineBaseZero = (opts.startLine || 1) - 1;
  const defs = getDefs(shouldHighlight);
  const lines = rawLines.split(NEWLINE);
  const {
    start,
    end,
    markerLines
  } = getMarkerLines(loc, lines, opts, startLineBaseZero);
  const hasColumns = loc.start && typeof loc.start.column === "number";
  const numberMaxWidth = String(end + startLineBaseZero).length;
  const highlightedLines = shouldHighlight ? highlight(rawLines) : rawLines;
  let frame = highlightedLines.split(NEWLINE, end).slice(start, end).map((line, index) => {
    const number = start + 1 + index;
    const paddedNumber = ` ${number + startLineBaseZero}`.slice(-numberMaxWidth);
    const gutter = ` ${paddedNumber} |`;
    const hasMarker = markerLines[number];
    const lastMarkerLine = !markerLines[number + 1];
    if (hasMarker) {
      let markerLine = "";
      if (Array.isArray(hasMarker)) {
        const markerSpacing = line.slice(0, Math.max(hasMarker[0] - 1, 0)).replace(/[^\t]/g, " ");
        const numberOfMarkers = hasMarker[1] || 1;
        markerLine = ["\n ", defs.gutter(gutter.replace(/\d/g, " ")), " ", markerSpacing, defs.marker("^").repeat(numberOfMarkers)].join("");
        if (lastMarkerLine && opts.message) {
          markerLine += " " + defs.message(opts.message);
        }
      }
      return [defs.marker(">"), defs.gutter(gutter), line.length > 0 ? ` ${line}` : "", markerLine].join("");
    } else {
      return ` ${defs.gutter(gutter)}${line.length > 0 ? ` ${line}` : ""}`;
    }
  }).join("\n");
  if (opts.message && !hasColumns) {
    frame = `${" ".repeat(numberMaxWidth + 1)}${opts.message}\n${frame}`;
  }
  if (shouldHighlight) {
    return defs.reset(frame);
  } else {
    return frame;
  }
}
function index (rawLines, lineNumber, colNumber, opts = {}) {
  if (!deprecationWarningShown) {
    deprecationWarningShown = true;
    const message = "Passing lineNumber and colNumber is deprecated to @babel/code-frame. Please use `codeFrameColumns`.";
    if (process.emitWarning) {
      process.emitWarning(message, "DeprecationWarning");
    } else {
      const deprecationError = new Error(message);
      deprecationError.name = "DeprecationWarning";
      console.warn(new Error(message));
    }
  }
  colNumber = Math.max(colNumber, 0);
  const location = {
    start: {
      column: colNumber,
      line: lineNumber
    }
  };
  return codeFrameColumns(rawLines, location, opts);
}

exports.codeFrameColumns = codeFrameColumns;
exports.default = index;
exports.highlight = highlight;
//# sourceMappingURL=index.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,32 @@
{
  "name": "@babel/code-frame",
  "version": "7.29.0",
  "description": "Generate errors that contain a code frame that point to source locations.",
  "author": "The Babel Team (https://babel.dev/team)",
  "homepage": "https://babel.dev/docs/en/next/babel-code-frame",
  "bugs": "https://github.com/babel/babel/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen",
  "license": "MIT",
  "publishConfig": {
    "access": "public"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/babel/babel.git",
    "directory": "packages/babel-code-frame"
  },
  "main": "./lib/index.js",
  "dependencies": {
    "@babel/helper-validator-identifier": "^7.28.5",
    "js-tokens": "^4.0.0",
    "picocolors": "^1.1.1"
  },
  "devDependencies": {
    "charcodes": "^0.2.0",
    "import-meta-resolve": "^4.1.0",
    "strip-ansi": "^4.0.0"
  },
  "engines": {
    "node": ">=6.9.0"
  },
  "type": "commonjs"
}
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,19 @@
# @babel/compat-data

> The compat-data to determine required Babel plugins

See our website [@babel/compat-data](https://babeljs.io/docs/babel-compat-data) for more information.

## Install

Using npm:

```sh
npm install --save @babel/compat-data
```

or using yarn:

```sh
yarn add @babel/compat-data
```
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file as Babel 8 drop support of core-js 2
module.exports = require("./data/corejs2-built-ins.json");
2 frontend/node_modules/@babel/compat-data/corejs3-shipped-proposals.js generated vendored Normal file
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file now that it is included in babel-plugin-polyfill-corejs3
module.exports = require("./data/corejs3-shipped-proposals.json");
2106 frontend/node_modules/@babel/compat-data/data/corejs2-built-ins.json generated vendored Normal file
File diff suppressed because it is too large
5 frontend/node_modules/@babel/compat-data/data/corejs3-shipped-proposals.json generated vendored Normal file
@@ -0,0 +1,5 @@
[
  "esnext.promise.all-settled",
  "esnext.string.match-all",
  "esnext.global-this"
]
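The three-entry array above is the entire corejs3-shipped-proposals.json file. A sketch of how a consumer such as a polyfill plugin might use a list like this (the array is copied verbatim from the diff; `hasShipped` is a hypothetical helper, not an API of the package):

```javascript
// Copied verbatim from corejs3-shipped-proposals.json above.
const shippedProposals = [
  "esnext.promise.all-settled",
  "esnext.string.match-all",
  "esnext.global-this"
];

// Hypothetical helper: a plugin could check whether a core-js proposal
// module has shipped and therefore no longer needs the proposals flag.
const hasShipped = (name) => shippedProposals.includes(name);

console.log(hasShipped("esnext.global-this"));
```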
Some files were not shown because too many files have changed in this diff.