mirror of
https://github.com/rcourtman/Pulse.git
synced 2026-02-18 00:17:39 +01:00
AI Problem Solver implementation and various fixes
- Implement 'Show Problems Only' toggle combining degraded status, high CPU/memory alerts, and needs-backup filters
- Add 'Investigate with AI' button to filter bar for problematic guests
- Fix dashboard column sizing inconsistencies between bars and sparklines view modes
- Fix PBS backups display and polling
- Refine AI prompt for general-purpose usage
- Fix frontend flickering and reload loops during initial load
- Integrate persistent SQLite metrics store with Monitor
- Fortify AI command routing with improved validation and logging
- Fix CSRF token handling for note deletion
- Debug and fix AI command execution issues
- Various AI reliability improvements and command safety enhancements
.gemini/tasks/persistent-metrics-storage.md (new file, 182 lines)
@@ -0,0 +1,182 @@
# Task: Persistent Metrics Storage for Sparklines

## Problem

Currently, metrics history for sparklines is stored **in-memory only**. When the Pulse backend restarts, all historical metrics are lost. Users expect to see historical trends even after being away for days.
## Goal

Implement SQLite-based persistent metrics storage that:

- Survives backend restarts
- Provides historical data for sparklines/trends view
- Supports configurable retention periods
- Minimizes disk I/O and storage footprint
## Architecture

### Storage Tiers (Data Rollup)

```
┌─────────────────────────────────────────────────────────┐
│ RAW    (5s intervals) → Keep 2 hours  → ~1,440 pts      │
│ MINUTE (1min avg)     → Keep 24 hours → ~1,440 pts      │
│ HOURLY (1hr avg)      → Keep 7 days   → ~168 pts        │
│ DAILY  (1day avg)     → Keep 90 days  → ~90 pts         │
└─────────────────────────────────────────────────────────┘
```
### Database Schema

```sql
-- Main metrics table (partitioned by time for efficient pruning)
CREATE TABLE metrics (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    resource_type TEXT NOT NULL,  -- 'node', 'vm', 'container', 'storage'
    resource_id TEXT NOT NULL,
    metric_type TEXT NOT NULL,    -- 'cpu', 'memory', 'disk'
    value REAL NOT NULL,
    timestamp INTEGER NOT NULL,   -- Unix timestamp in seconds
    tier TEXT DEFAULT 'raw'       -- 'raw', 'minute', 'hourly', 'daily'
);

-- Indexes for efficient queries
CREATE INDEX idx_metrics_lookup ON metrics(resource_type, resource_id, metric_type, tier, timestamp);
CREATE INDEX idx_metrics_timestamp ON metrics(timestamp);
CREATE INDEX idx_metrics_tier_time ON metrics(tier, timestamp);
```
### Configuration

```yaml
metrics:
  enabled: true
  database_path: "${PULSE_DATA_DIR}/metrics.db"
  retention:
    raw: 2h       # 2 hours of raw data
    minute: 24h   # 24 hours of 1-minute averages
    hourly: 168h  # 7 days of hourly averages
    daily: 2160h  # 90 days of daily averages
  write_buffer: 100    # Buffer size before batch write
  rollup_interval: 5m  # How often to run rollup job
```
## Implementation Steps

### Phase 1: SQLite Foundation ✅ COMPLETED

- [x] Add SQLite dependency (`modernc.org/sqlite` - pure Go, no CGO)
- [x] Create `internal/metrics/store.go` with:
  - `Store` struct
  - `NewStore(config StoreConfig) (*Store, error)`
  - `Close() error`
  - Schema auto-migration on startup
### Phase 2: Write Path ✅ COMPLETED

- [x] Create `Write(resourceType, resourceID, metricType string, value float64, timestamp time.Time)`
- [x] Implement write buffering (batch inserts every 100 records or 5 seconds)
- [x] Integrate with existing `AddGuestMetric`, `AddNodeMetric` calls in monitor.go and monitor_polling.go
- [x] Add graceful shutdown to flush buffer
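The write-buffering item above (batch every 100 records or 5 seconds) can be sketched as a small size-triggered batcher. This is a simplified illustration — the real store flushes to SQLite, also flushes on a timer, and flushes at shutdown; the type and field names here are hypothetical:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// metricRecord is one buffered sample (illustrative shape).
type metricRecord struct {
	ResourceType, ResourceID, MetricType string
	Value                                float64
	Timestamp                            time.Time
}

// bufferedWriter accumulates records and flushes them as one batch
// once maxBatch records are buffered. The flush callback stands in
// for a multi-row SQLite INSERT inside a transaction.
type bufferedWriter struct {
	mu       sync.Mutex
	buf      []metricRecord
	maxBatch int
	flush    func([]metricRecord)
}

func (w *bufferedWriter) Write(r metricRecord) {
	w.mu.Lock()
	defer w.mu.Unlock()
	w.buf = append(w.buf, r)
	if len(w.buf) >= w.maxBatch {
		w.flush(w.buf)
		w.buf = nil
	}
}

func main() {
	var flushed int
	w := &bufferedWriter{maxBatch: 3, flush: func(batch []metricRecord) { flushed += len(batch) }}
	for i := 0; i < 7; i++ {
		w.Write(metricRecord{ResourceType: "vm", ResourceID: "101", MetricType: "cpu", Value: 0.5, Timestamp: time.Now()})
	}
	fmt.Println(flushed) // 6 - one record stays buffered until the next flush trigger
}
```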
### Phase 3: Read Path ✅ COMPLETED

- [x] Create `Query(resourceType, resourceID, metricType string, start, end time.Time) ([]MetricPoint, error)`
- [x] Auto-select appropriate tier based on time range:
  - < 2 hours → raw data
  - 2-24 hours → minute data
  - 1-7 days → hourly data
  - 7+ days → daily data
- [x] Add `/api/metrics-store/stats` endpoint for monitoring
### Phase 4: Rollup & Retention ✅ COMPLETED

- [x] Create background rollup job:
  - Runs every 5 minutes
  - Aggregates raw → minute (AVG, MIN, MAX)
  - Aggregates minute → hourly
  - Aggregates hourly → daily
- [x] Create retention pruning job:
  - Runs every hour
  - Deletes data older than configured retention
- [x] Use SQLite transactions for atomic operations
### Phase 5: Integration

- [ ] Add configuration to `system.json` or new `metrics.json`
- [ ] Add Settings UI for metrics retention config
- [ ] Add database file size monitoring
- [ ] Add vacuum/optimize scheduled job (weekly)
## Files to Create/Modify

### New Files

```
internal/metrics/
├── store.go        # MetricsStore implementation
├── store_test.go   # Unit tests
├── rollup.go       # Rollup/aggregation logic
├── retention.go    # Retention/pruning logic
└── config.go       # Metrics configuration
```

### Files to Modify

```
internal/monitoring/monitor.go          # Initialize MetricsStore, call Write()
internal/monitoring/metrics_history.go  # Keep in-memory as cache, backed by SQLite
internal/api/router.go                  # Update handleCharts to query from store
internal/config/persistence.go          # Add metrics config persistence
```
## API Changes

### `/api/charts` Query Parameters

```
GET /api/charts?range=1h          # Last hour (raw/minute data)
GET /api/charts?range=24h         # Last 24 hours (minute data)
GET /api/charts?range=7d          # Last 7 days (hourly data)
GET /api/charts?range=30d         # Last 30 days (daily data)
GET /api/charts?start=...&end=... # Custom range
```

### Response Enhancement

```json
{
  "data": { ... },
  "nodeData": { ... },
  "stats": {
    "oldestDataTimestamp": 1699900000000,
    "tier": "hourly",
    "pointCount": 168
  }
}
```
## Performance Considerations

1. **Write Buffering**: Batch inserts to reduce I/O
2. **WAL Mode**: Enable SQLite WAL for concurrent reads/writes
3. **Prepared Statements**: Reuse for repeated queries
4. **Index Strategy**: Composite index on (resource_type, resource_id, metric_type, tier, timestamp)
5. **Connection Pooling**: Single connection with proper locking for SQLite
6. **Memory Mapping**: Use `PRAGMA mmap_size` for faster reads
## Storage Estimates

For a typical Pulse installation (5 nodes, 50 VMs, 20 containers, 10 storage):

- 85 resources × 3 metrics = 255 metric series
- Raw (2h at 5s): ~367,200 rows → ~10 MB
- Minute (24h): ~367,200 rows → ~40 MB
- Hourly (7d): ~42,840 rows → ~5 MB
- Daily (90d): ~22,950 rows → ~3 MB
- **Total: ~60-100 MB** for comprehensive historical data
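The row counts above follow from multiplying the 255 series by the points-per-tier figures in the tier table (1,440 raw points per series over 2 hours at 5s, 1,440 minute points over 24 hours, and so on); the arithmetic can be checked directly:

```go
package main

import "fmt"

func main() {
	series := (5 + 50 + 20 + 10) * 3 // 85 resources x 3 metrics = 255 series
	raw := series * (2 * 3600 / 5)   // 2h of 5s samples per series
	minute := series * (24 * 60)     // 24h of 1-minute averages
	hourly := series * (7 * 24)      // 7 days of hourly averages
	daily := series * 90             // 90 days of daily averages
	fmt.Println(raw, minute, hourly, daily) // 367200 367200 42840 22950
}
```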
## Testing Plan

1. Unit tests for store CRUD operations
2. Unit tests for rollup logic
3. Integration tests with mock monitor
4. Performance tests with 100+ resources
5. Restart resilience tests
## Rollout Plan

1. Implement as opt-in feature (disabled by default initially)
2. Add migration path from in-memory to SQLite
3. Test in dev environment for 1 week
4. Enable by default in next minor release
## Definition of Done

- [ ] SQLite metrics storage implemented
- [ ] Data survives backend restart
- [ ] Rollup/retention working correctly
- [ ] Charts endpoint serves historical data
- [ ] Documentation updated
- [ ] Settings UI for retention config
- [ ] Performance validated (no noticeable slowdown)
@@ -183,8 +183,8 @@ func runServer() {
 		Addr:              fmt.Sprintf("%s:%d", cfg.BackendHost, cfg.FrontendPort),
 		Handler:           router.Handler(),
 		ReadHeaderTimeout: 15 * time.Second,
-		WriteTimeout:      60 * time.Second, // Increased from 15s to 60s to support large JSON responses (e.g., mock data)
-		IdleTimeout:       60 * time.Second,
+		WriteTimeout:      0, // Disabled to support SSE/streaming - each handler manages its own deadline
+		IdleTimeout:       120 * time.Second,
 	}

 	// Start config watcher for .env file changes
@@ -6,6 +6,34 @@
     <meta name="theme-color" content="#000000" />
     <link rel="icon" type="image/svg+xml" href="/logo.svg" />
     <title>Pulse</title>
+    <!-- Inline critical styles to prevent flash of unstyled content -->
+    <style>
+      /* Apply theme immediately before any CSS loads */
+      html { background-color: #f3f4f6; } /* gray-100 */
+      html.dark { background-color: #111827; } /* gray-900 */
+      body { margin: 0; min-height: 100vh; }
+      #root { min-height: 100vh; }
+      /* Hide content until app is ready to prevent layout shift */
+      #root:empty {
+        display: flex;
+        align-items: center;
+        justify-content: center;
+        background-color: inherit;
+      }
+    </style>
+    <!-- Apply dark mode immediately from localStorage to prevent flash -->
+    <script>
+      (function() {
+        try {
+          var darkMode = localStorage.getItem('pulse_dark_mode');
+          var prefersDark = darkMode === 'true' ||
+            (darkMode === null && window.matchMedia('(prefers-color-scheme: dark)').matches);
+          if (prefersDark) {
+            document.documentElement.classList.add('dark');
+          }
+        } catch (e) {}
+      })();
+    </script>
   </head>
   <body>
     <noscript>You need to enable JavaScript to run this app.</noscript>
@@ -35,6 +35,8 @@ import { createTooltipSystem } from './components/shared/Tooltip';
 import type { State } from '@/types/api';
 import { ProxmoxIcon } from '@/components/icons/ProxmoxIcon';
 import { startMetricsSampler } from './stores/metricsSampler';
+import { seedFromBackend } from './stores/metricsHistory';
+import { getMetricsViewMode } from './stores/metricsViewMode';
 import BoxesIcon from 'lucide-solid/icons/boxes';
 import MonitorIcon from 'lucide-solid/icons/monitor';
 import BellIcon from 'lucide-solid/icons/bell';
@@ -213,6 +215,13 @@ function App() {
   // Start metrics sampler for sparklines
   onMount(() => {
     startMetricsSampler();
+
+    // If user already has sparklines mode enabled, seed historical data immediately
+    if (getMetricsViewMode() === 'sparklines') {
+      seedFromBackend('1h').catch(() => {
+        // Errors are already logged in seedFromBackend
+      });
+    }
   });

 let hasPreloadedRoutes = false;
@@ -253,23 +262,23 @@ function App() {
      pmg: [],
      replicationJobs: [],
      metrics: [],
      pveBackups: {
        backupTasks: [],
        storageBackups: [],
        guestSnapshots: [],
      },
      pbsBackups: [],
      pmgBackups: [],
      backups: {
        pve: {
          pveBackups: {
            backupTasks: [],
            storageBackups: [],
            guestSnapshots: [],
          },
          pbs: [],
          pmg: [],
        },
        performance: {

      pbsBackups: [],
      pmgBackups: [],
      backups: {
        pve: {
          backupTasks: [],
          storageBackups: [],
          guestSnapshots: [],
        },
        pbs: [],
        pmg: [],
      },
      performance: {
        apiCallDuration: {},
        lastPollDuration: 0,
        pollingStartTime: '',
@@ -500,9 +509,9 @@ function App() {

   // Detect legacy DISABLE_AUTH flag (now ignored) so we can surface a warning
   if (securityData.deprecatedDisableAuth === true) {
-    logger.warn(
-      '[App] Legacy DISABLE_AUTH flag detected; authentication remains enabled. Remove the flag and restart Pulse to silence this warning.',
-    );
+    logger.warn(
+      '[App] Legacy DISABLE_AUTH flag detected; authentication remains enabled. Remove the flag and restart Pulse to silence this warning.',
+    );
   }

   const authConfigured = securityData.hasAuthentication || false;
@@ -727,16 +736,19 @@ function App() {
 const RootLayout = (props: { children?: JSX.Element }) => {
   // Check AI settings on mount and setup keyboard shortcut
   onMount(() => {
-    // Check if AI is enabled
-    import('./api/ai').then(({ AIAPI }) => {
-      AIAPI.getSettings()
-        .then((settings) => {
-          aiChatStore.setEnabled(settings.enabled && settings.configured);
-        })
-        .catch(() => {
-          aiChatStore.setEnabled(false);
-        });
-    });
+    // Only check AI settings if already authenticated (not on login screen)
+    // Otherwise, the 401 response triggers a redirect loop
+    if (!needsAuth()) {
+      import('./api/ai').then(({ AIAPI }) => {
+        AIAPI.getSettings()
+          .then((settings) => {
+            aiChatStore.setEnabled(settings.enabled && settings.configured);
+          })
+          .catch(() => {
+            aiChatStore.setEnabled(false);
+          });
+      });
+    }

     // Keyboard shortcut: Cmd/Ctrl+K to toggle AI
     const handleKeyDown = (e: KeyboardEvent) => {
@@ -762,14 +774,18 @@ function App() {
     <Show
       when={!isLoading()}
       fallback={
-        <div class="min-h-screen flex items-center justify-center bg-gray-50 dark:bg-gray-900">
+        <div class="min-h-screen flex items-center justify-center bg-gray-100 dark:bg-gray-900">
           <div class="text-gray-600 dark:text-gray-400">Loading...</div>
         </div>
       }
     >
-      <Show when={!needsAuth()} fallback={<Login onLogin={handleLogin} />}>
+      <Show when={!needsAuth()} fallback={<Login onLogin={handleLogin} hasAuth={hasAuth()} />}>
         <ErrorBoundary>
-          <Show when={enhancedStore()} fallback={<div>Initializing...</div>}>
+          <Show when={enhancedStore()} fallback={
+            <div class="min-h-screen flex items-center justify-center bg-gray-100 dark:bg-gray-900">
+              <div class="text-gray-600 dark:text-gray-400">Initializing...</div>
+            </div>
+          }>
             <WebSocketContext.Provider value={enhancedStore()!}>
               <DarkModeContext.Provider value={darkMode}>
                 <SecurityWarning />
@@ -870,13 +886,12 @@ function ConnectionStatusBadge(props: {
 }) {
   return (
     <div
-      class={`group status text-xs rounded-full flex items-center justify-center transition-all duration-500 ease-in-out px-1.5 ${
-        props.connected()
-          ? 'connected bg-green-200 dark:bg-green-700 text-green-700 dark:text-green-300 min-w-6 h-6 group-hover:px-3'
-          : props.reconnecting()
-            ? 'reconnecting bg-yellow-200 dark:bg-yellow-700 text-yellow-700 dark:text-yellow-300 py-1'
-            : 'disconnected bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 min-w-6 h-6 group-hover:px-3'
-      } ${props.class ?? ''}`}
+      class={`group status text-xs rounded-full flex items-center justify-center transition-all duration-500 ease-in-out px-1.5 ${props.connected()
+        ? 'connected bg-green-200 dark:bg-green-700 text-green-700 dark:text-green-300 min-w-6 h-6 group-hover:px-3'
+        : props.reconnecting()
+          ? 'reconnecting bg-yellow-200 dark:bg-yellow-700 text-yellow-700 dark:text-yellow-300 py-1'
+          : 'disconnected bg-gray-200 dark:bg-gray-700 text-gray-700 dark:text-gray-300 min-w-6 h-6 group-hover:px-3'
+      } ${props.class ?? ''}`}
     >
       <Show when={props.reconnecting()}>
         <svg class="animate-spin h-3 w-3 flex-shrink-0" fill="none" viewBox="0 0 24 24">

@@ -902,11 +917,10 @@ function ConnectionStatusBadge(props: {
         <span class="h-2.5 w-2.5 rounded-full bg-gray-600 dark:bg-gray-400 flex-shrink-0"></span>
       </Show>
       <span
-        class={`whitespace-nowrap overflow-hidden transition-all duration-500 ${
-          props.connected() || (!props.connected() && !props.reconnecting())
-            ? 'max-w-0 group-hover:max-w-[100px] group-hover:ml-2 group-hover:mr-1 opacity-0 group-hover:opacity-100'
-            : 'max-w-[100px] ml-1 opacity-100'
-        }`}
+        class={`whitespace-nowrap overflow-hidden transition-all duration-500 ${props.connected() || (!props.connected() && !props.reconnecting())
+          ? 'max-w-0 group-hover:max-w-[100px] group-hover:ml-2 group-hover:mr-1 opacity-0 group-hover:opacity-100'
+          : 'max-w-[100px] ml-1 opacity-100'
+        }`}
       >
         {props.connected()
           ? 'Connected'
@@ -1241,12 +1255,12 @@ function AppLayout(props: {
   const baseClasses =
     'tab relative px-2 sm:px-3 py-1.5 text-xs sm:text-sm font-medium flex items-center gap-1 sm:gap-1.5 rounded-t border border-transparent transition-colors whitespace-nowrap cursor-pointer';

-  const className = () => {
-    if (isActive()) {
-      return `${baseClasses} bg-white dark:bg-gray-800 text-blue-600 dark:text-blue-400 border-gray-300 dark:border-gray-700 border-b border-b-white dark:border-b-gray-800 shadow-sm font-semibold`;
-    }
-    return `${baseClasses} text-gray-500 dark:text-gray-400 hover:text-gray-700 dark:hover:text-gray-300 hover:bg-gray-200/60 dark:hover:bg-gray-700/60`;
-  };
+  const className = () => {
+    if (isActive()) {
+      return `${baseClasses} bg-white dark:bg-gray-800 text-blue-600 dark:text-blue-400 border-gray-300 dark:border-gray-700 border-b border-b-white dark:border-b-gray-800 shadow-sm font-semibold`;
+    }
+    return `${baseClasses} text-gray-500 dark:text-gray-400 hover:text-gray-700 dark:hover:text-gray-300 hover:bg-gray-200/60 dark:hover:bg-gray-700/60`;
+  };

   return (
     <div
@@ -46,13 +46,127 @@ export class AIAPI {
    target_id: string;
    run_on_host: boolean;
    vmid?: string;
    target_host?: string; // Explicit host for command routing
  }): Promise<{ output: string; success: boolean; error?: string }> {
    // Ensure run_on_host is explicitly a boolean (not undefined)
    const sanitizedRequest = {
      command: request.command,
      target_type: request.target_type,
      target_id: request.target_id,
      run_on_host: Boolean(request.run_on_host),
      ...(request.vmid ? { vmid: String(request.vmid) } : {}),
      ...(request.target_host ? { target_host: request.target_host } : {}),
    };
    const body = JSON.stringify(sanitizedRequest);
    console.log('[AI] runCommand request:', request);
    console.log('[AI] runCommand sanitized:', sanitizedRequest);
    console.log('[AI] runCommand body:', body);
    console.log('[AI] runCommand body length:', body.length);
    return apiFetchJSON(`${this.baseUrl}/ai/run-command`, {
      method: 'POST',
      body: JSON.stringify(request),
      body,
    }) as Promise<{ output: string; success: boolean; error?: string }>;
  }

  // Investigate an alert with AI (one-click investigation)
  static async investigateAlert(
    request: {
      alert_id: string;
      resource_id: string;
      resource_name: string;
      resource_type: string;
      alert_type: string;
      level: string;
      value: number;
      threshold: number;
      message: string;
      duration: string;
      node?: string;
      vmid?: number;
    },
    onEvent: (event: AIStreamEvent) => void,
    signal?: AbortSignal
  ): Promise<void> {
    console.log('[AI] Starting alert investigation:', request);

    const response = await apiFetch(`${this.baseUrl}/ai/investigate-alert`, {
      method: 'POST',
      body: JSON.stringify(request),
      headers: {
        'Content-Type': 'application/json',
        Accept: 'text/event-stream',
      },
      signal,
    });

    if (!response.ok) {
      const text = await response.text();
      throw new Error(text || `Request failed with status ${response.status}`);
    }

    const reader = response.body?.getReader();
    if (!reader) {
      throw new Error('No response body');
    }

    const decoder = new TextDecoder();
    let buffer = '';
    // 5 minutes timeout - Opus models can take a long time
    const STREAM_TIMEOUT_MS = 300000;
    let lastEventTime = Date.now();

    try {
      while (true) {
        if (Date.now() - lastEventTime > STREAM_TIMEOUT_MS) {
          console.warn('[AI] Alert investigation stream timeout');
          break;
        }

        const readPromise = reader.read();
        const timeoutPromise = new Promise<never>((_, reject) => {
          setTimeout(() => reject(new Error('Read timeout')), STREAM_TIMEOUT_MS);
        });

        let result: ReadableStreamReadResult<Uint8Array>;
        try {
          result = await Promise.race([readPromise, timeoutPromise]);
        } catch (e) {
          if ((e as Error).message === 'Read timeout') break;
          throw e;
        }

        const { done, value } = result;
        if (done) break;

        lastEventTime = Date.now();
        buffer += decoder.decode(value, { stream: true });

        const normalizedBuffer = buffer.replace(/\r\n/g, '\n');
        const messages = normalizedBuffer.split('\n\n');
        buffer = messages.pop() || '';

        for (const message of messages) {
          if (!message.trim() || message.trim().startsWith(':')) continue;

          const dataLines = message.split('\n').filter((line) => line.startsWith('data: '));
          for (const line of dataLines) {
            try {
              const jsonStr = line.slice(6);
              if (!jsonStr.trim()) continue;
              const data = JSON.parse(jsonStr);
              onEvent(data as AIStreamEvent);
            } catch (e) {
              console.error('[AI] Failed to parse investigation event:', e);
            }
          }
        }
      }
    } finally {
      reader.releaseLock();
    }
  }

  // Execute an AI prompt with streaming
  // Returns an abort function to cancel the request
  static async executeStream(

@@ -88,30 +202,89 @@ export class AIAPI {

    const decoder = new TextDecoder();
    let buffer = '';
    let lastEventTime = Date.now();
    let receivedComplete = false;
    let receivedDone = false;

    // Timeout to detect stalled streams (5 minutes - Opus models can take a long time)
    const STREAM_TIMEOUT_MS = 300000;

    console.log('[AI SSE] Starting to read stream...');

    try {
      while (true) {
        const { done, value } = await reader.read();
        if (done) {
          console.log('[AI SSE] Stream ended');
        // Check for stream timeout
        if (Date.now() - lastEventTime > STREAM_TIMEOUT_MS) {
          console.warn('[AI SSE] Stream timeout - no data for', STREAM_TIMEOUT_MS / 1000, 'seconds');
          break;
        }

        // Create a promise with timeout for the read operation
        const readPromise = reader.read();
        const timeoutPromise = new Promise<never>((_, reject) => {
          setTimeout(() => reject(new Error('Read timeout')), STREAM_TIMEOUT_MS);
        });

        let result: ReadableStreamReadResult<Uint8Array>;
        try {
          result = await Promise.race([readPromise, timeoutPromise]);
        } catch (e) {
          if ((e as Error).message === 'Read timeout') {
            console.warn('[AI SSE] Read timeout, ending stream');
            break;
          }
          throw e;
        }

        const { done, value } = result;
        if (done) {
          console.log('[AI SSE] Stream ended normally');
          break;
        }

        lastEventTime = Date.now();
        const chunk = decoder.decode(value, { stream: true });
        console.log('[AI SSE] Received chunk:', chunk.length, 'bytes');

        // Log chunk info only if it's not just a heartbeat
        if (!chunk.includes(': heartbeat')) {
          console.log('[AI SSE] Received chunk:', chunk.length, 'bytes');
        }

        buffer += chunk;

        // Process complete SSE messages
        const lines = buffer.split('\n\n');
        buffer = lines.pop() || ''; // Keep incomplete message in buffer
        // Process complete SSE messages (separated by double newlines)
        // Handle both \n\n and \r\n\r\n for cross-platform compatibility
        const normalizedBuffer = buffer.replace(/\r\n/g, '\n');
        const messages = normalizedBuffer.split('\n\n');
        buffer = messages.pop() || ''; // Keep incomplete message in buffer

        for (const line of lines) {
          if (line.startsWith('data: ')) {
        for (const message of messages) {
          // Skip empty messages and heartbeat comments
          if (!message.trim() || message.trim().startsWith(':')) {
            if (message.includes('heartbeat')) {
              console.debug('[AI SSE] Received heartbeat');
            }
            continue;
          }

          // Parse SSE message (can have multiple lines, look for data: prefix)
          const dataLines = message.split('\n').filter(line => line.startsWith('data: '));
          for (const line of dataLines) {
            try {
              const data = JSON.parse(line.slice(6));
              const jsonStr = line.slice(6); // Remove 'data: ' prefix
              if (!jsonStr.trim()) continue;

              const data = JSON.parse(jsonStr);
              console.log('[AI SSE] Parsed event:', data.type, data);

              // Track completion events
              if (data.type === 'complete') {
                receivedComplete = true;
              }
              if (data.type === 'done') {
                receivedDone = true;
              }

              onEvent(data as AIStreamEvent);
            } catch (e) {
              console.error('[AI SSE] Failed to parse event:', e, line);

@@ -119,9 +292,33 @@ export class AIAPI {
            }
          }
        }
      }

      // Process any remaining buffer content
      if (buffer.trim() && buffer.trim().startsWith('data: ')) {
        try {
          const jsonStr = buffer.slice(6);
          if (jsonStr.trim()) {
            const data = JSON.parse(jsonStr);
            console.log('[AI SSE] Parsed final buffered event:', data.type);
            onEvent(data as AIStreamEvent);
            if (data.type === 'complete') receivedComplete = true;
            if (data.type === 'done') receivedDone = true;
          }
        } catch (e) {
          console.warn('[AI SSE] Could not parse remaining buffer:', buffer.substring(0, 100));
        }
      }

      // If we ended without receiving a done event, send a synthetic one
      // This ensures the UI properly clears the streaming state
      if (!receivedDone) {
        console.warn('[AI SSE] Stream ended without done event, sending synthetic done');
        onEvent({ type: 'done', data: undefined });
      }

    } finally {
      reader.releaseLock();
      console.log('[AI SSE] Reader released');
      console.log('[AI SSE] Reader released, receivedComplete:', receivedComplete, 'receivedDone:', receivedDone);
    }
  }
}
frontend-modern/src/api/charts.ts (new file, 65 lines)
@@ -0,0 +1,65 @@
/**
 * Charts API
 *
 * Fetches historical metrics data from the backend for sparkline visualizations.
 * The backend maintains proper historical data with 30s sample intervals.
 */

import { apiFetchJSON } from '@/utils/apiClient';

// Types matching backend response format
export interface MetricPoint {
  timestamp: number; // Unix timestamp in milliseconds
  value: number;
}

export interface ChartData {
  cpu?: MetricPoint[];
  memory?: MetricPoint[];
  disk?: MetricPoint[];
  diskread?: MetricPoint[];
  diskwrite?: MetricPoint[];
  netin?: MetricPoint[];
  netout?: MetricPoint[];
}

export interface ChartStats {
  oldestDataTimestamp: number;
}

export interface ChartsResponse {
  data: Record<string, ChartData>; // VM/Container data keyed by ID
  nodeData: Record<string, ChartData>; // Node data keyed by ID
  storageData: Record<string, ChartData>; // Storage data keyed by ID
  timestamp: number;
  stats: ChartStats;
}

export type TimeRange = '5m' | '15m' | '30m' | '1h' | '4h' | '12h' | '24h' | '7d';

export class ChartsAPI {
  private static baseUrl = '/api';

  /**
   * Fetch historical chart data for all resources
   * @param range Time range to fetch (default: 1h)
   */
  static async getCharts(range: TimeRange = '1h'): Promise<ChartsResponse> {
    const url = `${this.baseUrl}/charts?range=${range}`;
    return apiFetchJSON(url);
  }

  /**
   * Fetch storage-specific chart data
   * @param rangeMinutes Range in minutes (default: 60)
   */
  static async getStorageCharts(rangeMinutes: number = 60): Promise<Record<string, {
    usage?: MetricPoint[];
    used?: MetricPoint[];
    total?: MetricPoint[];
    avail?: MetricPoint[];
  }>> {
    const url = `${this.baseUrl}/storage/charts?range=${rangeMinutes}`;
    return apiFetchJSON(url);
  }
}
@@ -5,6 +5,7 @@ import { notificationStore } from '@/stores/notifications';
 import { logger } from '@/utils/logger';
 import { aiChatStore } from '@/stores/aiChat';
 import { useWebSocket } from '@/App';
+import { GuestNotes } from './GuestNotes';
 import type {
   AIToolExecution,
   AIStreamEvent,

@@ -41,13 +42,16 @@ interface PendingApproval {
   toolId: string;
   toolName: string;
   runOnHost: boolean;
+  targetHost?: string; // Explicit host for command routing
   isExecuting?: boolean;
 }

 interface Message {
   id: string;
   role: 'user' | 'assistant';
   content: string;
+  thinking?: string; // DeepSeek reasoning/thinking content
   timestamp: Date;
   model?: string;
   tokens?: { input: number; output: number };
@@ -325,6 +329,31 @@ export const AIChat: Component<AIChatProps> = (props) => {
|
||||
};
|
||||
setMessages((prev) => [...prev, streamingMessage]);
|
||||
|
||||
// Safety timeout - clear streaming state if we don't get any completion event
|
||||
// This prevents the UI from getting stuck in a streaming state
|
||||
let lastEventTime = Date.now();
|
||||
const SAFETY_TIMEOUT_MS = 120000; // 2 minutes
|
||||
|
||||
const safetyCheckInterval = setInterval(() => {
|
||||
const timeSinceLastEvent = Date.now() - lastEventTime;
|
||||
if (timeSinceLastEvent > SAFETY_TIMEOUT_MS) {
|
||||
console.warn('[AIChat] Safety timeout - forcing stream completion after', SAFETY_TIMEOUT_MS / 1000, 'seconds of inactivity');
|
||||
clearInterval(safetyCheckInterval);
|
||||
setMessages((prev) =>
|
||||
prev.map((msg) =>
|
||||
msg.id === assistantId && msg.isStreaming
|
||||
? { ...msg, isStreaming: false, content: msg.content || '(Request timed out - no response received)' }
|
||||
: msg
|
||||
)
|
||||
);
|
||||
setIsLoading(false);
|
||||
if (abortControllerRef) {
|
||||
abortControllerRef.abort();
|
||||
abortControllerRef = null;
|
||||
}
|
||||
}
|
||||
}, 10000); // Check every 10 seconds
|
||||
|
||||
try {
|
||||
await AIAPI.executeStream(
|
||||
{
|
||||
@@ -335,6 +364,7 @@ export const AIChat: Component<AIChatProps> = (props) => {
history: history.length > 0 ? history : undefined,
},
(event: AIStreamEvent) => {
lastEventTime = Date.now(); // Update last event time
console.log('[AIChat] Received event:', event.type, event);
// Update the streaming message based on event type
setMessages((prev) =>
@@ -370,6 +400,13 @@ export const AIChat: Component<AIChatProps> = (props) => {
toolCalls: [...(msg.toolCalls || []), newToolCall],
};
}
case 'thinking': {
const thinking = event.data as string;
return {
...msg,
thinking: (msg.thinking || '') + thinking,
};
}
case 'content': {
const content = event.data as string;
return {
@@ -409,6 +446,16 @@ export const AIChat: Component<AIChatProps> = (props) => {
content: `Error: ${errorMsg}`,
};
}
case 'processing': {
// Show processing status for multi-iteration calls
const status = event.data as string;
console.log('[AIChat] Processing:', status);
// Add as a pending tool for visual feedback
return {
...msg,
pendingTools: [{ name: 'processing', input: status }],
};
}
case 'approval_needed': {
const data = event.data as AIStreamApprovalNeededData;
return {
@@ -417,10 +464,12 @@ export const AIChat: Component<AIChatProps> = (props) => {
command: data.command,
toolId: data.tool_id,
toolName: data.tool_name,
runOnHost: data.run_on_host,
runOnHost: data.run_on_host ?? false, // Default to false if undefined
targetHost: data.target_host, // Pass through the explicit routing target
}],
};
}

default:
return msg;
}
@@ -448,6 +497,7 @@ export const AIChat: Component<AIChatProps> = (props) => {
)
);
} finally {
clearInterval(safetyCheckInterval);
abortControllerRef = null;
setIsLoading(false);
}
@@ -472,11 +522,11 @@ export const AIChat: Component<AIChatProps> = (props) => {
prev.map((m) =>
m.id === messageId
? {
...m,
pendingApprovals: m.pendingApprovals?.map((a) =>
a.toolId === approval.toolId ? { ...a, isExecuting: true } : a
),
}
...m,
pendingApprovals: m.pendingApprovals?.map((a) =>
a.toolId === approval.toolId ? { ...a, isExecuting: true } : a
),
}
: m
)
);
@@ -491,8 +541,10 @@ export const AIChat: Component<AIChatProps> = (props) => {
target_id: targetId() || '',
run_on_host: approval.runOnHost,
vmid,
target_host: approval.targetHost, // Pass through the explicit routing target
});


// Move from pending approvals to completed tool calls
setMessages((prev) =>
prev.map((m) => {
@@ -528,11 +580,11 @@ export const AIChat: Component<AIChatProps> = (props) => {
prev.map((m) =>
m.id === messageId
? {
...m,
pendingApprovals: m.pendingApprovals?.map((a) =>
a.toolId === approval.toolId ? { ...a, isExecuting: false } : a
),
}
...m,
pendingApprovals: m.pendingApprovals?.map((a) =>
a.toolId === approval.toolId ? { ...a, isExecuting: false } : a
),
}
: m
)
);
@@ -542,9 +594,8 @@ export const AIChat: Component<AIChatProps> = (props) => {
// Panel renders as flex child, width controlled by isOpen state
return (
<div
class={`flex-shrink-0 h-full bg-white dark:bg-gray-900 border-l border-gray-200 dark:border-gray-700 flex flex-col transition-all duration-300 overflow-hidden ${
isOpen() ? 'w-[420px]' : 'w-0 border-l-0'
}`}
class={`flex-shrink-0 h-full bg-white dark:bg-gray-900 border-l border-gray-200 dark:border-gray-700 flex flex-col transition-all duration-300 overflow-hidden ${isOpen() ? 'w-[420px]' : 'w-0 border-l-0'
}`}
>
<Show when={isOpen()}>
{/* Header */}
@@ -673,23 +724,40 @@ export const AIChat: Component<AIChatProps> = (props) => {
class={`flex ${message.role === 'user' ? 'justify-end' : 'justify-start'}`}
>
<div
class={`max-w-[85%] rounded-lg px-4 py-2 ${
message.role === 'user'
? 'bg-purple-600 text-white'
: 'bg-gray-100 dark:bg-gray-800 text-gray-900 dark:text-gray-100'
}`}
class={`max-w-[85%] rounded-lg px-4 py-2 ${message.role === 'user'
? 'bg-purple-600 text-white'
: 'bg-gray-100 dark:bg-gray-800 text-gray-900 dark:text-gray-100'
}`}
>
{/* Show thinking/reasoning content (DeepSeek) */}
<Show when={message.role === 'assistant' && message.thinking}>
<details class="mb-3 rounded border border-blue-300 dark:border-blue-700 overflow-hidden group">
<summary class="px-2 py-1.5 text-xs font-medium flex items-center gap-2 bg-blue-100 dark:bg-blue-900/30 text-blue-800 dark:text-blue-200 cursor-pointer hover:bg-blue-200 dark:hover:bg-blue-900/50 transition-colors">
<svg class="w-3.5 h-3.5 transition-transform group-open:rotate-90" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9 5l7 7-7 7" />
</svg>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M9.663 17h4.673M12 3v1m6.364 1.636l-.707.707M21 12h-1M4 12H3m3.343-5.657l-.707-.707m2.828 9.9a5 5 0 117.072 0l-.548.547A3.374 3.374 0 0014 18.469V19a2 2 0 11-4 0v-.531c0-.895-.356-1.754-.988-2.386l-.548-.547z" />
</svg>
<span>Thinking...</span>
<span class="text-blue-600 dark:text-blue-400 text-[10px]">({message.thinking!.length} chars)</span>
</summary>
<div class="px-2 py-2 text-xs bg-blue-50 dark:bg-blue-900/20 text-gray-700 dark:text-gray-300 max-h-48 overflow-y-auto whitespace-pre-wrap break-words font-mono">
{message.thinking!.length > 2000 ? message.thinking!.substring(0, 2000) + '...' : message.thinking}
</div>
</details>
</Show>

{/* Show completed tool calls FIRST - chronological order */}
<Show when={message.role === 'assistant' && message.toolCalls && message.toolCalls.length > 0}>
<div class="mb-3 space-y-2">
<For each={message.toolCalls}>
{(tool) => (
<div class="rounded border border-gray-300 dark:border-gray-600 overflow-hidden">
<div class={`px-2 py-1 text-xs font-medium flex items-center gap-2 ${
tool.success
? 'bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-200'
: 'bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-200'
}`}>
<div class={`px-2 py-1 text-xs font-medium flex items-center gap-2 ${tool.success
? 'bg-green-100 dark:bg-green-900/30 text-green-800 dark:text-green-200'
: 'bg-red-100 dark:bg-red-900/30 text-red-800 dark:text-red-200'
}`}>
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24">
<path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 9l3 3-3 3m5 0h3M5 20h14a2 2 0 002-2V6a2 2 0 00-2-2H5a2 2 0 00-2 2v12a2 2 0 002 2z" />
</svg>
@@ -734,11 +802,10 @@ export const AIChat: Component<AIChatProps> = (props) => {
<div class="flex gap-2">
<button
type="button"
class={`flex-1 px-2 py-1 text-xs font-medium rounded transition-colors ${
approval.isExecuting
? 'bg-green-400 text-white cursor-wait'
: 'bg-green-600 hover:bg-green-700 text-white'
}`}
class={`flex-1 px-2 py-1 text-xs font-medium rounded transition-colors ${approval.isExecuting
? 'bg-green-400 text-white cursor-wait'
: 'bg-green-600 hover:bg-green-700 text-white'
}`}
onClick={() => executeApprovedCommand(message.id, approval)}
disabled={approval.isExecuting}
>
@@ -887,24 +954,22 @@ export const AIChat: Component<AIChatProps> = (props) => {
type="button"
onClick={() => !isAlreadyAdded() && addResourceToContext(resource)}
disabled={isAlreadyAdded()}
class={`w-full px-3 py-2 text-left flex items-center gap-2 text-xs transition-colors ${
isAlreadyAdded()
? 'bg-purple-50 dark:bg-purple-900/20 text-gray-400 dark:text-gray-500 cursor-default'
: 'hover:bg-gray-50 dark:hover:bg-gray-700/50 text-gray-700 dark:text-gray-300'
}`}
class={`w-full px-3 py-2 text-left flex items-center gap-2 text-xs transition-colors ${isAlreadyAdded()
? 'bg-purple-50 dark:bg-purple-900/20 text-gray-400 dark:text-gray-500 cursor-default'
: 'hover:bg-gray-50 dark:hover:bg-gray-700/50 text-gray-700 dark:text-gray-300'
}`}
>
{/* Type icon */}
<span class={`flex-shrink-0 w-5 h-5 rounded flex items-center justify-center text-[9px] font-bold uppercase ${
resource.type === 'vm' ? 'bg-blue-100 text-blue-600 dark:bg-blue-900/40 dark:text-blue-400' :
<span class={`flex-shrink-0 w-5 h-5 rounded flex items-center justify-center text-[9px] font-bold uppercase ${resource.type === 'vm' ? 'bg-blue-100 text-blue-600 dark:bg-blue-900/40 dark:text-blue-400' :
resource.type === 'container' ? 'bg-green-100 text-green-600 dark:bg-green-900/40 dark:text-green-400' :
resource.type === 'node' ? 'bg-orange-100 text-orange-600 dark:bg-orange-900/40 dark:text-orange-400' :
resource.type === 'host' ? 'bg-purple-100 text-purple-600 dark:bg-purple-900/40 dark:text-purple-400' :
'bg-gray-100 text-gray-600 dark:bg-gray-700 dark:text-gray-400'
}`}>
resource.type === 'node' ? 'bg-orange-100 text-orange-600 dark:bg-orange-900/40 dark:text-orange-400' :
resource.type === 'host' ? 'bg-purple-100 text-purple-600 dark:bg-purple-900/40 dark:text-purple-400' :
'bg-gray-100 text-gray-600 dark:bg-gray-700 dark:text-gray-400'
}`}>
{resource.type === 'vm' ? 'VM' :
resource.type === 'container' ? 'CT' :
resource.type === 'node' ? 'N' :
resource.type === 'host' ? 'H' : '?'}
resource.type === 'container' ? 'CT' :
resource.type === 'node' ? 'N' :
resource.type === 'host' ? 'H' : '?'}
</span>

{/* Name and details */}
@@ -916,11 +981,10 @@ export const AIChat: Component<AIChatProps> = (props) => {
</div>

{/* Status indicator */}
<span class={`flex-shrink-0 w-2 h-2 rounded-full ${
resource.status === 'running' || resource.status === 'online' ? 'bg-green-500' :
<span class={`flex-shrink-0 w-2 h-2 rounded-full ${resource.status === 'running' || resource.status === 'online' ? 'bg-green-500' :
resource.status === 'stopped' || resource.status === 'offline' ? 'bg-gray-400' :
'bg-yellow-500'
}`} />
'bg-yellow-500'
}`} />

{/* Check if already added */}
<Show when={isAlreadyAdded()}>
@@ -956,6 +1020,15 @@ export const AIChat: Component<AIChatProps> = (props) => {
Add VMs, containers, or hosts to provide context for your questions
</p>
</Show>

{/* Guest Notes - show for first context item */}
<Show when={aiChatStore.contextItems.length > 0}>
<GuestNotes
guestId={`${aiChatStore.contextItems[0].type}-${aiChatStore.contextItems[0].id}`}
guestName={aiChatStore.contextItems[0].name}
guestType={aiChatStore.contextItems[0].type}
/>
</Show>
</div>
<form onSubmit={handleSubmit} class="flex gap-2">
<textarea

690
frontend-modern/src/components/AI/GuestNotes.tsx
Normal file
@@ -0,0 +1,690 @@
import { Component, createSignal, createEffect, For, Show, createMemo } from 'solid-js';
import { notificationStore } from '@/stores/notifications';
import { apiFetch } from '@/utils/apiClient';

interface Note {
id: string;
category: string;
title: string;
content: string;
created_at: string;
updated_at: string;
}

interface GuestKnowledge {
guest_id: string;
guest_name: string;
guest_type: string;
notes: Note[];
updated_at: string;
}

interface GuestNotesProps {
guestId: string;
guestName?: string;
guestType?: string;
}

const CATEGORY_LABELS: Record<string, string> = {
service: 'Service',
path: 'Path',
config: 'Config',
credential: 'Credential',
learning: 'Learning',
};

const CATEGORY_ICONS: Record<string, string> = {
service: '⚙️',
path: '📁',
config: '📋',
credential: '🔐',
learning: '💡',
};

const CATEGORY_COLORS: Record<string, string> = {
service: 'bg-blue-500/20 border-blue-500/30 text-blue-300',
path: 'bg-amber-500/20 border-amber-500/30 text-amber-300',
config: 'bg-purple-500/20 border-purple-500/30 text-purple-300',
credential: 'bg-red-500/20 border-red-500/30 text-red-300',
learning: 'bg-green-500/20 border-green-500/30 text-green-300',
};

const CATEGORY_OPTIONS = ['service', 'path', 'config', 'credential', 'learning'];

// Quick templates for common note types
const TEMPLATES = [
{ category: 'credential', title: 'Admin Password', placeholder: 'Enter admin password...' },
{ category: 'credential', title: 'SSH Key', placeholder: 'Paste SSH private key or fingerprint...' },
{ category: 'credential', title: 'API Key', placeholder: 'Enter API key...' },
{ category: 'path', title: 'Config Directory', placeholder: '/path/to/config' },
{ category: 'path', title: 'Data Directory', placeholder: '/path/to/data' },
{ category: 'path', title: 'Log Location', placeholder: '/var/log/service.log' },
{ category: 'service', title: 'Web Interface', placeholder: 'http://localhost:8080' },
{ category: 'service', title: 'Database', placeholder: 'PostgreSQL on port 5432' },
{ category: 'config', title: 'Port Number', placeholder: '8080' },
{ category: 'config', title: 'Environment', placeholder: 'production' },
];

// Format relative time
const formatRelativeTime = (dateStr: string): string => {
const date = new Date(dateStr);
const now = new Date();
const diffMs = now.getTime() - date.getTime();
const diffMinutes = Math.floor(diffMs / 60000);
const diffHours = Math.floor(diffMs / 3600000);
const diffDays = Math.floor(diffMs / 86400000);

if (diffMinutes < 1) return 'just now';
if (diffMinutes < 60) return `${diffMinutes}m ago`;
if (diffHours < 24) return `${diffHours}h ago`;
if (diffDays < 7) return `${diffDays}d ago`;
return date.toLocaleDateString();
};

export const GuestNotes: Component<GuestNotesProps> = (props) => {
const [knowledge, setKnowledge] = createSignal<GuestKnowledge | null>(null);
const [isLoading, setIsLoading] = createSignal(false);
const [isExpanded, setIsExpanded] = createSignal(false);
const [showAddForm, setShowAddForm] = createSignal(false);
const [showTemplates, setShowTemplates] = createSignal(false);
const [showActions, setShowActions] = createSignal(false);
const [editingNote, setEditingNote] = createSignal<Note | null>(null);
const [searchQuery, setSearchQuery] = createSignal('');
const [filterCategory, setFilterCategory] = createSignal<string>('');
const [showCredentials, setShowCredentials] = createSignal<Set<string>>(new Set());
const [deleteConfirmId, setDeleteConfirmId] = createSignal<string | null>(null);
const [clearConfirm, setClearConfirm] = createSignal(false);
const [isImporting, setIsImporting] = createSignal(false);

// Form state
const [category, setCategory] = createSignal('learning');
const [title, setTitle] = createSignal('');
const [content, setContent] = createSignal('');

// File input ref for import
let fileInputRef: HTMLInputElement | undefined;

// Fetch knowledge when guestId changes
createEffect(() => {
const guestId = props.guestId;
if (guestId) {
loadKnowledge(guestId);
}
});

const loadKnowledge = async (guestId: string) => {
setIsLoading(true);
try {
const response = await apiFetch(`/api/ai/knowledge?guest_id=${encodeURIComponent(guestId)}`);
if (response.ok) {
const data = await response.json();
setKnowledge(data);
}
} catch (error) {
console.error('Failed to load guest knowledge:', error);
} finally {
setIsLoading(false);
}
};

const saveNote = async () => {
if (!title().trim() || !content().trim()) return;

try {
const response = await apiFetch('/api/ai/knowledge/save', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
guest_id: props.guestId,
guest_name: props.guestName || props.guestId,
guest_type: props.guestType || 'unknown',
category: category(),
title: title().trim(),
content: content().trim(),
}),
});

if (response.ok) {
notificationStore.success('Note saved');
// Reset form
setTitle('');
setContent('');
setShowAddForm(false);
setShowTemplates(false);
setEditingNote(null);
// Reload knowledge
loadKnowledge(props.guestId);
} else {
notificationStore.error('Failed to save note');
}
} catch (error) {
console.error('Failed to save note:', error);
notificationStore.error('Failed to save note');
}
};

const deleteNote = async (noteId: string) => {
try {
const response = await apiFetch('/api/ai/knowledge/delete', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
guest_id: props.guestId,
note_id: noteId,
}),
});

if (response.ok) {
notificationStore.success('Note deleted');
setDeleteConfirmId(null);
loadKnowledge(props.guestId);
} else {
notificationStore.error('Failed to delete note');
}
} catch (error) {
console.error('Failed to delete note:', error);
notificationStore.error('Failed to delete note');
}
};

const exportNotes = async () => {
try {
const response = await apiFetch(`/api/ai/knowledge/export?guest_id=${encodeURIComponent(props.guestId)}`);
if (response.ok) {
const data = await response.json();
const blob = new Blob([JSON.stringify(data, null, 2)], { type: 'application/json' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = `pulse-notes-${props.guestName || props.guestId}.json`;
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
URL.revokeObjectURL(url);
notificationStore.success('Notes exported');
setShowActions(false);
} else {
notificationStore.error('Failed to export notes');
}
} catch (error) {
console.error('Failed to export notes:', error);
notificationStore.error('Failed to export notes');
}
};

const handleImportFile = async (event: Event) => {
const target = event.target as HTMLInputElement;
const file = target.files?.[0];
if (!file) return;

setIsImporting(true);
try {
const text = await file.text();
const data = JSON.parse(text);

// Add merge flag and ensure guest_id matches current
const importData = {
...data,
guest_id: props.guestId,
guest_name: props.guestName || data.guest_name,
guest_type: props.guestType || data.guest_type,
merge: true, // Merge with existing notes
};

const response = await apiFetch('/api/ai/knowledge/import', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(importData),
});

if (response.ok) {
const result = await response.json();
notificationStore.success(`Imported ${result.imported} of ${result.total} notes`);
loadKnowledge(props.guestId);
setShowActions(false);
} else {
const errorText = await response.text();
notificationStore.error('Import failed: ' + errorText);
}
} catch (error) {
console.error('Failed to import notes:', error);
notificationStore.error('Failed to parse import file');
} finally {
setIsImporting(false);
// Reset file input
if (fileInputRef) {
fileInputRef.value = '';
}
}
};

const clearAllNotes = async () => {
try {
const response = await apiFetch('/api/ai/knowledge/clear', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
guest_id: props.guestId,
confirm: true,
}),
});

if (response.ok) {
const result = await response.json();
notificationStore.success(`Cleared ${result.deleted} notes`);
setClearConfirm(false);
setShowActions(false);
loadKnowledge(props.guestId);
} else {
notificationStore.error('Failed to clear notes');
}
} catch (error) {
console.error('Failed to clear notes:', error);
notificationStore.error('Failed to clear notes');
}
};

const startEdit = (note: Note) => {
setEditingNote(note);
setCategory(note.category);
setTitle(note.title);
setContent(note.content);
setShowAddForm(true);
setShowTemplates(false);
};

const useTemplate = (template: { category: string; title: string; placeholder: string }) => {
setCategory(template.category);
setTitle(template.title);
setContent('');
setShowTemplates(false);
setShowAddForm(true);
};

const cancelEdit = () => {
setEditingNote(null);
setTitle('');
setContent('');
setShowAddForm(false);
setShowTemplates(false);
};

const copyToClipboard = async (text: string, label: string) => {
try {
await navigator.clipboard.writeText(text);
notificationStore.success(`${label} copied to clipboard`);
} catch {
notificationStore.error('Failed to copy to clipboard');
}
};

const toggleCredentialVisibility = (noteId: string) => {
const current = showCredentials();
const updated = new Set(current);
if (updated.has(noteId)) {
updated.delete(noteId);
} else {
updated.add(noteId);
}
setShowCredentials(updated);
};

const maskCredential = (content: string): string => {
// Mask most of the content, showing only first 2 and last 2 chars
if (content.length <= 6) {
return '••••••';
}
return content.slice(0, 2) + '•'.repeat(Math.min(content.length - 4, 12)) + content.slice(-2);
};

const notes = () => knowledge()?.notes || [];
const hasNotes = () => notes().length > 0;

// Filtered notes based on search and category
const filteredNotes = createMemo(() => {
let result = notes();

// Filter by category
const catFilter = filterCategory();
if (catFilter) {
result = result.filter(n => n.category === catFilter);
}

// Filter by search query
const query = searchQuery().toLowerCase();
if (query) {
result = result.filter(n =>
n.title.toLowerCase().includes(query) ||
n.content.toLowerCase().includes(query) ||
CATEGORY_LABELS[n.category]?.toLowerCase().includes(query)
);
}

// Sort by updated_at descending
return result.sort((a, b) =>
new Date(b.updated_at).getTime() - new Date(a.updated_at).getTime()
);
});

// Group notes by category for summary
const notesByCategory = createMemo(() => {
const grouped: Record<string, number> = {};
for (const note of notes()) {
grouped[note.category] = (grouped[note.category] || 0) + 1;
}
return grouped;
});

return (
<div class="border-t border-gray-700 pt-2 mt-2">
{/* Hidden file input for import */}
<input
ref={fileInputRef}
type="file"
accept=".json"
class="hidden"
onChange={handleImportFile}
/>

{/* Header with expand/collapse */}
<button
onClick={() => setIsExpanded(!isExpanded())}
class="flex items-center justify-between w-full text-left px-2 py-1 text-sm hover:bg-gray-700/50 rounded transition-colors"
>
<span class="flex items-center gap-2">
<svg class={`w-3 h-3 transition-transform ${isExpanded() ? 'rotate-90' : ''}`} fill="currentColor" viewBox="0 0 20 20">
<path fill-rule="evenodd" d="M7.293 14.707a1 1 0 010-1.414L10.586 10 7.293 6.707a1 1 0 011.414-1.414l4 4a1 1 0 010 1.414l-4 4a1 1 0 01-1.414 0z" clip-rule="evenodd" />
</svg>
<span class="text-gray-300 font-medium">Saved Notes</span>
<Show when={hasNotes()}>
<span class="text-xs text-gray-500">({notes().length})</span>
</Show>
</span>
<Show when={isLoading()}>
<span class="text-xs text-gray-500">Loading...</span>
</Show>
</button>

{/* Expandable content */}
<Show when={isExpanded()}>
<div class="mt-2 space-y-2 px-2">
{/* Search and filter bar - only show if there are notes */}
<Show when={hasNotes()}>
<div class="flex gap-2 mb-2">
<input
type="text"
placeholder="Search notes..."
value={searchQuery()}
onInput={(e) => setSearchQuery(e.target.value)}
class="flex-1 bg-gray-700 text-xs text-gray-200 rounded px-2 py-1 border border-gray-600 placeholder-gray-500"
/>
<select
value={filterCategory()}
onChange={(e) => setFilterCategory(e.target.value)}
class="bg-gray-700 text-xs text-gray-200 rounded px-2 py-1 border border-gray-600"
>
<option value="">All</option>
<For each={CATEGORY_OPTIONS}>
{(cat) => (
<Show when={notesByCategory()[cat]}>
<option value={cat}>{CATEGORY_ICONS[cat]} {CATEGORY_LABELS[cat]} ({notesByCategory()[cat]})</option>
</Show>
)}
</For>
</select>
</div>
</Show>

{/* Notes list */}
<Show when={hasNotes()} fallback={
<p class="text-xs text-gray-500 italic">No saved notes yet. The AI will automatically save useful discoveries.</p>
}>
<Show when={filteredNotes().length > 0} fallback={
<p class="text-xs text-gray-500 italic">No notes match your search.</p>
}>
<div class="space-y-1.5 max-h-48 overflow-y-auto">
<For each={filteredNotes()}>
{(note) => (
<div class={`group rounded border px-2 py-1.5 text-xs ${CATEGORY_COLORS[note.category] || 'bg-gray-800/50 border-gray-700'}`}>
<div class="flex items-start justify-between gap-2">
<div class="flex-1 min-w-0">
<div class="flex items-center gap-1.5">
<span class="text-sm" title={CATEGORY_LABELS[note.category]}>{CATEGORY_ICONS[note.category]}</span>
<span class="text-gray-200 font-medium">{note.title}</span>
<span class="text-gray-500 text-[10px]" title={new Date(note.updated_at).toLocaleString()}>
{formatRelativeTime(note.updated_at)}
</span>
</div>
<div class="flex items-center gap-1 mt-0.5">
{/* Content - mask if credential and not revealed */}
<Show when={note.category === 'credential' && !showCredentials().has(note.id)}
fallback={
<p class="text-gray-300 break-words font-mono text-[11px]">{note.content}</p>
}>
<p class="text-gray-400 break-words font-mono text-[11px]">{maskCredential(note.content)}</p>
</Show>
</div>
</div>
<div class="flex gap-0.5 opacity-0 group-hover:opacity-100 transition-opacity shrink-0">
{/* Show/hide for credentials */}
<Show when={note.category === 'credential'}>
<button
onClick={() => toggleCredentialVisibility(note.id)}
class="text-gray-400 hover:text-yellow-400 p-0.5"
title={showCredentials().has(note.id) ? 'Hide' : 'Show'}
>
<Show when={showCredentials().has(note.id)} fallback={
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15 12a3 3 0 11-6 0 3 3 0 016 0z" /><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M2.458 12C3.732 7.943 7.523 5 12 5c4.478 0 8.268 2.943 9.542 7-1.274 4.057-5.064 7-9.542 7-4.477 0-8.268-2.943-9.542-7z" /></svg>
}>
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M13.875 18.825A10.05 10.05 0 0112 19c-4.478 0-8.268-2.943-9.543-7a9.97 9.97 0 011.563-3.029m5.858.908a3 3 0 114.243 4.243M9.878 9.878l4.242 4.242M9.88 9.88l-3.29-3.29m7.532 7.532l3.29 3.29M3 3l3.59 3.59m0 0A9.953 9.953 0 0112 5c4.478 0 8.268 2.943 9.543 7a10.025 10.025 0 01-4.132 5.411m0 0L21 21" /></svg>
</Show>
</button>
</Show>
{/* Copy button */}
<button
onClick={() => copyToClipboard(note.content, note.title)}
class="text-gray-400 hover:text-green-400 p-0.5"
title="Copy content"
>
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M8 16H6a2 2 0 01-2-2V6a2 2 0 012-2h8a2 2 0 012 2v2m-6 12h8a2 2 0 002-2v-8a2 2 0 00-2-2h-8a2 2 0 00-2 2v8a2 2 0 002 2z" /></svg>
</button>
{/* Edit button */}
<button
onClick={() => startEdit(note)}
class="text-gray-400 hover:text-blue-400 p-0.5"
title="Edit"
>
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M15.232 5.232l3.536 3.536m-2.036-5.036a2.5 2.5 0 113.536 3.536L6.5 21.036H3v-3.572L16.732 3.732z" /></svg>
</button>
{/* Delete button with confirmation */}
<Show when={deleteConfirmId() === note.id} fallback={
<button
onClick={() => setDeleteConfirmId(note.id)}
class="text-gray-400 hover:text-red-400 p-0.5"
title="Delete"
>
<svg class="w-3 h-3" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" /></svg>
</button>
}>
<div class="flex items-center gap-1 bg-red-900/50 rounded px-1">
<span class="text-red-300 text-[10px]">Delete?</span>
<button
onClick={() => deleteNote(note.id)}
class="text-red-400 hover:text-red-300 p-0.5 font-bold"
title="Confirm delete"
>
✓
</button>
<button
onClick={() => setDeleteConfirmId(null)}
class="text-gray-400 hover:text-gray-300 p-0.5"
title="Cancel"
>
✗
</button>
</div>
</Show>
</div>
</div>
</div>
)}
</For>
</div>
</Show>
</Show>

{/* Template picker */}
|
||||
<Show when={showTemplates()}>
|
||||
<div class="bg-gray-800/80 rounded p-2 border border-gray-700">
|
||||
<div class="flex items-center justify-between mb-2">
|
||||
<span class="text-xs font-medium text-gray-300">Quick Templates</span>
|
||||
<button onClick={() => setShowTemplates(false)} class="text-gray-400 hover:text-gray-200 text-xs">✕</button>
|
||||
</div>
|
||||
<div class="grid grid-cols-2 gap-1">
|
||||
<For each={TEMPLATES}>
|
||||
{(template) => (
|
||||
<button
|
||||
onClick={() => useTemplate(template)}
|
||||
class={`text-left px-2 py-1 rounded text-[10px] border ${CATEGORY_COLORS[template.category]} hover:opacity-80 transition-opacity`}
|
||||
>
|
||||
{CATEGORY_ICONS[template.category]} {template.title}
|
||||
</button>
|
||||
)}
|
||||
</For>
|
||||
</div>
|
||||
</div>
|
||||
</Show>
|
||||
|
||||
{/* Actions menu (Export/Import/Clear) */}
|
||||
<Show when={showActions()}>
|
||||
<div class="bg-gray-800/80 rounded p-2 border border-gray-700 space-y-1">
|
||||
<div class="flex items-center justify-between mb-1">
|
||||
<span class="text-xs font-medium text-gray-300">Actions</span>
|
||||
<button onClick={() => { setShowActions(false); setClearConfirm(false); }} class="text-gray-400 hover:text-gray-200 text-xs">✕</button>
|
||||
</div>
|
||||
<button
|
||||
onClick={exportNotes}
|
||||
disabled={!hasNotes()}
|
||||
class="w-full text-left px-2 py-1.5 rounded text-xs bg-blue-600/20 hover:bg-blue-600/30 text-blue-300 disabled:opacity-50 disabled:cursor-not-allowed flex items-center gap-2"
|
||||
>
|
||||
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M12 10v6m0 0l-3-3m3 3l3-3m2 8H7a2 2 0 01-2-2V5a2 2 0 012-2h5.586a1 1 0 01.707.293l5.414 5.414a1 1 0 01.293.707V19a2 2 0 01-2 2z" /></svg>
|
||||
Export Notes (JSON)
|
||||
</button>
|
||||
<button
|
||||
onClick={() => fileInputRef?.click()}
|
||||
disabled={isImporting()}
|
||||
class="w-full text-left px-2 py-1.5 rounded text-xs bg-green-600/20 hover:bg-green-600/30 text-green-300 disabled:opacity-50 flex items-center gap-2"
|
||||
>
|
||||
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M4 16v1a3 3 0 003 3h10a3 3 0 003-3v-1m-4-8l-4-4m0 0L8 8m4-4v12" /></svg>
|
||||
{isImporting() ? 'Importing...' : 'Import Notes (Merge)'}
|
||||
</button>
|
||||
<Show when={clearConfirm()} fallback={
|
||||
<button
|
||||
onClick={() => setClearConfirm(true)}
|
||||
disabled={!hasNotes()}
|
||||
class="w-full text-left px-2 py-1.5 rounded text-xs bg-red-600/20 hover:bg-red-600/30 text-red-300 disabled:opacity-50 disabled:cursor-not-allowed flex items-center gap-2"
|
||||
>
|
||||
<svg class="w-3.5 h-3.5" fill="none" stroke="currentColor" viewBox="0 0 24 24"><path stroke-linecap="round" stroke-linejoin="round" stroke-width="2" d="M19 7l-.867 12.142A2 2 0 0116.138 21H7.862a2 2 0 01-1.995-1.858L5 7m5 4v6m4-6v6m1-10V4a1 1 0 00-1-1h-4a1 1 0 00-1 1v3M4 7h16" /></svg>
|
||||
Clear All Notes
|
||||
</button>
|
||||
}>
|
||||
<div class="bg-red-900/50 rounded p-2 flex items-center justify-between">
|
||||
<span class="text-red-300 text-[10px]">Delete all {notes().length} notes?</span>
|
||||
<div class="flex gap-1">
|
||||
<button onClick={clearAllNotes} class="px-2 py-0.5 bg-red-600 hover:bg-red-500 text-white rounded text-[10px]">Yes, clear all</button>
|
||||
<button onClick={() => setClearConfirm(false)} class="px-2 py-0.5 bg-gray-600 hover:bg-gray-500 text-white rounded text-[10px]">Cancel</button>
|
||||
</div>
|
||||
</div>
|
||||
</Show>
|
||||
</div>
|
||||
</Show>
|
||||
|
||||
{/* Add/Edit form */}
|
||||
<Show when={showAddForm()}>
|
||||
<div class="bg-gray-800/80 rounded p-2 space-y-2 border border-gray-700">
|
||||
<div class="flex items-center justify-between mb-1">
|
||||
<span class="text-xs font-medium text-gray-300">
|
||||
{editingNote() ? 'Edit Note' : 'Add Note'}
|
||||
</span>
|
||||
<Show when={editingNote()}>
|
||||
<span class="text-[10px] text-gray-500">ID: {editingNote()?.id}</span>
|
||||
</Show>
|
||||
</div>
|
||||
<div class="flex gap-2">
|
||||
<select
|
||||
value={category()}
|
||||
onChange={(e) => setCategory(e.target.value)}
|
||||
class="bg-gray-700 text-xs text-gray-200 rounded px-2 py-1.5 border border-gray-600"
|
||||
>
|
||||
<For each={CATEGORY_OPTIONS}>
|
||||
{(cat) => <option value={cat}>{CATEGORY_ICONS[cat]} {CATEGORY_LABELS[cat]}</option>}
|
||||
</For>
|
||||
</select>
|
||||
<input
|
||||
type="text"
|
||||
placeholder="Title (e.g., 'Admin Password')"
|
||||
value={title()}
|
||||
onInput={(e) => setTitle(e.target.value)}
|
||||
class="flex-1 bg-gray-700 text-xs text-gray-200 rounded px-2 py-1.5 border border-gray-600 placeholder-gray-500"
|
||||
/>
|
||||
</div>
|
||||
<textarea
|
||||
placeholder={category() === 'credential'
|
||||
? "Enter credential value (stored encrypted)..."
|
||||
: "Content..."}
|
||||
value={content()}
|
||||
onInput={(e) => setContent(e.target.value)}
|
||||
class="w-full bg-gray-700 text-xs text-gray-200 rounded px-2 py-1.5 border border-gray-600 resize-none placeholder-gray-500 font-mono"
|
||||
rows={2}
|
||||
/>
|
||||
<Show when={category() === 'credential'}>
|
||||
<p class="text-[10px] text-amber-400 flex items-center gap-1">
|
||||
<span>🔐</span> Credentials are encrypted at rest and masked in the UI
|
||||
</p>
|
||||
</Show>
|
||||
<div class="flex gap-2 justify-end">
|
||||
<button
|
||||
onClick={cancelEdit}
|
||||
class="text-xs px-2 py-1 text-gray-400 hover:text-gray-200"
|
||||
>
|
||||
Cancel
|
||||
</button>
|
||||
<button
|
||||
onClick={saveNote}
|
||||
disabled={!title().trim() || !content().trim()}
|
||||
class="text-xs px-3 py-1 bg-blue-600 text-white rounded hover:bg-blue-500 disabled:opacity-50 disabled:cursor-not-allowed"
|
||||
>
|
||||
{editingNote() ? 'Update' : 'Save'}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</Show>
|
||||
|
||||
{/* Action buttons row */}
|
||||
<Show when={!showAddForm() && !showTemplates() && !showActions()}>
|
||||
<div class="flex items-center gap-2 pt-1">
|
||||
<button
|
||||
onClick={() => setShowAddForm(true)}
|
||||
class="text-xs text-blue-400 hover:text-blue-300 flex items-center gap-1"
|
||||
>
|
||||
<span>+</span> Add note
|
||||
</button>
|
||||
<button
|
||||
onClick={() => setShowTemplates(true)}
|
||||
class="text-xs text-purple-400 hover:text-purple-300 flex items-center gap-1"
|
||||
>
|
||||
<span>📝</span> Templates
|
||||
</button>
|
||||
<button
|
||||
onClick={() => setShowActions(true)}
|
||||
class="text-xs text-gray-400 hover:text-gray-300 flex items-center gap-1"
|
||||
>
|
||||
<span>⚙️</span> Actions
|
||||
</button>
|
||||
</div>
|
||||
</Show>
|
||||
</div>
|
||||
</Show>
|
||||
</div>
|
||||
);
|
||||
};

188  frontend-modern/src/components/Alerts/InvestigateAlertButton.tsx  Normal file
@@ -0,0 +1,188 @@
import { Show, createSignal } from 'solid-js';
import { aiChatStore } from '@/stores/aiChat';
import type { Alert } from '@/types/api';

interface InvestigateAlertButtonProps {
  alert: Alert;
  resourceType?: string;
  vmid?: number;
  size?: 'sm' | 'md';
  variant?: 'icon' | 'text' | 'full';
  class?: string;
}

/**
 * "Ask AI" button for one-click alert investigation.
 * When clicked, opens the AI chat panel with the alert context pre-populated.
 */
export function InvestigateAlertButton(props: InvestigateAlertButtonProps) {
  const [isHovered, setIsHovered] = createSignal(false);

  const handleClick = (e: MouseEvent) => {
    e.stopPropagation();
    e.preventDefault();

    // Calculate how long the alert has been active
    const startTime = new Date(props.alert.startTime);
    const now = new Date();
    const durationMs = now.getTime() - startTime.getTime();
    const durationMins = Math.floor(durationMs / 60000);
    const durationStr =
      durationMins < 60
        ? `${durationMins} min${durationMins !== 1 ? 's' : ''}`
        : `${Math.floor(durationMins / 60)}h ${durationMins % 60}m`;

    // Format a focused prompt for investigation
    const prompt = `Investigate this ${props.alert.level.toUpperCase()} alert:

**Resource:** ${props.alert.resourceName}
**Alert Type:** ${props.alert.type}
**Current Value:** ${props.alert.value.toFixed(1)}%
**Threshold:** ${props.alert.threshold.toFixed(1)}%
**Duration:** ${durationStr}
${props.alert.node ? `**Node:** ${props.alert.node}` : ''}

Please:
1. Identify the root cause
2. Check related metrics
3. Suggest specific remediation steps
4. Execute diagnostic commands if safe`;

    // Determine target type from alert or infer from resource
    let targetType = props.resourceType || 'guest';
    if (props.alert.type.startsWith('node_')) {
      targetType = 'node';
    } else if (props.alert.type.startsWith('docker_')) {
      targetType = 'docker_container';
    } else if (props.alert.type.startsWith('storage_')) {
      targetType = 'storage';
    }

    // Open AI chat with this context and prompt
    aiChatStore.openWithPrompt(prompt, {
      targetType,
      targetId: props.alert.resourceId,
      context: {
        alertId: props.alert.id,
        alertType: props.alert.type,
        alertLevel: props.alert.level,
        alertMessage: props.alert.message,
        guestName: props.alert.resourceName,
        node: props.alert.node,
        vmid: props.vmid,
      },
    });
  };

  const sizeClasses = {
    sm: 'w-6 h-6 text-xs',
    md: 'w-8 h-8 text-sm',
  };

  const baseButtonClass = `
    inline-flex items-center justify-center
    rounded-md transition-all duration-200
    focus:outline-none focus-visible:ring-2 focus-visible:ring-blue-500
    disabled:opacity-50 disabled:cursor-not-allowed
  `;

  // Icon-only variant (smallest footprint)
  if (props.variant === 'icon') {
    return (
      <button
        type="button"
        onClick={handleClick}
        onMouseEnter={() => setIsHovered(true)}
        onMouseLeave={() => setIsHovered(false)}
        class={`${baseButtonClass} ${sizeClasses[props.size || 'sm']}
          bg-gradient-to-r from-purple-500/10 to-blue-500/10
          hover:from-purple-500/20 hover:to-blue-500/20
          text-purple-600 dark:text-purple-400
          hover:text-purple-700 dark:hover:text-purple-300
          border border-purple-200/50 dark:border-purple-700/50
          hover:border-purple-300 dark:hover:border-purple-600
          ${props.class || ''}`}
        title="Ask AI to investigate this alert"
      >
        <svg
          class={`${props.size === 'sm' ? 'w-3.5 h-3.5' : 'w-4 h-4'}`}
          fill="none"
          viewBox="0 0 24 24"
          stroke="currentColor"
        >
          <path
            stroke-linecap="round"
            stroke-linejoin="round"
            stroke-width="2"
            d="M9.663 17h4.673M12 3v1m6.364 1.636l-.707.707M21 12h-1M4 12H3m3.343-5.657l-.707-.707m2.828 9.9a5 5 0 117.072 0l-.548.547A3.374 3.374 0 0014 18.469V19a2 2 0 11-4 0v-.531c0-.895-.356-1.754-.988-2.386l-.548-.547z"
          />
        </svg>
      </button>
    );
  }

  // Text variant (shows "Ask AI" on hover)
  if (props.variant === 'text') {
    return (
      <button
        type="button"
        onClick={handleClick}
        onMouseEnter={() => setIsHovered(true)}
        onMouseLeave={() => setIsHovered(false)}
        class={`${baseButtonClass} px-2 py-1
          bg-gradient-to-r from-purple-500/10 to-blue-500/10
          hover:from-purple-500/20 hover:to-blue-500/20
          text-purple-600 dark:text-purple-400
          hover:text-purple-700 dark:hover:text-purple-300
          border border-purple-200/50 dark:border-purple-700/50
          hover:border-purple-300 dark:hover:border-purple-600
          gap-1.5
          ${props.class || ''}`}
        title="Ask AI to investigate this alert"
      >
        <svg class="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
          <path
            stroke-linecap="round"
            stroke-linejoin="round"
            stroke-width="2"
            d="M9.663 17h4.673M12 3v1m6.364 1.636l-.707.707M21 12h-1M4 12H3m3.343-5.657l-.707-.707m2.828 9.9a5 5 0 117.072 0l-.548.547A3.374 3.374 0 0014 18.469V19a2 2 0 11-4 0v-.531c0-.895-.356-1.754-.988-2.386l-.548-.547z"
          />
        </svg>
        <span class="text-xs font-medium">Ask AI</span>
      </button>
    );
  }

  // Full variant (with expanded label)
  return (
    <button
      type="button"
      onClick={handleClick}
      onMouseEnter={() => setIsHovered(true)}
      onMouseLeave={() => setIsHovered(false)}
      class={`${baseButtonClass} px-3 py-1.5
        bg-gradient-to-r from-purple-500 to-blue-500
        hover:from-purple-600 hover:to-blue-600
        text-white font-medium
        shadow-sm hover:shadow-md
        gap-2
        ${props.class || ''}`}
      title="Ask AI to investigate this alert"
    >
      <svg class="w-4 h-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
        <path
          stroke-linecap="round"
          stroke-linejoin="round"
          stroke-width="2"
          d="M9.663 17h4.673M12 3v1m6.364 1.636l-.707.707M21 12h-1M4 12H3m3.343-5.657l-.707-.707m2.828 9.9a5 5 0 117.072 0l-.548.547A3.374 3.374 0 0014 18.469V19a2 2 0 11-4 0v-.531c0-.895-.356-1.754-.988-2.386l-.548-.547z"
        />
      </svg>
      <span>Investigate with AI</span>
      <Show when={isHovered()}>
        <span class="text-xs opacity-80">→</span>
      </Show>
    </button>
  );
}

export default InvestigateAlertButton;

@@ -201,6 +201,7 @@ type ViewMode = 'all' | 'vm' | 'lxc';
type StatusMode = 'all' | 'running' | 'degraded' | 'stopped';
type BackupMode = 'all' | 'needs-backup';
type GroupingMode = 'grouped' | 'flat';
type ProblemsMode = 'all' | 'problems';

export function Dashboard(props: DashboardProps) {
  const navigate = useNavigate();
@@ -254,6 +255,15 @@ export function Dashboard(props: DashboardProps) {
    },
  );

  // Problems mode - show only guests with issues
  const [problemsMode, setProblemsMode] = usePersistentSignal<ProblemsMode>(
    'dashboardProblemsMode',
    'all',
    {
      deserialize: (raw) => (raw === 'all' || raw === 'problems' ? raw : 'all'),
    },
  );

  const [showFilters, setShowFilters] = usePersistentSignal<boolean>(
    'dashboardShowFilters',
    false,
@@ -444,7 +454,8 @@ export function Dashboard(props: DashboardProps) {
      selectedNode() !== null ||
      viewMode() !== 'all' ||
      statusMode() !== 'all' ||
      backupMode() !== 'all';
      backupMode() !== 'all' ||
      problemsMode() !== 'all';

    if (hasActiveFilters) {
      // Clear ALL filters including search text, tag filters, node selection, and view modes
@@ -456,6 +467,7 @@ export function Dashboard(props: DashboardProps) {
      setViewMode('all');
      setStatusMode('all');
      setBackupMode('all');
      setProblemsMode('all');

      // Blur the search input if it's focused
      if (searchInputRef && document.activeElement === searchInputRef) {
@@ -489,6 +501,32 @@ export function Dashboard(props: DashboardProps) {
    return () => document.removeEventListener('keydown', handleKeyDown);
  });

  // Compute guests with problems - used for AI investigation regardless of filter state
  const problemGuests = createMemo(() => {
    return allGuests().filter((g) => {
      // Skip templates
      if (g.template) return false;

      // Check for degraded status
      const status = (g.status || '').toLowerCase();
      const isDegraded = DEGRADED_HEALTH_STATUSES.has(status) ||
        (status !== 'running' && !OFFLINE_HEALTH_STATUSES.has(status) && status !== 'stopped');
      if (isDegraded) return true;

      // Check for backup issues
      const backupInfo = getBackupInfo(g.lastBackup);
      if (backupInfo.status === 'stale' || backupInfo.status === 'critical' || backupInfo.status === 'never') {
        return true;
      }

      // Check for high resource usage (>90%)
      if (g.cpu > 0.9) return true;
      if (g.memory && g.memory.usage && g.memory.usage > 90) return true;

      return false;
    });
  });

  // Filter guests based on current settings
  const filteredGuests = createMemo(() => {
    let guests = allGuests();
@@ -538,6 +576,32 @@ export function Dashboard(props: DashboardProps) {
      });
    }

    // Filter by problems mode - show guests that need attention
    if (problemsMode() === 'problems') {
      guests = guests.filter((g) => {
        // Skip templates
        if (g.template) return false;

        // Check for degraded status
        const status = (g.status || '').toLowerCase();
        const isDegraded = DEGRADED_HEALTH_STATUSES.has(status) ||
          (status !== 'running' && !OFFLINE_HEALTH_STATUSES.has(status) && status !== 'stopped');
        if (isDegraded) return true;

        // Check for backup issues
        const backupInfo = getBackupInfo(g.lastBackup);
        if (backupInfo.status === 'stale' || backupInfo.status === 'critical' || backupInfo.status === 'never') {
          return true;
        }

        // Check for high resource usage (>90%)
        if (g.cpu > 0.9) return true;
        if (g.memory && g.memory.usage && g.memory.usage > 90) return true;

        return false;
      });
    }

    // Apply search/filter
    const searchTerm = search().trim();
    if (searchTerm) {
@@ -753,6 +817,27 @@ export function Dashboard(props: DashboardProps) {

  const handleNodeSelect = (nodeId: string | null, nodeType: 'pve' | 'pbs' | 'pmg' | null) => {
    logger.debug('handleNodeSelect called', { nodeId, nodeType });

    // If AI sidebar is open, add node to AI context instead of filtering
    if (aiChatStore.isOpen && nodeId && nodeType === 'pve') {
      const node = props.nodes.find((n) => n.id === nodeId);
      if (node) {
        // Toggle: remove if already in context, add if not
        if (aiChatStore.hasContextItem(nodeId)) {
          aiChatStore.removeContextItem(nodeId);
        } else {
          aiChatStore.addContextItem('node', nodeId, node.name, {
            nodeName: node.name,
            name: node.name,
            type: 'Proxmox Node',
            status: node.status,
            instance: node.instance,
          });
        }
      }
      return;
    }

    // Track selected node for filtering (independent of search)
    if (nodeType === 'pve' || nodeType === null) {
      setSelectedNode(nodeId);
@@ -812,6 +897,29 @@ export function Dashboard(props: DashboardProps) {
    }
  };

  // Handle row click - add guest to AI context when sidebar is open
  const handleGuestRowClick = (guest: VM | Container) => {
    // Only add to context if AI is enabled and sidebar is open
    if (!aiChatStore.enabled || !aiChatStore.isOpen) return;

    const guestId = guest.id || `${guest.instance}-${guest.vmid}`;
    const guestType = guest.type === 'qemu' ? 'vm' : 'container';

    // Toggle: remove if already in context, add if not
    if (aiChatStore.hasContextItem(guestId)) {
      aiChatStore.removeContextItem(guestId);
    } else {
      aiChatStore.addContextItem(guestType, guestId, guest.name, {
        guestName: guest.name,
        name: guest.name,
        type: guest.type === 'qemu' ? 'Virtual Machine' : 'LXC Container',
        vmid: guest.vmid,
        node: guest.node,
        status: guest.status,
      });
    }
  };

  return (
    <div class="space-y-3">
      <ProxmoxSectionNav current="overview" />
@@ -838,6 +946,9 @@ export function Dashboard(props: DashboardProps) {
            setStatusMode={setStatusMode}
            backupMode={backupMode}
            setBackupMode={setBackupMode}
            problemsMode={problemsMode}
            setProblemsMode={setProblemsMode}
            filteredProblemGuests={problemGuests}
            groupingMode={groupingMode}
            setGroupingMode={setGroupingMode}
            setSortKey={setSortKey}
@@ -971,7 +1082,7 @@ export function Dashboard(props: DashboardProps) {
        <ComponentErrorBoundary name="Guest Table">
          <Card padding="none" tone="glass" class="mb-4 overflow-hidden">
            <div class="overflow-x-auto">
              <table class="w-full border-collapse whitespace-nowrap">
              <table class="w-full border-collapse whitespace-nowrap table-fixed">
                <thead>
                  <tr class="bg-gray-50 dark:bg-gray-700/50 text-gray-600 dark:text-gray-300 border-b border-gray-200 dark:border-gray-700">
                    <For each={visibleColumns()}>
@@ -986,6 +1097,7 @@ export function Dashboard(props: DashboardProps) {
                        class={`py-1 text-[11px] sm:text-xs font-medium uppercase tracking-wider whitespace-nowrap
                          ${isFirst() ? 'pl-4 pr-2 text-left' : 'px-2 text-center'}
                          ${isSortable ? 'cursor-pointer hover:bg-gray-200 dark:hover:bg-gray-600' : ''}`}
                        style={col.width ? { width: col.width } : undefined}
                        onClick={() => isSortable && handleSort(sortKeyForCol!)}
                      >
                        {col.label} {isSorted() && (sortDirection() === 'asc' ? '▲' : '▼')}
@@ -1014,13 +1126,20 @@ export function Dashboard(props: DashboardProps) {
                    <NodeGroupHeader node={node!} renderAs="tr" colspan={totalColumns()} />
                  </Show>
                  <For each={guests} fallback={<></>}>
                    {(guest) => {
                    {(guest, index) => {
                      const guestId = guest.id || `${guest.instance}-${guest.vmid}`;
                      const metadata =
                        guestMetadata()[guestId] ||
                        guestMetadata()[`${guest.node}-${guest.vmid}`];
                      const parentNode = node ?? resolveParentNode(guest);
                      const parentNodeOnline = parentNode ? isNodeOnline(parentNode) : true;

                      // Get adjacent guest IDs for merged AI context borders
                      const prevGuest = guests[index() - 1];
                      const nextGuest = guests[index() + 1];
                      const prevGuestId = prevGuest ? (prevGuest.id || `${prevGuest.instance}-${prevGuest.vmid}`) : null;
                      const nextGuestId = nextGuest ? (nextGuest.id || `${nextGuest.instance}-${nextGuest.vmid}`) : null;

                      return (
                        <ComponentErrorBoundary name="GuestRow">
                          <GuestRow
@@ -1033,6 +1152,9 @@ export function Dashboard(props: DashboardProps) {
                            onCustomUrlUpdate={handleCustomUrlUpdate}
                            isGroupedView={groupingMode() === 'grouped'}
                            visibleColumnIds={visibleColumnIds()}
                            aboveGuestId={prevGuestId}
                            belowGuestId={nextGuestId}
                            onRowClick={aiChatStore.isOpen ? handleGuestRowClick : undefined}
                          />
                        </ComponentErrorBoundary>
                      );

@@ -3,7 +3,9 @@ import { Card } from '@/components/shared/Card';
|
||||
import { SearchTipsPopover } from '@/components/shared/SearchTipsPopover';
|
||||
import { MetricsViewToggle } from '@/components/shared/MetricsViewToggle';
|
||||
import { ColumnPicker } from '@/components/shared/ColumnPicker';
|
||||
import { InvestigateProblemsButton } from './InvestigateProblemsButton';
|
||||
import type { ColumnDef } from '@/hooks/useColumnVisibility';
|
||||
import type { VM, Container } from '@/types/api';
|
||||
import { STORAGE_KEYS } from '@/utils/localStorage';
|
||||
import { createSearchHistoryManager } from '@/utils/searchHistory';
|
||||
|
||||
@@ -17,6 +19,10 @@ interface DashboardFilterProps {
|
||||
setStatusMode: (value: 'all' | 'running' | 'degraded' | 'stopped') => void;
|
||||
backupMode: () => 'all' | 'needs-backup';
|
||||
setBackupMode: (value: 'all' | 'needs-backup') => void;
|
||||
problemsMode: () => 'all' | 'problems';
|
||||
setProblemsMode: (value: 'all' | 'problems') => void;
|
||||
/** Guests that match the problems filter - used for AI investigation */
|
||||
filteredProblemGuests?: () => (VM | Container)[];
|
||||
groupingMode: () => 'grouped' | 'flat';
|
||||
setGroupingMode: (value: 'grouped' | 'flat') => void;
|
||||
setSortKey: (value: string) => void;
|
||||
@@ -98,10 +104,10 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
|
||||
|
||||
return (
|
||||
<Card class="dashboard-filter mb-3" padding="sm">
|
||||
<div class="flex flex-col lg:flex-row gap-3">
|
||||
{/* Search Bar */}
|
||||
<div class="flex gap-2 flex-1 items-center">
|
||||
<div class="relative flex-1">
|
||||
<div class="flex flex-col gap-3">
|
||||
{/* Row 1: Search Bar */}
|
||||
<div class="flex gap-2 items-center">
|
||||
<div class="relative flex-1 min-w-0">
|
||||
<input
|
||||
ref={(el) => {
|
||||
searchInputEl = el;
|
||||
@@ -154,27 +160,33 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
|
||||
<Show when={props.search()}>
|
||||
<button
|
||||
type="button"
|
||||
class="absolute right-9 top-1/2 -translate-y-1/2 transform text-gray-400 dark:text-gray-500 hover:text-gray-600 dark:hover:text-gray-300 transition-colors"
|
||||
class="absolute right-[4.5rem] top-1/2 -translate-y-1/2 transform p-1 rounded-full
|
||||
bg-gray-200 dark:bg-gray-600 text-gray-500 dark:text-gray-300
|
||||
hover:bg-red-100 hover:text-red-600 dark:hover:bg-red-900/50 dark:hover:text-red-400
|
||||
transition-all duration-150 active:scale-90"
|
||||
onClick={() => props.setSearch('')}
|
||||
onMouseDown={markSuppressCommit}
|
||||
aria-label="Clear search"
|
||||
title="Clear search"
|
||||
>
|
||||
<svg class="h-4 w-4" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||
<svg class="h-3 w-3" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="3">
|
||||
<path
|
||||
stroke-linecap="round"
|
||||
stroke-linejoin="round"
|
||||
stroke-width="2"
|
||||
d="M6 18L18 6M6 6l12 12"
|
||||
/>
|
||||
</svg>
|
||||
</button>
|
||||
</Show>
|
||||
<div class="absolute inset-y-0 right-2 flex items-center gap-1">
|
||||
<div class="absolute inset-y-0 right-2 flex items-center gap-0.5">
|
||||
<button
|
||||
ref={(el) => (historyToggleRef = el)}
|
||||
type="button"
|
||||
class="flex h-6 w-6 items-center justify-center rounded-lg border border-transparent text-gray-400 transition-colors hover:border-gray-200 hover:text-gray-600 focus:outline-none focus:ring-2 focus:ring-blue-500/40 focus:ring-offset-1 focus:ring-offset-white dark:text-gray-500 dark:hover:border-gray-700 dark:hover:text-gray-200 dark:focus:ring-blue-400/40 dark:focus:ring-offset-gray-900"
|
||||
class={`flex h-6 w-6 items-center justify-center rounded-md transition-colors
|
||||
${isHistoryOpen()
|
||||
? 'bg-blue-100 dark:bg-blue-900/40 text-blue-600 dark:text-blue-400'
|
||||
: 'text-gray-400 dark:text-gray-500 hover:bg-gray-100 dark:hover:bg-gray-700 hover:text-gray-600 dark:hover:text-gray-300'
|
||||
}`}
|
||||
onClick={() =>
|
||||
setIsHistoryOpen((prev) => {
|
||||
const next = !prev;
|
||||
@@ -193,13 +205,9 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
|
||||
: 'No recent searches yet'
|
||||
}
|
||||
>
|
||||
<svg class="h-3.5 w-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor">
|
||||
<path
|
||||
stroke-linecap="round"
|
||||
stroke-linejoin="round"
|
||||
stroke-width="2"
|
||||
d="M12 8v4l2.5 1.5M21 12a9 9 0 11-18 0 9 9 0 0118 0z"
|
||||
/>
|
||||
{/* Dropdown chevron icon - clearer than clock */}
|
||||
<svg class="h-3.5 w-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2">
|
||||
<path stroke-linecap="round" stroke-linejoin="round" d="M19 9l-7 7-7-7" />
|
||||
</svg>
|
||||
<span class="sr-only">Show search history</span>
|
||||
</button>
|
||||
@@ -294,97 +302,141 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{/* Filters */}
|
||||
{/* Row 2: Filters - grouped for logical wrapping */}
|
||||
<div class="flex flex-wrap items-center gap-x-2 gap-y-2">
|
||||
{/* Problems Toggle - Prominent "Show me issues" button */}
|
||||
<button
|
||||
type="button"
|
||||
onClick={() => props.setProblemsMode(props.problemsMode() === 'problems' ? 'all' : 'problems')}
|
||||
class={`inline-flex items-center gap-1.5 px-3 py-1.5 text-xs font-semibold rounded-lg transition-all duration-150 active:scale-95 ${props.problemsMode() === 'problems'
|
||||
? 'bg-gradient-to-r from-red-500 to-orange-500 text-white shadow-md shadow-red-500/25 ring-1 ring-red-400'
|
||||
: 'bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-200 dark:hover:bg-gray-600'
|
||||
}`}
|
||||
title="Show guests that need attention: degraded status, backup issues, or high resource usage (>90%)"
|
||||
>
|
||||
<svg class={`w-3.5 h-3.5 ${props.problemsMode() === 'problems' ? 'animate-pulse' : ''}`} viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2.5" stroke-linecap="round" stroke-linejoin="round">
|
||||
<path d="M10.29 3.86L1.82 18a2 2 0 0 0 1.71 3h16.94a2 2 0 0 0 1.71-3L13.71 3.86a2 2 0 0 0-3.42 0z" />
<line x1="12" y1="9" x2="12" y2="13" />
<line x1="12" y1="17" x2="12.01" y2="17" />
</svg>
Problems
</button>

{/* AI Investigation button - appears when problems mode is active */}
<Show when={props.filteredProblemGuests}>
<InvestigateProblemsButton
problemGuests={props.filteredProblemGuests!()}
isProblemsMode={props.problemsMode() === 'problems'}
/>
</Show>

<div class="h-5 w-px bg-gray-200 dark:bg-gray-600 hidden sm:block"></div>

{/* Primary Filters Group: Type + Status */}
<div class="flex flex-wrap items-center gap-2">
{/* Type Filter */}
<div class="inline-flex rounded-lg bg-gray-100 dark:bg-gray-700 p-0.5">
<button
type="button"
onClick={() => props.setViewMode('all')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.viewMode() === 'all'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
All
</button>
<button
type="button"
onClick={() => props.setViewMode(props.viewMode() === 'vm' ? 'all' : 'vm')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.viewMode() === 'vm'
? 'bg-white dark:bg-gray-800 text-blue-600 dark:text-blue-400 shadow-sm ring-1 ring-blue-200 dark:ring-blue-800'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<rect x="2" y="3" width="20" height="14" rx="2" />
<path d="M8 21h8M12 17v4" />
</svg>
VMs
</button>
<button
type="button"
onClick={() => props.setViewMode(props.viewMode() === 'lxc' ? 'all' : 'lxc')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.viewMode() === 'lxc'
? 'bg-white dark:bg-gray-800 text-green-600 dark:text-green-400 shadow-sm ring-1 ring-green-200 dark:ring-green-800'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<path d="M21 16V8a2 2 0 0 0-1-1.73l-7-4a2 2 0 0 0-2 0l-7 4A2 2 0 0 0 3 8v8a2 2 0 0 0 1 1.73l7 4a2 2 0 0 0 2 0l7-4A2 2 0 0 0 21 16z" />
</svg>
LXCs
</button>
</div>

<div class="h-5 w-px bg-gray-200 dark:bg-gray-600 hidden sm:block"></div>

{/* Status Filter */}
<div class="inline-flex rounded-lg bg-gray-100 dark:bg-gray-700 p-0.5">
<button
type="button"
onClick={() => props.setStatusMode('all')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.statusMode() === 'all'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
All
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'running' ? 'all' : 'running')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.statusMode() === 'running'
? 'bg-white dark:bg-gray-800 text-green-600 dark:text-green-400 shadow-sm ring-1 ring-green-200 dark:ring-green-800'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
<span class={`w-2 h-2 rounded-full ${props.statusMode() === 'running' ? 'bg-green-500 shadow-sm shadow-green-500/50' : 'bg-green-400/60'}`} />
Running
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'degraded' ? 'all' : 'degraded')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.statusMode() === 'degraded'
? 'bg-white dark:bg-gray-800 text-amber-600 dark:text-amber-400 shadow-sm ring-1 ring-amber-200 dark:ring-amber-800'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
<span class={`w-2 h-2 rounded-full ${props.statusMode() === 'degraded' ? 'bg-amber-500 shadow-sm shadow-amber-500/50' : 'bg-amber-400/60'}`} />
Degraded
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'stopped' ? 'all' : 'stopped')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.statusMode() === 'stopped'
? 'bg-white dark:bg-gray-800 text-red-600 dark:text-red-400 shadow-sm ring-1 ring-red-200 dark:ring-red-800'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
>
<span class={`w-2 h-2 rounded-full ${props.statusMode() === 'stopped' ? 'bg-red-500 shadow-sm shadow-red-500/50' : 'bg-red-400/60'}`} />
Stopped
</button>
</div>
</div>

</div>
{/* End Primary Filters Group */}

{/* Secondary Controls Group: Backup, Grouping, View, Columns, Reset */}
<div class="flex flex-wrap items-center gap-2">
{/* Type Filter */}
<div class="inline-flex rounded-lg bg-gray-100 dark:bg-gray-700 p-0.5">
<button
type="button"
onClick={() => props.setViewMode('all')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.viewMode() === 'all'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
All
</button>
<button
type="button"
onClick={() => props.setViewMode(props.viewMode() === 'vm' ? 'all' : 'vm')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.viewMode() === 'vm'
? 'bg-white dark:bg-gray-800 text-blue-600 dark:text-blue-400 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
VMs
</button>
<button
type="button"
onClick={() => props.setViewMode(props.viewMode() === 'lxc' ? 'all' : 'lxc')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.viewMode() === 'lxc'
? 'bg-white dark:bg-gray-800 text-green-600 dark:text-green-400 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
LXCs
</button>
</div>

<div class="h-5 w-px bg-gray-200 dark:bg-gray-600 hidden sm:block"></div>

{/* Status Filter */}
<div class="inline-flex rounded-lg bg-gray-100 dark:bg-gray-700 p-0.5">
<button
type="button"
onClick={() => props.setStatusMode('all')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.statusMode() === 'all'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
All
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'running' ? 'all' : 'running')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.statusMode() === 'running'
? 'bg-white dark:bg-gray-800 text-green-600 dark:text-green-400 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
Running
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'degraded' ? 'all' : 'degraded')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.statusMode() === 'degraded'
? 'bg-white dark:bg-gray-800 text-amber-600 dark:text-amber-400 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
Degraded
</button>
<button
type="button"
onClick={() => props.setStatusMode(props.statusMode() === 'stopped' ? 'all' : 'stopped')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.statusMode() === 'stopped'
? 'bg-white dark:bg-gray-800 text-red-600 dark:text-red-400 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
>
Stopped
</button>
</div>

<div class="h-5 w-px bg-gray-200 dark:bg-gray-600 hidden sm:block"></div>

{/* Backup Filter */}
<button
type="button"
onClick={() => props.setBackupMode(props.backupMode() === 'needs-backup' ? 'all' : 'needs-backup')}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-lg transition-all ${props.backupMode() === 'needs-backup'
? 'bg-orange-100 dark:bg-orange-900/50 text-orange-700 dark:text-orange-300 shadow-sm'
: 'bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
class={`inline-flex items-center gap-1.5 px-2.5 py-1.5 text-xs font-medium rounded-lg transition-all duration-150 active:scale-95 ${props.backupMode() === 'needs-backup'
? 'bg-orange-100 dark:bg-orange-900/50 text-orange-700 dark:text-orange-300 shadow-sm ring-1 ring-orange-200 dark:ring-orange-800'
: 'bg-gray-100 dark:bg-gray-700 text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-200 dark:hover:bg-gray-600'
}`}
title="Show guests with stale or missing backups"
>
@@ -402,23 +454,34 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
<button
type="button"
onClick={() => props.setGroupingMode('grouped')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.groupingMode() === 'grouped'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.groupingMode() === 'grouped'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
title="Group by node"
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<path d="M22 19a2 2 0 0 1-2 2H4a2 2 0 0 1-2-2V5a2 2 0 0 1 2-2h5l2 3h9a2 2 0 0 1 2 2v11z" />
</svg>
Grouped
</button>
<button
type="button"
onClick={() => props.setGroupingMode(props.groupingMode() === 'flat' ? 'grouped' : 'flat')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${props.groupingMode() === 'flat'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${props.groupingMode() === 'flat'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
title="Flat list view"
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<line x1="8" y1="6" x2="21" y2="6" />
<line x1="8" y1="12" x2="21" y2="12" />
<line x1="8" y1="18" x2="21" y2="18" />
<line x1="3" y1="6" x2="3.01" y2="6" />
<line x1="3" y1="12" x2="3.01" y2="12" />
<line x1="3" y1="18" x2="3.01" y2="18" />
</svg>
List
</button>
</div>
@@ -453,15 +516,17 @@ export const DashboardFilter: Component<DashboardFilterProps> = (props) => {
props.setViewMode('all');
props.setStatusMode('all');
props.setBackupMode('all');
props.setProblemsMode('all');
props.setGroupingMode('grouped');
}}
title="Reset all filters"
class={`flex items-center justify-center px-2.5 py-1 text-xs font-medium rounded-lg transition-colors ${props.search() ||
class={`flex items-center justify-center px-2.5 py-1.5 text-xs font-medium rounded-lg transition-all duration-150 active:scale-95 ${props.search() ||
props.viewMode() !== 'all' ||
props.statusMode() !== 'all' ||
props.backupMode() !== 'all' ||
props.problemsMode() !== 'all' ||
props.groupingMode() !== 'grouped'
? 'text-blue-700 dark:text-blue-300 bg-blue-100 dark:bg-blue-900/50 hover:bg-blue-200 dark:hover:bg-blue-900/70'
? 'text-blue-700 dark:text-blue-300 bg-blue-100 dark:bg-blue-900/50 hover:bg-blue-200 dark:hover:bg-blue-900/70 ring-1 ring-blue-200 dark:ring-blue-800'
: 'text-gray-600 dark:text-gray-400 bg-gray-100 dark:bg-gray-700 hover:bg-gray-200 dark:hover:bg-gray-600'
}`}
>

@@ -49,7 +49,7 @@ export function EnhancedCPUBar(props: EnhancedCPUBarProps) {
fallback={
<div ref={containerRef} class="metric-text w-full h-4 flex items-center justify-center">
<div
class="relative w-full max-w-[140px] h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
class="relative w-full h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
onMouseEnter={handleMouseEnter}
onMouseLeave={handleMouseLeave}
>
@@ -112,19 +112,21 @@ export function EnhancedCPUBar(props: EnhancedCPUBarProps) {
</div>
}
>
{/* Sparkline mode */}
<div class="metric-text w-full h-6 flex items-center gap-1.5">
<div class="flex-1 min-w-0">
<Sparkline
data={metricHistory()}
metric="cpu"
width={0}
height={24}
/>
{/* Sparkline mode - scales to fill column width, matching bar mode */}
<div class="metric-text w-full h-6 flex items-center justify-center">
<div class="flex items-center gap-1.5 w-full">
<div class="flex-1 min-w-0 h-6">
<Sparkline
data={metricHistory()}
metric="cpu"
width={0}
height={24}
/>
</div>
<span class="text-[10px] font-medium text-gray-800 dark:text-gray-100 whitespace-nowrap flex-shrink-0 w-[35px] text-right">
{formatPercent(props.usage)}
</span>
</div>
<span class="text-[10px] font-medium text-gray-800 dark:text-gray-100 whitespace-nowrap flex-shrink-0 min-w-[35px]">
{formatPercent(props.usage)}
</span>
</div>
</Show>
);

File diff suppressed because it is too large
@@ -0,0 +1,154 @@
import { Component, Show, createMemo } from 'solid-js';
import { aiChatStore } from '@/stores/aiChat';
import type { VM, Container } from '@/types/api';
import { getBackupInfo } from '@/utils/format';
import { DEGRADED_HEALTH_STATUSES, OFFLINE_HEALTH_STATUSES } from '@/utils/status';

interface ProblemGuest {
guest: VM | Container;
issues: string[];
}

interface InvestigateProblemsButtonProps {
/** The filtered guests currently showing as "problems" */
problemGuests: (VM | Container)[];
/** Whether the problems filter is active */
isProblemsMode: boolean;
}

/**
 * "Investigate Problems with AI" button.
 * Appears when the Problems filter is active and finds results.
 * Opens AI chat with rich context about all problem guests.
 */
export const InvestigateProblemsButton: Component<InvestigateProblemsButtonProps> = (props) => {
// Analyze each guest to determine what issues they have
const analyzedProblems = createMemo((): ProblemGuest[] => {
return props.problemGuests.map(guest => {
const issues: string[] = [];

// Check for degraded status
const status = (guest.status || '').toLowerCase();
const isDegraded = DEGRADED_HEALTH_STATUSES.has(status) ||
(status !== 'running' && !OFFLINE_HEALTH_STATUSES.has(status) && status !== 'stopped');
if (isDegraded) {
issues.push(`Status: ${status || 'unknown'}`);
}

// Check for backup issues
if (!guest.template) {
const backupInfo = getBackupInfo(guest.lastBackup);
if (backupInfo.status === 'critical') {
issues.push('Backup: Critical (very overdue)');
} else if (backupInfo.status === 'stale') {
issues.push('Backup: Stale');
} else if (backupInfo.status === 'never') {
issues.push('Backup: Never backed up');
}
}

// Check for high CPU
if (guest.cpu > 0.9) {
issues.push(`CPU: ${(guest.cpu * 100).toFixed(0)}%`);
}

// Check for high memory
if (guest.memory && guest.memory.usage && guest.memory.usage > 90) {
issues.push(`Memory: ${guest.memory.usage.toFixed(0)}%`);
}

return { guest, issues };
}).filter(p => p.issues.length > 0);
});

const handleClick = (e: MouseEvent) => {
e.stopPropagation();
e.preventDefault();

const problems = analyzedProblems();
if (problems.length === 0) return;

// Build a rich prompt with all problem details
const problemSummary = problems.map(({ guest, issues }) => {
const type = guest.type === 'qemu' ? 'VM' : 'LXC';
return `- **${guest.name}** (${type} ${guest.vmid} on ${guest.node}): ${issues.join(', ')}`;
}).join('\n');

// Categorize issues for better AI understanding
const issueCategories = {
backup: problems.filter(p => p.issues.some(i => i.startsWith('Backup:'))).length,
cpu: problems.filter(p => p.issues.some(i => i.startsWith('CPU:'))).length,
memory: problems.filter(p => p.issues.some(i => i.startsWith('Memory:'))).length,
status: problems.filter(p => p.issues.some(i => i.startsWith('Status:'))).length,
};

const categoryBreakdown = Object.entries(issueCategories)
.filter(([, count]) => count > 0)
.map(([type, count]) => `${count} ${type}`)
.join(', ');

const prompt = `I have ${problems.length} guest${problems.length !== 1 ? 's' : ''} that need attention (${categoryBreakdown}):

${problemSummary}

Please help me:
1. **Prioritize** - Which issues should I address first?
2. **Investigate** - For the most critical issues, check their current status
3. **Remediate** - Suggest specific steps to resolve each issue
4. **Prevent** - Recommend any configuration changes to prevent these issues

Start with the most critical problems first.`;

// Open AI chat with this rich context
aiChatStore.openWithPrompt(prompt, {
context: {
problemCount: problems.length,
issueCategories,
guests: problems.map(p => ({
id: p.guest.id,
name: p.guest.name,
vmid: p.guest.vmid,
type: p.guest.type,
node: p.guest.node,
issues: p.issues,
})),
},
});
};

// Only show if problems mode is active AND there are results
const shouldShow = createMemo(() =>
props.isProblemsMode && props.problemGuests.length > 0
);

return (
<Show when={shouldShow()}>
<button
type="button"
onClick={handleClick}
class="inline-flex items-center gap-1.5 px-3 py-1.5 text-xs font-semibold rounded-lg
bg-gradient-to-r from-purple-500 to-blue-500
hover:from-purple-600 hover:to-blue-600
text-white shadow-md shadow-purple-500/25
hover:shadow-lg hover:shadow-purple-500/30
transition-all duration-150 active:scale-95
ring-1 ring-purple-400/50"
title={`Ask AI to help investigate and resolve ${props.problemGuests.length} problem${props.problemGuests.length !== 1 ? 's' : ''}`}
>
<svg class="w-3.5 h-3.5" fill="none" viewBox="0 0 24 24" stroke="currentColor" stroke-width="2">
<path
stroke-linecap="round"
stroke-linejoin="round"
d="M9.663 17h4.673M12 3v1m6.364 1.636l-.707.707M21 12h-1M4 12H3m3.343-5.657l-.707-.707m2.828 9.9a5 5 0 117.072 0l-.548.547A3.374 3.374 0 0014 18.469V19a2 2 0 11-4 0v-.531c0-.895-.356-1.754-.988-2.386l-.548-.547z"
/>
</svg>
<span>Investigate {props.problemGuests.length} with AI</span>
<svg class="w-3 h-3 opacity-70" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<path stroke-linecap="round" stroke-linejoin="round" d="M13 7l5 5m0 0l-5 5m5-5H6" />
</svg>
</button>
</Show>
);
};

export default InvestigateProblemsButton;
@@ -102,9 +102,9 @@ export function MetricBar(props: MetricBarProps) {
<Show
when={viewMode() === 'sparklines' && props.resourceId}
fallback={
// Original progress bar mode - capped width for better appearance on wide screens
// Progress bar mode - scales to fill column width
<div ref={containerRef} class="metric-text w-full h-4 flex items-center justify-center">
<div class={`relative w-full max-w-[140px] h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded ${props.class || ''}`}>
<div class={`relative w-full h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded ${props.class || ''}`}>
<div class={`absolute top-0 left-0 h-full ${progressColorClass()}`} style={{ width: `${width()}%` }} />
<span class="absolute inset-0 flex items-center justify-center text-[10px] font-semibold text-gray-700 dark:text-gray-100 leading-none">
<span class="flex items-center gap-1 whitespace-nowrap px-0.5">
@@ -120,19 +120,21 @@ export function MetricBar(props: MetricBarProps) {
</div>
}
>
{/* Sparkline mode */}
<div class="metric-text w-full h-6 flex items-center gap-1.5">
<div class="flex-1 min-w-0">
<Sparkline
data={metricHistory()}
metric={sparklineMetric()}
width={0}
height={24}
/>
{/* Sparkline mode - scales to fill column width, matching bar mode sizing */}
<div class="metric-text w-full h-6 flex items-center justify-center">
<div class="flex items-center gap-1.5 w-full">
<div class="flex-1 min-w-0 h-6">
<Sparkline
data={metricHistory()}
metric={sparklineMetric()}
width={0}
height={24}
/>
</div>
<span class="text-[10px] font-medium text-gray-800 dark:text-gray-100 whitespace-nowrap flex-shrink-0 w-[35px] text-right">
{props.label}
</span>
</div>
<span class="text-[10px] font-medium text-gray-800 dark:text-gray-100 whitespace-nowrap flex-shrink-0 min-w-[35px]">
{props.label}
</span>
</div>
</Show>
);

@@ -192,7 +192,7 @@ export function StackedDiskBar(props: StackedDiskBarProps) {
return (
<div ref={containerRef} class="metric-text w-full h-4 flex items-center justify-center">
<div
class="relative w-full max-w-[140px] h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded"
class="relative w-full h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded"
onMouseEnter={handleMouseEnter}
onMouseLeave={handleMouseLeave}
>

@@ -123,7 +123,7 @@ export function StackedMemoryBar(props: StackedMemoryBarProps) {
return (
<div ref={containerRef} class="metric-text w-full h-4 flex items-center justify-center">
<div
class="relative w-full max-w-[140px] h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
class="relative w-full h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
onMouseEnter={handleMouseEnter}
onMouseLeave={handleMouseLeave}
>

@@ -48,7 +48,7 @@ export function StackedContainerBar(props: StackedContainerBarProps) {
return (
<div ref={containerRef} class="metric-text w-full h-4 flex items-center justify-center">
<div
class="relative w-full max-w-[140px] h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
class="relative w-full h-full overflow-hidden bg-gray-200 dark:bg-gray-600 rounded cursor-help"
onMouseEnter={handleMouseEnter}
onMouseLeave={handleMouseLeave}
>

@@ -8,6 +8,7 @@ const FirstRunSetup = lazy(() =>

interface LoginProps {
onLogin: () => void;
hasAuth?: boolean; // If true, auth is configured (passed from App.tsx to skip redundant check)
}

interface SecurityStatus {
@@ -30,7 +31,8 @@ export const Login: Component<LoginProps> = (props) => {
const [error, setError] = createSignal('');
const [loading, setLoading] = createSignal(false);
const [authStatus, setAuthStatus] = createSignal<SecurityStatus | null>(null);
const [loadingAuth, setLoadingAuth] = createSignal(true);
// If hasAuth is passed from App.tsx, we already know auth status - skip the loading state
const [loadingAuth, setLoadingAuth] = createSignal(props.hasAuth === undefined);
const [oidcLoading, setOidcLoading] = createSignal(false);
const [oidcError, setOidcError] = createSignal('');
const [oidcMessage, setOidcMessage] = createSignal('');
@@ -96,6 +98,15 @@ export const Login: Component<LoginProps> = (props) => {
window.history.replaceState({}, document.title, newUrl);
}

// If hasAuth was passed from App.tsx, use it directly without making another API call
// This eliminates the flicker between "Checking authentication..." and the login form
if (props.hasAuth !== undefined) {
logger.debug('[Login] Using hasAuth from App.tsx, skipping redundant auth check');
setAuthStatus({ hasAuthentication: props.hasAuth });
setLoadingAuth(false);
return;
}

logger.debug('[Login] Starting auth check...');
try {
const response = await fetch('/api/security/status');

@@ -10,7 +10,7 @@ import { AIAPI } from '@/api/ai';
import type { AISettings as AISettingsType, AIProvider } from '@/types/ai';
import { PROVIDER_NAMES, PROVIDER_DESCRIPTIONS, DEFAULT_MODELS } from '@/types/ai';

const PROVIDERS: AIProvider[] = ['anthropic', 'openai', 'ollama'];
const PROVIDERS: AIProvider[] = ['anthropic', 'openai', 'ollama', 'deepseek'];

export const AISettings: Component = () => {
const [settings, setSettings] = createSignal<AISettingsType | null>(null);
@@ -146,7 +146,7 @@ export const AISettings: Component = () => {
};

const needsApiKey = () => form.provider !== 'ollama';
const showBaseUrl = () => form.provider === 'ollama' || form.provider === 'openai';
const showBaseUrl = () => form.provider === 'ollama' || form.provider === 'openai' || form.provider === 'deepseek';

return (
<Card
@@ -220,11 +220,10 @@ export const AISettings: Component = () => {
{(provider) => (
<button
type="button"
class={`p-3 rounded-lg border-2 text-left transition-all ${
form.provider === provider
? 'border-purple-500 bg-purple-50 dark:bg-purple-900/30'
: 'border-gray-200 dark:border-gray-700 hover:border-gray-300 dark:hover:border-gray-600'
}`}
class={`p-3 rounded-lg border-2 text-left transition-all ${form.provider === provider
? 'border-purple-500 bg-purple-50 dark:bg-purple-900/30'
: 'border-gray-200 dark:border-gray-700 hover:border-gray-300 dark:hover:border-gray-600'
}`}
onClick={() => handleProviderChange(provider)}
disabled={saving()}
>
@@ -282,7 +281,9 @@ export const AISettings: Component = () => {
<p class={formHelpText}>
{form.provider === 'anthropic'
? 'Get your API key from console.anthropic.com'
: 'Get your API key from platform.openai.com'}
: form.provider === 'deepseek'
? 'Get your API key from platform.deepseek.com'
: 'Get your API key from platform.openai.com'}
</p>
</div>
</Show>
@@ -303,7 +304,9 @@ export const AISettings: Component = () => {
? 'e.g., claude-opus-4-5-20251101, claude-sonnet-4-20250514'
: form.provider === 'openai'
? 'e.g., gpt-4o, gpt-4-turbo'
: 'e.g., llama3, mixtral, codellama'}
: form.provider === 'deepseek'
? 'e.g., deepseek-chat, deepseek-coder'
: 'e.g., llama3, mixtral, codellama'}
</p>
</div>

@@ -320,7 +323,9 @@ export const AISettings: Component = () => {
placeholder={
form.provider === 'ollama'
? 'http://localhost:11434'
: 'https://api.openai.com/v1'
: form.provider === 'deepseek'
? 'https://api.deepseek.com/chat/completions'
: 'https://api.openai.com/v1'
}
class={controlClass()}
disabled={saving()}
@@ -328,7 +333,9 @@ export const AISettings: Component = () => {
<p class={formHelpText}>
{form.provider === 'ollama'
? 'URL where your Ollama server is running'
: 'Custom endpoint for Azure OpenAI or compatible APIs'}
: form.provider === 'deepseek'
? 'Custom endpoint (leave blank for default DeepSeek API)'
: 'Custom endpoint for Azure OpenAI or compatible APIs'}
</p>
</div>
</Show>
@@ -364,16 +371,14 @@ export const AISettings: Component = () => {
{/* Status indicator */}
<Show when={settings()}>
<div
class={`flex items-center gap-2 p-3 rounded-lg ${
settings()?.configured
? 'bg-green-50 dark:bg-green-900/30 text-green-800 dark:text-green-200'
: 'bg-amber-50 dark:bg-amber-900/30 text-amber-800 dark:text-amber-200'
}`}
class={`flex items-center gap-2 p-3 rounded-lg ${settings()?.configured
? 'bg-green-50 dark:bg-green-900/30 text-green-800 dark:text-green-200'
: 'bg-amber-50 dark:bg-amber-900/30 text-amber-800 dark:text-amber-200'
}`}
>
<div
class={`w-2 h-2 rounded-full ${
settings()?.configured ? 'bg-green-500' : 'bg-amber-500'
}`}
class={`w-2 h-2 rounded-full ${settings()?.configured ? 'bg-green-500' : 'bg-amber-500'
}`}
/>
<span class="text-xs font-medium">
{settings()?.configured

@@ -16,25 +16,32 @@ export const MetricsViewToggle: Component = () => {
<button
type="button"
onClick={() => setViewMode('bars')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${
viewMode() === 'bars'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${viewMode() === 'bars'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
title="Bar view"
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<rect x="3" y="12" width="4" height="9" rx="1" />
<rect x="10" y="6" width="4" height="15" rx="1" />
<rect x="17" y="3" width="4" height="18" rx="1" />
</svg>
Bars
</button>
<button
type="button"
onClick={() => setViewMode('sparklines')}
class={`px-2.5 py-1 text-xs font-medium rounded-md transition-all ${
viewMode() === 'sparklines'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100'
}`}
class={`inline-flex items-center gap-1.5 px-2.5 py-1 text-xs font-medium rounded-md transition-all duration-150 active:scale-95 ${viewMode() === 'sparklines'
? 'bg-white dark:bg-gray-800 text-gray-900 dark:text-gray-100 shadow-sm ring-1 ring-gray-200 dark:ring-gray-600'
: 'text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 hover:bg-gray-50 dark:hover:bg-gray-600/50'
}`}
title="Sparkline view"
>
<svg class="w-3 h-3" viewBox="0 0 24 24" fill="none" stroke="currentColor" stroke-width="2">
<polyline points="3 17 9 11 13 15 21 7" />
<polyline points="17 7 21 7 21 11" />
</svg>
Trends
</button>
</div>
@@ -15,6 +15,7 @@ import { StackedMemoryBar } from '@/components/Dashboard/StackedMemoryBar';
import { EnhancedCPUBar } from '@/components/Dashboard/EnhancedCPUBar';
import { TemperatureGauge } from '@/components/shared/TemperatureGauge';
import { useBreakpoint } from '@/hooks/useBreakpoint';
import { useMetricsViewMode } from '@/stores/metricsViewMode';

interface NodeSummaryTableProps {
nodes: Node[];
@@ -34,6 +35,7 @@ export const NodeSummaryTable: Component<NodeSummaryTableProps> = (props) => {
const alertsActivation = useAlertsActivation();
const alertsEnabled = createMemo(() => alertsActivation.activationState() === 'active');
const { isMobile } = useBreakpoint();
const { viewMode } = useMetricsViewMode();

const isTemperatureMonitoringEnabled = (node: Node): boolean => {
const globalEnabled = props.globalTemperatureMonitoringEnabled ?? true;
@@ -340,7 +342,7 @@ export const NodeSummaryTable: Component<NodeSummaryTableProps> = (props) => {
return (
<Card padding="none" tone="glass" class="mb-4 overflow-hidden">
<div class="overflow-x-auto">
<table class="w-full border-collapse whitespace-nowrap">
<table class="w-full border-collapse whitespace-nowrap table-fixed">
<thead>
<tr class="bg-gray-50 dark:bg-gray-700/50 text-gray-600 dark:text-gray-300 border-b border-gray-200 dark:border-gray-700">
<th
@@ -349,41 +351,41 @@ export const NodeSummaryTable: Component<NodeSummaryTableProps> = (props) => {
>
{props.currentTab === 'backups' ? 'Node / PBS' : 'Node'} {renderSortIndicator('name')}
</th>
<th class={thClass} onClick={() => handleSort('uptime')}>
<th class={thClass} style={{ width: '80px' }} onClick={() => handleSort('uptime')}>
Uptime {renderSortIndicator('uptime')}
</th>
<th class={thClass} onClick={() => handleSort('cpu')}>
<th class={thClass} style={{ width: '18%' }} onClick={() => handleSort('cpu')}>
CPU {renderSortIndicator('cpu')}
</th>
<th class={thClass} onClick={() => handleSort('memory')}>
<th class={thClass} style={{ width: '18%' }} onClick={() => handleSort('memory')}>
Memory {renderSortIndicator('memory')}
</th>
<th class={thClass} onClick={() => handleSort('disk')}>
<th class={thClass} style={{ width: '18%' }} onClick={() => handleSort('disk')}>
Disk {renderSortIndicator('disk')}
</th>
<Show when={hasAnyTemperatureData()}>
<th class={thClass} onClick={() => handleSort('temperature')}>
<th class={thClass} style={{ width: '60px' }} onClick={() => handleSort('temperature')}>
Temp {renderSortIndicator('temperature')}
</th>
</Show>
<Show when={props.currentTab === 'dashboard'}>
<th class={thClass} onClick={() => handleSort('vmCount')}>
<th class={thClass} style={{ width: '50px' }} onClick={() => handleSort('vmCount')}>
VMs {renderSortIndicator('vmCount')}
</th>
<th class={thClass} onClick={() => handleSort('containerCount')}>
<th class={thClass} style={{ width: '50px' }} onClick={() => handleSort('containerCount')}>
CTs {renderSortIndicator('containerCount')}
</th>
</Show>
<Show when={props.currentTab === 'storage'}>
<th class={thClass} onClick={() => handleSort('storageCount')}>
<th class={thClass} style={{ width: '70px' }} onClick={() => handleSort('storageCount')}>
Storage {renderSortIndicator('storageCount')}
</th>
<th class={thClass} onClick={() => handleSort('diskCount')}>
<th class={thClass} style={{ width: '60px' }} onClick={() => handleSort('diskCount')}>
Disks {renderSortIndicator('diskCount')}
</th>
</Show>
<Show when={props.currentTab === 'backups'}>
<th class={thClass} onClick={() => handleSort('backupCount')}>
<th class={thClass} style={{ width: '70px' }} onClick={() => handleSort('backupCount')}>
Backups {renderSortIndicator('backupCount')}
</th>
</Show>
@@ -604,13 +606,26 @@ export const NodeSummaryTable: Component<NodeSummaryTableProps> = (props) => {
showMobile={false}
/>
}>
<StackedMemoryBar
used={node!.memory?.used || 0}
total={node!.memory?.total || 0}
balloon={node!.memory?.balloon || 0}
swapUsed={node!.memory?.swapUsed || 0}
swapTotal={node!.memory?.swapTotal || 0}
/>
<Show
when={viewMode() === 'sparklines'}
fallback={
<StackedMemoryBar
used={node!.memory?.used || 0}
total={node!.memory?.total || 0}
balloon={node!.memory?.balloon || 0}
swapUsed={node!.memory?.swapUsed || 0}
swapTotal={node!.memory?.swapTotal || 0}
/>
}
>
<ResponsiveMetricCell
value={memoryPercentValue}
type="memory"
resourceId={metricsKey}
isRunning={online}
showMobile={false}
/>
</Show>
</Show>
</div>
</td>
@@ -95,7 +95,11 @@ export const Sparkline: Component<SparklineProps> = (props) => {
const dpr = window.devicePixelRatio || 1;
canvas.width = w * dpr;
canvas.height = h * dpr;
canvas.style.width = `${w}px`;
// Only set explicit width style if a fixed width was provided
// Otherwise let CSS handle the width (w-full class)
if (props.width !== 0) {
canvas.style.width = `${w}px`;
}
canvas.style.height = `${h}px`;
ctx.scale(dpr, dpr);

@@ -257,14 +261,12 @@ export const Sparkline: Component<SparklineProps> = (props) => {

return (
<>
<div class="relative block w-full">
<div class="relative block w-full" style={{ height: `${height()}px` }}>
<canvas
ref={canvasRef}
class="block cursor-crosshair transition-opacity duration-150"
class="block cursor-crosshair w-full"
style={{
width: `${width()}px`,
height: `${height()}px`,
opacity: hoveredPoint() ? '1' : '0.7',
}}
onMouseMove={handleMouseMove}
onMouseLeave={handleMouseLeave}
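The hunk above sizes the canvas bitmap by `devicePixelRatio` while leaving the CSS width to the layout (`w-full`) unless a fixed width was passed. A minimal sketch of that sizing math — the helper name `backingStoreSize` is mine, not from the diff:

```typescript
// Backing-store sizing for crisp HiDPI canvas rendering: the bitmap is
// scaled by devicePixelRatio, while the CSS size stays at logical pixels.
// cssWidth === 0 stands for "no fixed width given" (mirrors `props.width !== 0`).
interface BackingStore {
  bitmapWidth: number;
  bitmapHeight: number;
  styleWidth: string | null; // null → let CSS (e.g. w-full) decide
  styleHeight: string;
}

function backingStoreSize(cssWidth: number, cssHeight: number, dpr: number): BackingStore {
  return {
    bitmapWidth: cssWidth * dpr,
    bitmapHeight: cssHeight * dpr,
    styleWidth: cssWidth !== 0 ? `${cssWidth}px` : null,
    styleHeight: `${cssHeight}px`,
  };
}
```

After applying these dimensions, the drawing context is scaled by `ctx.scale(dpr, dpr)` so drawing code can keep using logical coordinates.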
@@ -7,6 +7,7 @@ export interface ColumnDef {
label: string;
priority: ColumnPriority;
toggleable?: boolean;
width?: string; // Fixed width for consistent column sizing
minWidth?: string;
maxWidth?: string;
flex?: number;
File diff suppressed because it is too large
@@ -6,6 +6,8 @@
*/

import { logger } from '@/utils/logger';
import { ChartsAPI, type ChartData, type TimeRange } from '@/api/charts';
import { buildMetricKey } from '@/utils/metricsKeys';

export interface MetricSnapshot {
timestamp: number; // Unix timestamp in ms
@@ -218,6 +220,184 @@ function debouncedSave() {
}, 5000); // Save 5 seconds after last change
}

// Track if we've already seeded from backend to avoid redundant fetches
let hasSeededFromBackend = false;
let seedingPromise: Promise<void> | null = null;

/**
* Seed metrics history from backend historical data.
* This provides immediate trend data instead of waiting for 30s samples.
* Called automatically when switching to sparklines/trends view.
*/
export async function seedFromBackend(range: TimeRange = '1h'): Promise<void> {
// Don't re-fetch if we've already seeded
if (hasSeededFromBackend) {
return;
}

// If already seeding, wait for that request
if (seedingPromise) {
return seedingPromise;
}

seedingPromise = (async () => {
try {
logger.info('[MetricsHistory] Seeding from backend', { range });
const response = await ChartsAPI.getCharts(range);

// Get current state to determine guest types
// Import dynamically to avoid circular dependency
const { getGlobalWebSocketStore } = await import('./websocket-global');
const wsStore = getGlobalWebSocketStore();

// Wait a bit for WebSocket state to populate if it's empty
let state = wsStore?.state;
if (!state?.vms?.length && !state?.containers?.length) {
// Wait up to 2 seconds for state to populate
for (let i = 0; i < 4; i++) {
await new Promise(resolve => setTimeout(resolve, 500));
state = wsStore?.state;
if (state?.vms?.length || state?.containers?.length) break;
}
}

const now = Date.now();
const cutoff = now - MAX_AGE_MS;
let seededCount = 0;

// Helper to convert backend ChartData to our MetricSnapshot format
const processChartData = (resourceId: string, chartData: ChartData) => {
const cpuPoints = chartData.cpu || [];
const memPoints = chartData.memory || [];
const diskPoints = chartData.disk || [];

// If no data, skip
if (cpuPoints.length === 0 && memPoints.length === 0 && diskPoints.length === 0) {
return;
}

// Get or create ring buffer
let ring = metricsHistoryMap.get(resourceId);
if (!ring) {
ring = createRingBuffer();
metricsHistoryMap.set(resourceId, ring);
}

// Find all unique timestamps across all metrics
const timestampSet = new Set<number>();
cpuPoints.forEach(p => timestampSet.add(p.timestamp));
memPoints.forEach(p => timestampSet.add(p.timestamp));
diskPoints.forEach(p => timestampSet.add(p.timestamp));

// Create lookup maps for efficient access
const cpuMap = new Map(cpuPoints.map(p => [p.timestamp, p.value]));
const memMap = new Map(memPoints.map(p => [p.timestamp, p.value]));
const diskMap = new Map(diskPoints.map(p => [p.timestamp, p.value]));

// Sort timestamps and create snapshots
const timestamps = Array.from(timestampSet).sort((a, b) => a - b);

for (const ts of timestamps) {
// Skip if too old
if (ts < cutoff) continue;

// Skip if we already have data around this timestamp (within 15s)
let skipDuplicate = false;
for (let i = 0; i < ring.size; i++) {
const idx = (ring.head + i) % MAX_POINTS;
const existing = ring.buffer[idx];
if (existing && Math.abs(existing.timestamp - ts) < 15000) {
skipDuplicate = true;
break;
}
}
if (skipDuplicate) continue;

const snapshot: MetricSnapshot = {
timestamp: ts,
cpu: Math.round((cpuMap.get(ts) ?? 0) * 10) / 10,
memory: Math.round((memMap.get(ts) ?? 0) * 10) / 10,
disk: Math.round((diskMap.get(ts) ?? 0) * 10) / 10,
};

pushToRingBuffer(ring, snapshot);
seededCount++;
}
};
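The seeding loop above drops any backend point that lands within 15 s of a sample already held locally, so seeded history and live samples never double up. A standalone sketch of that dedup-merge over plain timestamp arrays (names are illustrative; the real code walks a ring buffer):

```typescript
// True if `ts` falls within `windowMs` of any existing timestamp.
// The production code scans a ring buffer; a linear scan shows the same idea.
function isNearDuplicate(existing: number[], ts: number, windowMs = 15000): boolean {
  return existing.some((t) => Math.abs(t - ts) < windowMs);
}

// Merge backend points into local samples, oldest first, skipping near-dupes.
function mergeSeedPoints(local: number[], backend: number[], windowMs = 15000): number[] {
  const out = [...local];
  for (const ts of [...backend].sort((a, b) => a - b)) {
    if (!isNearDuplicate(out, ts, windowMs)) out.push(ts);
  }
  return out.sort((a, b) => a - b);
}
```

The 15 s window is half the 30 s client sampling interval, so a seeded point can never sit between two live samples and create a visual double-step in the sparkline.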
// Process VMs and containers
if (response.data) {
// Build a map from ID -> type using WebSocket state
const guestTypeMap = new Map<string, 'vm' | 'container'>();
if (state?.vms) {
for (const vm of state.vms) {
if (vm.id) guestTypeMap.set(vm.id, 'vm');
}
}
if (state?.containers) {
for (const ct of state.containers) {
if (ct.id) guestTypeMap.set(ct.id, 'container');
}
}

const backendIds = Object.keys(response.data);
const stateIds = Array.from(guestTypeMap.keys());

// Debug: Find IDs in backend but not in state
const missingInState = backendIds.filter(id => !guestTypeMap.has(id));

console.log('[SPARKLINE DEBUG] Backend chart IDs:', backendIds);
console.log('[SPARKLINE DEBUG] State guest IDs:', stateIds);
console.log('[SPARKLINE DEBUG] Missing in state (will be wrong type):', missingInState);

for (const [id, chartData] of Object.entries(response.data)) {
// Look up the guest type from state, default to 'vm' if unknown
const guestType = guestTypeMap.get(id) ?? 'vm';
const resourceKey = buildMetricKey(guestType, id);
processChartData(resourceKey, chartData as ChartData);
}
}

// Process nodes
if (response.nodeData) {
for (const [id, chartData] of Object.entries(response.nodeData)) {
const resourceKey = buildMetricKey('node', id);
processChartData(resourceKey, chartData as ChartData);
}
}


hasSeededFromBackend = true;
logger.info('[MetricsHistory] Seeded from backend', { seededCount, totalResources: metricsHistoryMap.size });

// Save to localStorage
saveToLocalStorage();
} catch (error) {
logger.error('[MetricsHistory] Failed to seed from backend', { error });
// Don't throw - gracefully degrade to client-side sampling
} finally {
seedingPromise = null;
}
})();

return seedingPromise;
}

/**
* Force re-seed from backend (useful when range changes)
*/
export function resetSeedingState(): void {
hasSeededFromBackend = false;
}

/**
* Check if we have seeded from backend
*/
export function hasSeedData(): boolean {
return hasSeededFromBackend;
}


/**
* Get metric history for a resource
*/
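The loop above resolves each backend chart ID to a typed metrics key via the WebSocket state, falling back to `'vm'` when the guest is unknown. A reduced sketch of that mapping — here `buildMetricKey` is assumed to just join type and ID, which may differ from the real helper in `@/utils/metricsKeys`:

```typescript
type GuestType = 'vm' | 'container' | 'node';

// Assumed shape of the key builder; the real one lives in @/utils/metricsKeys.
function buildMetricKey(type: GuestType, id: string): string {
  return `${type}:${id}`;
}

// Resolve backend chart IDs to metrics keys using the guest-type map
// derived from WebSocket state. Unknown IDs default to 'vm', matching
// the `?? 'vm'` fallback above (such guests get the wrong type until
// state catches up, which is what the debug logging tracks).
function resolveMetricKeys(
  backendIds: string[],
  guestTypeMap: Map<string, 'vm' | 'container'>,
): string[] {
  return backendIds.map((id) => buildMetricKey(guestTypeMap.get(id) ?? 'vm', id));
}
```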
@@ -7,6 +7,7 @@

import { createSignal } from 'solid-js';
import { STORAGE_KEYS } from '@/utils/localStorage';
import { seedFromBackend } from './metricsHistory';

export type MetricsViewMode = 'bars' | 'sparklines';

@@ -52,6 +53,14 @@ export function setMetricsViewModePreference(mode: MetricsViewMode): void {
console.warn('Failed to save metrics view mode preference', err);
}
}

// When switching to sparklines, seed historical data from backend
if (mode === 'sparklines') {
// Fire and forget - don't block the UI
seedFromBackend('1h').catch(() => {
// Errors are already logged in seedFromBackend
});
}
}

/**
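Together, the `hasSeededFromBackend` flag and `seedingPromise` in `seedFromBackend` implement a single-flight, seed-once guard: concurrent callers share one in-flight request, later callers return immediately, and a failure clears the guard so a retry is possible. A generic sketch of that pattern (not the actual store code):

```typescript
// Wrap an async loader so it runs at most once: concurrent callers await
// the same in-flight promise; after success the cached promise is reused;
// on failure the guard resets so a later call can retry.
function singleFlightOnce<T>(load: () => Promise<T>): () => Promise<T> {
  let inflight: Promise<T> | null = null;
  return () => {
    if (inflight) return inflight; // already seeded, or seeding in progress
    inflight = load().catch((e) => {
      inflight = null; // failure: allow a later retry
      throw e;
    });
    return inflight;
  };
}
```

Calling the wrapped loader from the view-mode toggle is then safe even if the user flips between bars and trends rapidly: the backend is hit once.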
@@ -210,3 +210,179 @@
.node-click {
animation: nodeClick 0.15s ease-out;
}

/* AI Context row highlight with mergeable borders */
/* Using purple-600 (147, 51, 234) for a true purple */

/* Full border - single row or first/last of group */
@keyframes ai-context-pulse-full {
0%, 100% {
background-color: rgba(147, 51, 234, 0.08);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.4),
inset -2px 0 0 0 rgba(147, 51, 234, 0.4),
inset 0 2px 0 0 rgba(147, 51, 234, 0.4),
inset 0 -2px 0 0 rgba(147, 51, 234, 0.4);
}
50% {
background-color: rgba(147, 51, 234, 0.16);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.8),
inset -2px 0 0 0 rgba(147, 51, 234, 0.8),
inset 0 2px 0 0 rgba(147, 51, 234, 0.8),
inset 0 -2px 0 0 rgba(147, 51, 234, 0.8);
}
}

/* No top border - middle or bottom of group */
@keyframes ai-context-pulse-no-top {
0%, 100% {
background-color: rgba(147, 51, 234, 0.08);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.4),
inset -2px 0 0 0 rgba(147, 51, 234, 0.4),
inset 0 -2px 0 0 rgba(147, 51, 234, 0.4);
}
50% {
background-color: rgba(147, 51, 234, 0.16);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.8),
inset -2px 0 0 0 rgba(147, 51, 234, 0.8),
inset 0 -2px 0 0 rgba(147, 51, 234, 0.8);
}
}

/* No bottom border - top of group */
@keyframes ai-context-pulse-no-bottom {
0%, 100% {
background-color: rgba(147, 51, 234, 0.08);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.4),
inset -2px 0 0 0 rgba(147, 51, 234, 0.4),
inset 0 2px 0 0 rgba(147, 51, 234, 0.4);
}
50% {
background-color: rgba(147, 51, 234, 0.16);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.8),
inset -2px 0 0 0 rgba(147, 51, 234, 0.8),
inset 0 2px 0 0 rgba(147, 51, 234, 0.8);
}
}

/* Side borders only - middle of group */
@keyframes ai-context-pulse-sides {
0%, 100% {
background-color: rgba(147, 51, 234, 0.08);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.4),
inset -2px 0 0 0 rgba(147, 51, 234, 0.4);
}
50% {
background-color: rgba(147, 51, 234, 0.16);
box-shadow:
inset 2px 0 0 0 rgba(147, 51, 234, 0.8),
inset -2px 0 0 0 rgba(147, 51, 234, 0.8);
}
}

.ai-context-row {
animation: ai-context-pulse-full 2s ease-in-out infinite;
}

.ai-context-row.ai-context-no-top {
animation: ai-context-pulse-no-top 2s ease-in-out infinite;
}

.ai-context-row.ai-context-no-bottom {
animation: ai-context-pulse-no-bottom 2s ease-in-out infinite;
}

.ai-context-row.ai-context-no-top.ai-context-no-bottom {
animation: ai-context-pulse-sides 2s ease-in-out infinite;
}

/* Dark mode - using purple-400 (167, 139, 250) */
@keyframes ai-context-pulse-full-dark {
0%, 100% {
background-color: rgba(147, 51, 234, 0.12);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.5),
inset -2px 0 0 0 rgba(167, 139, 250, 0.5),
inset 0 2px 0 0 rgba(167, 139, 250, 0.5),
inset 0 -2px 0 0 rgba(167, 139, 250, 0.5);
}
50% {
background-color: rgba(147, 51, 234, 0.24);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.9),
inset -2px 0 0 0 rgba(167, 139, 250, 0.9),
inset 0 2px 0 0 rgba(167, 139, 250, 0.9),
inset 0 -2px 0 0 rgba(167, 139, 250, 0.9);
}
}

@keyframes ai-context-pulse-no-top-dark {
0%, 100% {
background-color: rgba(147, 51, 234, 0.12);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.5),
inset -2px 0 0 0 rgba(167, 139, 250, 0.5),
inset 0 -2px 0 0 rgba(167, 139, 250, 0.5);
}
50% {
background-color: rgba(147, 51, 234, 0.24);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.9),
inset -2px 0 0 0 rgba(167, 139, 250, 0.9),
inset 0 -2px 0 0 rgba(167, 139, 250, 0.9);
}
}

@keyframes ai-context-pulse-no-bottom-dark {
0%, 100% {
background-color: rgba(147, 51, 234, 0.12);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.5),
inset -2px 0 0 0 rgba(167, 139, 250, 0.5),
inset 0 2px 0 0 rgba(167, 139, 250, 0.5);
}
50% {
background-color: rgba(147, 51, 234, 0.24);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.9),
inset -2px 0 0 0 rgba(167, 139, 250, 0.9),
inset 0 2px 0 0 rgba(167, 139, 250, 0.9);
}
}

@keyframes ai-context-pulse-sides-dark {
0%, 100% {
background-color: rgba(147, 51, 234, 0.12);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.5),
inset -2px 0 0 0 rgba(167, 139, 250, 0.5);
}
50% {
background-color: rgba(147, 51, 234, 0.24);
box-shadow:
inset 2px 0 0 0 rgba(167, 139, 250, 0.9),
inset -2px 0 0 0 rgba(167, 139, 250, 0.9);
}
}

.dark .ai-context-row {
animation: ai-context-pulse-full-dark 2s ease-in-out infinite;
}

.dark .ai-context-row.ai-context-no-top {
animation: ai-context-pulse-no-top-dark 2s ease-in-out infinite;
}

.dark .ai-context-row.ai-context-no-bottom {
animation: ai-context-pulse-no-bottom-dark 2s ease-in-out infinite;
}

.dark .ai-context-row.ai-context-no-top.ai-context-no-bottom {
animation: ai-context-pulse-sides-dark 2s ease-in-out infinite;
}
@@ -1,6 +1,6 @@
// AI feature types

export type AIProvider = 'anthropic' | 'openai' | 'ollama';
export type AIProvider = 'anthropic' | 'openai' | 'ollama' | 'deepseek';

export interface AISettings {
enabled: boolean;
@@ -34,6 +34,7 @@ export const DEFAULT_MODELS: Record<AIProvider, string> = {
anthropic: 'claude-opus-4-5-20251101',
openai: 'gpt-4o',
ollama: 'llama3',
deepseek: 'deepseek-reasoner',
};

// Provider display names
@@ -41,6 +42,7 @@ export const PROVIDER_NAMES: Record<AIProvider, string> = {
anthropic: 'Anthropic',
openai: 'OpenAI',
ollama: 'Ollama',
deepseek: 'DeepSeek',
};

// Provider descriptions
@@ -48,6 +50,7 @@ export const PROVIDER_DESCRIPTIONS: Record<AIProvider, string> = {
anthropic: 'Claude models from Anthropic',
openai: 'GPT models from OpenAI',
ollama: 'Local models via Ollama',
deepseek: 'DeepSeek reasoning models',
};

// Conversation history for multi-turn chats
@@ -82,7 +85,7 @@ export interface AIExecuteResponse {
}

// Streaming event types
export type AIStreamEventType = 'tool_start' | 'tool_end' | 'content' | 'done' | 'error' | 'complete' | 'approval_needed';
export type AIStreamEventType = 'tool_start' | 'tool_end' | 'content' | 'thinking' | 'done' | 'error' | 'complete' | 'approval_needed' | 'processing';

export interface AIStreamToolStartData {
name: string;
@@ -101,8 +104,10 @@ export interface AIStreamApprovalNeededData {
tool_id: string;
tool_name: string;
run_on_host: boolean;
target_host?: string; // Explicit host to route the command to
}


export interface AIStreamEvent {
type: AIStreamEventType;
data?: string | AIStreamToolStartData | AIStreamToolEndData | AIStreamCompleteData | AIStreamApprovalNeededData;
@@ -181,6 +181,28 @@ class ApiClient {
return false;
}

// Ensure CSRF token is available by making a GET request if needed
// The backend issues CSRF cookies on GET requests to /api/* endpoints
private async ensureCSRFToken(): Promise<string | null> {
try {
// Make a simple GET request to trigger CSRF cookie issuance
const response = await fetch('/api/health', {
method: 'GET',
credentials: 'include',
});

// The response should have set the pulse_csrf cookie
if (response.ok) {
// Small delay to ensure cookie is set
await new Promise(resolve => setTimeout(resolve, 10));
return this.loadCSRFToken();
}
} catch (err) {
logger.warn('Failed to fetch CSRF token', err);
}
return null;
}

// Main fetch wrapper that adds authentication
async fetch(url: string, options: FetchOptions = {}): Promise<Response> {
const { skipAuth = false, headers = {}, ...fetchOptions } = options;
@@ -206,7 +228,12 @@
// Add CSRF token for state-changing requests
const method = (fetchOptions.method || 'GET').toUpperCase();
if (method !== 'GET' && method !== 'HEAD' && method !== 'OPTIONS') {
const token = this.loadCSRFToken();
// Try to get CSRF token, or fetch one if missing
let token = this.loadCSRFToken();
if (!token) {
// No CSRF token available - try to get one by making a GET request
token = await this.ensureCSRFToken();
}
if (token) {
finalHeaders['X-CSRF-Token'] = token;
}
@@ -223,7 +250,7 @@

// If we get a 401 on an API call (not during initial auth check), redirect to login
// Skip redirect for specific auth-check endpoints to avoid loops
if (response.status === 401 && !url.includes('/api/security/status') && !url.includes('/api/state')) {
if (response.status === 401 && !url.includes('/api/security/status') && !url.includes('/api/state') && !url.includes('/api/settings/ai')) {
logger.warn('Authentication expired - redirecting to login');
// Clear auth and redirect to login
if (typeof window !== 'undefined') {
@@ -234,13 +261,15 @@
return response;
}

// Handle CSRF token failures
// Handle CSRF token failures - the 403 response should have set a new CSRF cookie
if (response.status === 403) {
const csrfHeader = response.headers.get('X-CSRF-Token');
let refreshedToken: string | null = null;
if (csrfHeader) {
refreshedToken = csrfHeader;
} else {
// First try the response header (backend sends new token in X-CSRF-Token header)
let refreshedToken = response.headers.get('X-CSRF-Token');

// If not in header, reload from cookie (backend also sets pulse_csrf cookie on 403)
if (!refreshedToken) {
// Force reload from cookie - the 403 response just set it
this.csrfToken = null;
refreshedToken = this.loadCSRFToken();
}
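The fallback above follows the double-submit cookie pattern: the backend sets a `pulse_csrf` cookie on GET responses, and the client echoes its value in the `X-CSRF-Token` header on state-changing requests. A sketch of the cookie-reading half as a pure function over a `document.cookie`-style string (the real `loadCSRFToken` may differ):

```typescript
// Extract a CSRF token from a raw cookie string such as document.cookie.
// In the double-submit scheme, the value sent back in X-CSRF-Token must
// match the cookie the server issued on a prior GET.
function readCsrfCookie(cookies: string, name = 'pulse_csrf'): string | null {
  for (const part of cookies.split(';')) {
    const [key, ...rest] = part.trim().split('=');
    // rest.join('=') preserves '=' characters inside the cookie value
    if (key === name) return decodeURIComponent(rest.join('='));
  }
  return null;
}
```

Keeping this a pure string parser (rather than reading `document.cookie` directly) makes the lookup testable outside a browser.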
@@ -70,20 +70,25 @@ export default defineConfig({
});
},
},
// SSE endpoint for AI chat streaming
'/api/ai/execute/stream': {
target: backendUrl,
changeOrigin: true,
// SSE requires special handling to prevent proxy timeouts
// Set timeout to 10 minutes (600000ms) for long-running AI requests
timeout: 600000,
proxyTimeout: 600000,
// Set timeout to 0 to completely disable
timeout: 0,
proxyTimeout: 0,
configure: (proxy, _options) => {
// Completely disable http-proxy internal timeouts
proxy.options.timeout = 0;
proxy.options.proxyTimeout = 0;

// Set proxy-level timeouts
proxy.on('proxyReq', (proxyReq, req, res) => {
// Disable socket timeouts for SSE
req.socket.setTimeout(0);
req.socket.setNoDelay(true);
req.socket.setKeepAlive(true);
req.socket.setKeepAlive(true, 30000);
// Also set on the proxy request
proxyReq.socket?.setTimeout(0);
});
@@ -91,7 +96,7 @@ export default defineConfig({
// Disable response socket timeout
res.socket?.setTimeout(0);
res.socket?.setNoDelay(true);
res.socket?.setKeepAlive(true);
res.socket?.setKeepAlive(true, 30000);
// Also disable on proxy response socket
proxyRes.socket?.setTimeout(0);
});
@@ -100,6 +105,34 @@ export default defineConfig({
});
},
},
// SSE endpoint for AI alert investigation (one-click investigate from alerts page)
'/api/ai/investigate-alert': {
target: backendUrl,
changeOrigin: true,
// SSE requires special handling to prevent proxy timeouts
timeout: 0,
proxyTimeout: 0,
configure: (proxy, _options) => {
proxy.options.timeout = 0;
proxy.options.proxyTimeout = 0;

proxy.on('proxyReq', (proxyReq, req, res) => {
req.socket.setTimeout(0);
req.socket.setNoDelay(true);
req.socket.setKeepAlive(true, 30000);
proxyReq.socket?.setTimeout(0);
});
proxy.on('proxyRes', (proxyRes, req, res) => {
res.socket?.setTimeout(0);
res.socket?.setNoDelay(true);
res.socket?.setKeepAlive(true, 30000);
proxyRes.socket?.setTimeout(0);
});
proxy.on('error', (err, req, res) => {
console.error('[SSE Proxy Error - Investigate Alert]', err.message);
});
},
},
'/api/agent/ws': {
target: backendWsUrl,
ws: true,
go.mod
@@ -40,6 +40,7 @@ require (
github.com/distribution/reference v0.6.0 // indirect
github.com/docker/go-connections v0.6.0 // indirect
github.com/docker/go-units v0.5.0 // indirect
github.com/dustin/go-humanize v1.0.1 // indirect
github.com/ebitengine/purego v0.9.1 // indirect
github.com/felixge/httpsnoop v1.0.4 // indirect
github.com/go-jose/go-jose/v4 v4.1.3 // indirect
@@ -58,6 +59,7 @@ require (
github.com/moby/term v0.5.2 // indirect
github.com/morikuni/aec v1.0.0 // indirect
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 // indirect
github.com/ncruces/go-strftime v0.1.9 // indirect
github.com/opencontainers/go-digest v1.0.0 // indirect
github.com/opencontainers/image-spec v1.1.1 // indirect
github.com/pkg/errors v0.9.1 // indirect
@@ -65,6 +67,7 @@ require (
github.com/power-devops/perfstat v0.0.0-20240221224432-82ca36839d55 // indirect
github.com/prometheus/common v0.67.4 // indirect
github.com/prometheus/procfs v0.19.2 // indirect
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec // indirect
github.com/spf13/pflag v1.0.10 // indirect
github.com/tklauser/go-sysconf v0.3.16 // indirect
github.com/tklauser/numcpus v0.11.0 // indirect
@@ -76,7 +79,12 @@ require (
go.opentelemetry.io/otel/metric v1.38.0 // indirect
go.opentelemetry.io/otel/trace v1.38.0 // indirect
go.yaml.in/yaml/v2 v2.4.3 // indirect
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b // indirect
google.golang.org/grpc v1.75.1 // indirect
google.golang.org/protobuf v1.36.10 // indirect
gotest.tools/v3 v3.5.2 // indirect
modernc.org/libc v1.66.10 // indirect
modernc.org/mathutil v1.7.1 // indirect
modernc.org/memory v1.11.0 // indirect
modernc.org/sqlite v1.40.1 // indirect
)
go.sum
@@ -28,6 +28,8 @@ github.com/docker/go-connections v0.6.0 h1:LlMG9azAe1TqfR7sO+NJttz1gy6KO7VJBh+pM
github.com/docker/go-connections v0.6.0/go.mod h1:AahvXYshr6JgfUJGdDCs2b5EZG/vmaMAntpSFH5BFKE=
github.com/docker/go-units v0.5.0 h1:69rxXcBk27SvSaaxTtLh/8llcHD8vYHT7WSdRZ/jvr4=
github.com/docker/go-units v0.5.0/go.mod h1:fgPhTUdO+D/Jk86RDLlptpiXQzgHJF7gydDDbaIK4Dk=
github.com/dustin/go-humanize v1.0.1 h1:GzkhY7T5VNhEkwH0PVJgjz+fX1rhBrR7pRT3mDkpeCY=
github.com/dustin/go-humanize v1.0.1/go.mod h1:Mu1zIs6XwVuF/gI1OepvI0qD18qycQx+mFykh5fBlto=
github.com/ebitengine/purego v0.9.1 h1:a/k2f2HQU3Pi399RPW1MOaZyhKJL9w/xFpKAg4q1s0A=
github.com/ebitengine/purego v0.9.1/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
github.com/felixge/httpsnoop v1.0.4 h1:NFTV2Zj1bL4mc9sqWACXbQFVBBg2W3GPvqp8/ESS2Wg=
@@ -88,6 +90,8 @@ github.com/morikuni/aec v1.0.0 h1:nP9CBfwrvYnBRgY6qfDQkygYDmYwOilePFkwzv4dU8A=
github.com/morikuni/aec v1.0.0/go.mod h1:BbKIizmSmc5MMPqRYbxO4ZU0S0+P200+tUnFx7PXmsc=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 h1:C3w9PqII01/Oq1c1nUAm88MOHcQC9l5mIlSMApZMrHA=
github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822/go.mod h1:+n7T8mK8HuQTcFwEeznm/DIxMOiR9yIdICNftLE1DvQ=
github.com/ncruces/go-strftime v0.1.9 h1:bY0MQC28UADQmHmaF5dgpLmImcShSi2kHU9XLdhx/f4=
github.com/ncruces/go-strftime v0.1.9/go.mod h1:Fwc5htZGVVkseilnfgOVb9mKy6w1naJmn9CehxcKcls=
github.com/oklog/ulid/v2 v2.1.1 h1:suPZ4ARWLOJLegGFiZZ1dFAkqzhMjL3J1TzI+5wHz8s=
github.com/oklog/ulid/v2 v2.1.1/go.mod h1:rcEKHmBBKfef9DhnvX7y1HZBYxjXb0cP5ExxNsTT1QQ=
github.com/opencontainers/go-digest v1.0.0 h1:apOUWs51W5PlhuyGyz9FCeeBIOUDA/6nW8Oi/yOhh5U=
@@ -109,6 +113,8 @@ github.com/prometheus/common v0.67.4 h1:yR3NqWO1/UyO1w2PhUvXlGQs/PtFmoveVO0KZ4+L
github.com/prometheus/common v0.67.4/go.mod h1:gP0fq6YjjNCLssJCQp0yk4M8W6ikLURwkdd/YKtTbyI=
github.com/prometheus/procfs v0.19.2 h1:zUMhqEW66Ex7OXIiDkll3tl9a1ZdilUOd/F6ZXw4Vws=
github.com/prometheus/procfs v0.19.2/go.mod h1:M0aotyiemPhBCM0z5w87kL22CxfcH05ZpYlu+b4J7mw=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec h1:W09IVJc94icq4NjY3clb7Lk8O1qJ8BdBEF8z0ibU0rE=
github.com/remyoudompheng/bigfft v0.0.0-20230129092748-24d4a6f8daec/go.mod h1:qqbHyh8v60DhA7CoWK5oRCqLrMHRGoxYCSS9EjAz6Eo=
github.com/rogpeppe/go-internal v1.13.1 h1:KvO1DLK/DRN07sQ1LQKScxyZJuNnedQ5/wKSR38lUII=
github.com/rogpeppe/go-internal v1.13.1/go.mod h1:uMEvuHeurkdAXX61udpOXGD/AzZDWNMNyH2VO9fmH0o=
github.com/rs/dnscache v0.0.0-20230804202142-fc85eb664529 h1:18kd+8ZUlt/ARXhljq+14TwAoKa61q6dX8jtwOf6DH8=
@@ -162,6 +168,8 @@ go.yaml.in/yaml/v2 v2.4.3 h1:6gvOSjQoTB3vt1l+CU+tSyi/HOjfOjRLJ4YwYZGwRO0=
go.yaml.in/yaml/v2 v2.4.3/go.mod h1:zSxWcmIDjOzPXpjlTTbAsKokqkDNAVtZO0WOMiT90s8=
golang.org/x/crypto v0.45.0 h1:jMBrvKuj23MTlT0bQEOBcAE0mjg8mK9RXFhRH6nyF3Q=
golang.org/x/crypto v0.45.0/go.mod h1:XTGrrkGJve7CYK7J8PEww4aY7gM3qMCElcJQ8n8JdX4=
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b h1:M2rDM6z3Fhozi9O7NWsxAkg/yqS/lQJ6PmkyIV3YP+o=
golang.org/x/exp v0.0.0-20250620022241-b7579e27df2b/go.mod h1:3//PLf8L/X+8b4vuAfHzxeRUl04Adcb341+IGKfnqS8=
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
golang.org/x/oauth2 v0.33.0 h1:4Q+qn+E5z8gPRJfmRy7C2gGG3T4jIprK6aSYgTXGRpo=
@@ -200,3 +208,11 @@ gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
gotest.tools/v3 v3.5.2 h1:7koQfIKdy+I8UTetycgUqXWSDwpgv193Ka+qRsmBY8Q=
gotest.tools/v3 v3.5.2/go.mod h1:LtdLGcnqToBH83WByAAi/wiwSFCArdFIUV/xxN4pcjA=
modernc.org/libc v1.66.10 h1:yZkb3YeLx4oynyR+iUsXsybsX4Ubx7MQlSYEw4yj59A=
modernc.org/libc v1.66.10/go.mod h1:8vGSEwvoUoltr4dlywvHqjtAqHBaw0j1jI7iFBTAr2I=
modernc.org/mathutil v1.7.1 h1:GCZVGXdaN8gTqB1Mf/usp1Y/hSqgI2vAGGP4jZMCxOU=
modernc.org/mathutil v1.7.1/go.mod h1:4p5IwJITfppl0G4sUEDtCr4DthTaT47/N3aT6MhfgJg=
modernc.org/memory v1.11.0 h1:o4QC8aMQzmcwCK3t3Ux/ZHmwFPzE6hf2Y5LbkRs+hbI=
modernc.org/memory v1.11.0/go.mod h1:/JP4VbVC+K5sU2wZi9bHoq2MAkCnrt2r98UGeSK7Mjw=
modernc.org/sqlite v1.40.1 h1:VfuXcxcUWWKRBuP8+BR9L7VnmusMgBNNnBYGEe9w/iY=
modernc.org/sqlite v1.40.1/go.mod h1:9fjQZ0mB1LLP0GYrp39oOJXx/I2sxEnZtzCmEQIKvGE=
216  internal/ai/alert_adapter.go  Normal file
@@ -0,0 +1,216 @@
package ai

import (
	"fmt"
	"time"

	"github.com/rcourtman/pulse-go-rewrite/internal/alerts"
	"github.com/rcourtman/pulse-go-rewrite/internal/models"
)

// AlertManagerAdapter adapts the alerts.Manager to the AI's AlertProvider interface
type AlertManagerAdapter struct {
	manager *alerts.Manager
}

// NewAlertManagerAdapter creates a new adapter for the alert manager
func NewAlertManagerAdapter(manager *alerts.Manager) *AlertManagerAdapter {
	return &AlertManagerAdapter{manager: manager}
}

// GetActiveAlerts returns all currently active alerts
func (a *AlertManagerAdapter) GetActiveAlerts() []AlertInfo {
	if a.manager == nil {
		return nil
	}

	activeAlerts := a.manager.GetActiveAlerts()
	result := make([]AlertInfo, 0, len(activeAlerts))

	for _, alert := range activeAlerts {
		result = append(result, convertAlertFromManager(&alert))
	}

	return result
}

// GetRecentlyResolved returns alerts resolved in the last N minutes
func (a *AlertManagerAdapter) GetRecentlyResolved(minutes int) []ResolvedAlertInfo {
	if a.manager == nil {
		return nil
	}

	resolvedAlerts := a.manager.GetRecentlyResolved()
	cutoff := time.Now().Add(-time.Duration(minutes) * time.Minute)
	result := make([]ResolvedAlertInfo, 0)

	for _, resolved := range resolvedAlerts {
		if resolved.ResolvedTime.After(cutoff) {
			info := ResolvedAlertInfo{
				AlertInfo:    convertAlertFromModels(&resolved.Alert),
				ResolvedTime: resolved.ResolvedTime,
				Duration:     formatDuration(resolved.ResolvedTime.Sub(resolved.StartTime)),
			}
			result = append(result, info)
		}
	}

	return result
}

// GetAlertsByResource returns active alerts for a specific resource
func (a *AlertManagerAdapter) GetAlertsByResource(resourceID string) []AlertInfo {
	if a.manager == nil {
		return nil
	}

	activeAlerts := a.manager.GetActiveAlerts()
	result := make([]AlertInfo, 0)

	for _, alert := range activeAlerts {
		if alert.ResourceID == resourceID {
			result = append(result, convertAlertFromManager(&alert))
		}
	}

	return result
}

// GetAlertHistory returns historical alerts for a resource
func (a *AlertManagerAdapter) GetAlertHistory(resourceID string, limit int) []ResolvedAlertInfo {
	if a.manager == nil {
		return nil
	}

	// Get from recently resolved and filter by resource
	resolvedAlerts := a.manager.GetRecentlyResolved()
	result := make([]ResolvedAlertInfo, 0)

	for _, resolved := range resolvedAlerts {
		if resolved.ResourceID == resourceID {
			info := ResolvedAlertInfo{
				AlertInfo:    convertAlertFromModels(&resolved.Alert),
				ResolvedTime: resolved.ResolvedTime,
				Duration:     formatDuration(resolved.ResolvedTime.Sub(resolved.StartTime)),
			}
			result = append(result, info)
			if len(result) >= limit {
				break
			}
		}
	}

	return result
}

// convertAlertFromManager converts an alerts.Alert to AI's AlertInfo
func convertAlertFromManager(alert *alerts.Alert) AlertInfo {
	if alert == nil {
		return AlertInfo{}
	}

	resourceType := inferResourceType(alert.Type, alert.Metadata)

	return AlertInfo{
		ID:           alert.ID,
		Type:         alert.Type,
		Level:        string(alert.Level),
		ResourceID:   alert.ResourceID,
		ResourceName: alert.ResourceName,
		ResourceType: resourceType,
		Node:         alert.Node,
		Instance:     alert.Instance,
		Message:      alert.Message,
		Value:        alert.Value,
		Threshold:    alert.Threshold,
		StartTime:    alert.StartTime,
		Duration:     formatDuration(time.Since(alert.StartTime)),
		Acknowledged: alert.Acknowledged,
	}
}

// convertAlertFromModels converts a models.Alert to AI's AlertInfo
func convertAlertFromModels(alert *models.Alert) AlertInfo {
	if alert == nil {
		return AlertInfo{}
	}

	resourceType := inferResourceType(alert.Type, nil)

	return AlertInfo{
		ID:           alert.ID,
		Type:         alert.Type,
		Level:        alert.Level,
		ResourceID:   alert.ResourceID,
		ResourceName: alert.ResourceName,
		ResourceType: resourceType,
		Node:         alert.Node,
		Instance:     alert.Instance,
		Message:      alert.Message,
		Value:        alert.Value,
		Threshold:    alert.Threshold,
		StartTime:    alert.StartTime,
		Duration:     formatDuration(time.Since(alert.StartTime)),
		Acknowledged: alert.Acknowledged,
	}
}

// inferResourceType infers resource type from alert type
func inferResourceType(alertType string, metadata map[string]interface{}) string {
	if metadata != nil {
		if rt, ok := metadata["resourceType"].(string); ok {
			return rt
		}
	}

	switch {
	case alertType == "node_offline" || alertType == "node_cpu" || alertType == "node_memory" || alertType == "node_temperature":
		return "node"
	case alertType == "storage_usage" || alertType == "storage":
		return "storage"
	case alertType == "docker_cpu" || alertType == "docker_memory" || alertType == "docker_restart" || alertType == "docker_offline":
		return "docker"
	case alertType == "host_cpu" || alertType == "host_memory" || alertType == "host_offline" || alertType == "host_disk":
		return "host"
	case alertType == "pmg" || alertType == "pmg_queue" || alertType == "pmg_quarantine":
		return "pmg"
	case alertType == "backup" || alertType == "backup_missing":
		return "backup"
	case alertType == "snapshot" || alertType == "snapshot_age":
		return "snapshot"
	default:
		return "guest"
	}
}

// formatDuration returns a human-readable duration string
func formatDuration(d time.Duration) string {
	if d < time.Minute {
		return "< 1 min"
	} else if d < time.Hour {
		mins := int(d.Minutes())
		if mins == 1 {
			return "1 min"
		}
		return fmt.Sprintf("%d mins", mins)
	} else if d < 24*time.Hour {
		hours := int(d.Hours())
		mins := int(d.Minutes()) % 60
		if mins > 0 {
			return fmt.Sprintf("%dh %dm", hours, mins)
		}
		if hours == 1 {
			return "1 hour"
		}
		return fmt.Sprintf("%d hours", hours)
	}
	days := int(d.Hours() / 24)
	hours := int(d.Hours()) % 24
	if hours > 0 {
		return fmt.Sprintf("%dd %dh", days, hours)
	}
	if days == 1 {
		return "1 day"
	}
	return fmt.Sprintf("%d days", days)
}
237  internal/ai/alert_provider.go  Normal file
@@ -0,0 +1,237 @@
// Package ai provides AI-powered infrastructure investigation and remediation.
package ai

import (
	"fmt"
	"strings"
	"time"
)

// AlertInfo contains information about an alert for AI context
type AlertInfo struct {
	ID           string    `json:"id"`
	Type         string    `json:"type"`          // cpu, memory, disk, offline, etc.
	Level        string    `json:"level"`         // warning, critical
	ResourceID   string    `json:"resource_id"`   // unique resource identifier
	ResourceName string    `json:"resource_name"` // human-readable name
	ResourceType string    `json:"resource_type"` // guest, node, storage, docker, etc.
	Node         string    `json:"node"`          // PVE node (if applicable)
	Instance     string    `json:"instance"`      // Proxmox instance name
	Message      string    `json:"message"`       // Alert description
	Value        float64   `json:"value"`         // Current metric value
	Threshold    float64   `json:"threshold"`     // Threshold that was exceeded
	StartTime    time.Time `json:"start_time"`    // When alert started
	Duration     string    `json:"duration"`      // Human-readable duration
	Acknowledged bool      `json:"acknowledged"`  // Whether alert has been acked
}

// ResolvedAlertInfo contains information about a recently resolved alert
type ResolvedAlertInfo struct {
	AlertInfo
	ResolvedTime time.Time `json:"resolved_time"`
	Duration     string    `json:"total_duration"` // How long the alert lasted
}

// AlertProvider provides access to the current alert state
type AlertProvider interface {
	// GetActiveAlerts returns all currently active alerts
	GetActiveAlerts() []AlertInfo

	// GetRecentlyResolved returns alerts resolved in the last N minutes
	GetRecentlyResolved(minutes int) []ResolvedAlertInfo

	// GetAlertsByResource returns active alerts for a specific resource
	GetAlertsByResource(resourceID string) []AlertInfo

	// GetAlertHistory returns historical alerts for a resource
	GetAlertHistory(resourceID string, limit int) []ResolvedAlertInfo
}

// SetAlertProvider sets the alert provider for AI context
func (s *Service) SetAlertProvider(ap AlertProvider) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.alertProvider = ap
}

// buildAlertContext generates AI context from current alerts
func (s *Service) buildAlertContext() string {
	s.mu.RLock()
	ap := s.alertProvider
	s.mu.RUnlock()

	if ap == nil {
		return ""
	}

	activeAlerts := ap.GetActiveAlerts()
	recentlyResolved := ap.GetRecentlyResolved(30) // Last 30 minutes

	if len(activeAlerts) == 0 && len(recentlyResolved) == 0 {
		return ""
	}

	var sections []string
	sections = append(sections, "\n## Alert Status")

	// Active alerts
	if len(activeAlerts) > 0 {
		sections = append(sections, "\n### Active Alerts")
		sections = append(sections, fmt.Sprintf("There are **%d active alert(s)** that may need attention:\n", len(activeAlerts)))

		// Group by severity
		var critical, warning []AlertInfo
		for _, a := range activeAlerts {
			if a.Level == "critical" {
				critical = append(critical, a)
			} else {
				warning = append(warning, a)
			}
		}

		if len(critical) > 0 {
			sections = append(sections, "**Critical:**")
			for _, a := range critical {
				sections = append(sections, formatAlertForAI(a))
			}
		}

		if len(warning) > 0 {
			sections = append(sections, "**Warning:**")
			for _, a := range warning {
				sections = append(sections, formatAlertForAI(a))
			}
		}
	} else {
		sections = append(sections, "\n### No Active Alerts")
		sections = append(sections, "All systems are operating within normal thresholds.")
	}

	// Recently resolved
	if len(recentlyResolved) > 0 {
		sections = append(sections, fmt.Sprintf("\n### Recently Resolved (%d)", len(recentlyResolved)))
		sections = append(sections, "These alerts were resolved in the last 30 minutes:")
		// Show up to 5 most recent
		limit := 5
		if len(recentlyResolved) < limit {
			limit = len(recentlyResolved)
		}
		for i := 0; i < limit; i++ {
			a := recentlyResolved[i]
			sections = append(sections, fmt.Sprintf("- **%s** on %s: %s (lasted %s, resolved %s ago)",
				a.Type, a.ResourceName, a.Message, a.Duration,
				formatTimeAgo(a.ResolvedTime)))
		}
		if len(recentlyResolved) > limit {
			sections = append(sections, fmt.Sprintf("  ... and %d more", len(recentlyResolved)-limit))
		}
	}

	return strings.Join(sections, "\n")
}

// buildTargetAlertContext builds alert context for a specific target
func (s *Service) buildTargetAlertContext(resourceID string) string {
	s.mu.RLock()
	ap := s.alertProvider
	s.mu.RUnlock()

	if ap == nil || resourceID == "" {
		return ""
	}

	alerts := ap.GetAlertsByResource(resourceID)
	if len(alerts) == 0 {
		return ""
	}

	var lines []string
	lines = append(lines, "\n### Active Alerts for This Resource")
	for _, a := range alerts {
		lines = append(lines, formatAlertForAI(a))
	}

	return strings.Join(lines, "\n")
}

// formatAlertForAI formats an alert for inclusion in AI context
func formatAlertForAI(a AlertInfo) string {
	ackedNote := ""
	if a.Acknowledged {
		ackedNote = " [ACKNOWLEDGED]"
	}

	nodeInfo := ""
	if a.Node != "" {
		nodeInfo = fmt.Sprintf(" on node %s", a.Node)
	}

	return fmt.Sprintf("- **%s** %s: %s (current: %.1f%%, threshold: %.1f%%) - active for %s%s%s",
		strings.ToUpper(a.Level), a.Type, a.ResourceName,
		a.Value, a.Threshold, a.Duration, nodeInfo, ackedNote)
}

// formatTimeAgo returns a human-readable time-ago string
func formatTimeAgo(t time.Time) string {
	d := time.Since(t)
	if d < time.Minute {
		return "just now"
	} else if d < time.Hour {
		mins := int(d.Minutes())
		if mins == 1 {
			return "1 minute"
		}
		return fmt.Sprintf("%d minutes", mins)
	} else if d < 24*time.Hour {
		hours := int(d.Hours())
		if hours == 1 {
			return "1 hour"
		}
		return fmt.Sprintf("%d hours", hours)
	}
	days := int(d.Hours() / 24)
	if days == 1 {
		return "1 day"
	}
	return fmt.Sprintf("%d days", days)
}

// AlertInvestigationRequest represents a request to investigate an alert
type AlertInvestigationRequest struct {
	AlertID      string  `json:"alert_id"`
	ResourceID   string  `json:"resource_id"`
	ResourceName string  `json:"resource_name"`
	ResourceType string  `json:"resource_type"` // guest, node, storage, docker
	AlertType    string  `json:"alert_type"`    // cpu, memory, disk, offline, etc.
	Level        string  `json:"level"`         // warning, critical
	Value        float64 `json:"value"`
	Threshold    float64 `json:"threshold"`
	Message      string  `json:"message"`
	Duration     string  `json:"duration"` // How long the alert has been active
	Node         string  `json:"node,omitempty"`
	VMID         int     `json:"vmid,omitempty"`
}

// GenerateAlertInvestigationPrompt creates a focused prompt for alert investigation
func GenerateAlertInvestigationPrompt(req AlertInvestigationRequest) string {
	var prompt strings.Builder

	prompt.WriteString(fmt.Sprintf("Investigate this %s alert:\n\n", strings.ToUpper(req.Level)))
	prompt.WriteString(fmt.Sprintf("**Resource:** %s (%s)\n", req.ResourceName, req.ResourceType))
	prompt.WriteString(fmt.Sprintf("**Alert Type:** %s\n", req.AlertType))
	prompt.WriteString(fmt.Sprintf("**Current Value:** %.1f%%\n", req.Value))
	prompt.WriteString(fmt.Sprintf("**Threshold:** %.1f%%\n", req.Threshold))
	prompt.WriteString(fmt.Sprintf("**Duration:** %s\n", req.Duration))

	if req.Node != "" {
		prompt.WriteString(fmt.Sprintf("**Node:** %s\n", req.Node))
	}

	prompt.WriteString("\n**Action Required:**\n")
	prompt.WriteString("1. Identify the root cause of this alert\n")
	prompt.WriteString("2. Check related metrics and system state\n")
	prompt.WriteString("3. Suggest specific remediation steps\n")
	prompt.WriteString("4. If safe, execute diagnostic commands to gather more info\n")

	return prompt.String()
}
@@ -11,6 +11,7 @@ const (
	ProviderAnthropic = config.AIProviderAnthropic
	ProviderOpenAI    = config.AIProviderOpenAI
	ProviderOllama    = config.AIProviderOllama
	ProviderDeepSeek  = config.AIProviderDeepSeek
)

// NewDefaultConfig returns a new AI config with sensible defaults
421  internal/ai/knowledge/store.go  Normal file
@@ -0,0 +1,421 @@
|
||||
// Package knowledge provides persistent storage for AI-learned information about guests
|
||||
package knowledge
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
|
||||
"github.com/rcourtman/pulse-go-rewrite/internal/crypto"
|
||||
"github.com/rs/zerolog/log"
|
||||
)
|
||||
|
||||
// Note represents a single piece of learned information
|
||||
type Note struct {
|
||||
ID string `json:"id"`
|
||||
Category string `json:"category"` // "service", "path", "credential", "config", "learning", "history"
|
||||
Title string `json:"title"`
|
||||
Content string `json:"content"`
|
||||
CreatedAt time.Time `json:"created_at"`
|
||||
UpdatedAt time.Time `json:"updated_at"`
|
||||
}
|
||||
|
||||
// GuestKnowledge represents all knowledge about a specific guest
|
||||
type GuestKnowledge struct {
|
||||
GuestID string `json:"guest_id"`
|
||||
GuestName string `json:"guest_name"`
|
||||
GuestType string `json:"guest_type"` // "vm", "container", "node", "host"
|
||||
Notes []Note `json:"notes"`
|
||||
UpdatedAt time.Time `json:"updated_at"`
|
||||
}
|
||||
|
||||
// Store manages persistent knowledge storage with encryption
|
||||
type Store struct {
|
||||
dataDir string
|
||||
mu sync.RWMutex
|
||||
cache map[string]*GuestKnowledge
|
||||
crypto *crypto.CryptoManager
|
||||
}
|
||||
|
||||
// NewStore creates a new knowledge store with encryption
|
||||
func NewStore(dataDir string) (*Store, error) {
|
||||
knowledgeDir := filepath.Join(dataDir, "knowledge")
|
||||
if err := os.MkdirAll(knowledgeDir, 0700); err != nil {
|
||||
return nil, fmt.Errorf("failed to create knowledge directory: %w", err)
|
||||
}
|
||||
|
||||
// Initialize crypto manager for encryption (uses same key as other Pulse secrets)
|
||||
cryptoMgr, err := crypto.NewCryptoManagerAt(dataDir)
|
||||
if err != nil {
|
||||
log.Warn().Err(err).Msg("Failed to initialize crypto for knowledge store, data will be unencrypted")
|
||||
}
|
||||
|
||||
return &Store{
|
||||
dataDir: knowledgeDir,
|
||||
cache: make(map[string]*GuestKnowledge),
|
||||
crypto: cryptoMgr,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// guestFilePath returns the file path for a guest's knowledge
|
||||
func (s *Store) guestFilePath(guestID string) string {
|
||||
// Sanitize guest ID for filesystem
|
||||
safeID := filepath.Base(guestID) // Prevent path traversal
|
||||
// Use .enc extension for encrypted files
|
||||
if s.crypto != nil {
|
||||
return filepath.Join(s.dataDir, safeID+".enc")
|
||||
}
|
||||
return filepath.Join(s.dataDir, safeID+".json")
|
||||
}
|
||||
|
||||
// GetKnowledge retrieves knowledge for a guest
|
||||
func (s *Store) GetKnowledge(guestID string) (*GuestKnowledge, error) {
|
||||
s.mu.RLock()
|
||||
if cached, ok := s.cache[guestID]; ok {
|
||||
s.mu.RUnlock()
|
||||
return cached, nil
|
||||
}
|
||||
s.mu.RUnlock()
|
||||
|
||||
// Load from disk
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
// Double-check after acquiring write lock
|
||||
if cached, ok := s.cache[guestID]; ok {
|
||||
return cached, nil
|
||||
}
|
||||
|
||||
filePath := s.guestFilePath(guestID)
|
||||
data, err := os.ReadFile(filePath)
|
||||
if os.IsNotExist(err) {
|
||||
// Try legacy unencrypted file
|
||||
legacyPath := filepath.Join(s.dataDir, filepath.Base(guestID)+".json")
|
||||
data, err = os.ReadFile(legacyPath)
|
||||
if os.IsNotExist(err) {
|
||||
// No knowledge yet, return empty
|
||||
knowledge := &GuestKnowledge{
|
||||
GuestID: guestID,
|
||||
Notes: []Note{},
|
||||
}
|
||||
s.cache[guestID] = knowledge
|
||||
return knowledge, nil
|
||||
}
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to read knowledge file: %w", err)
|
||||
}
|
||||
// Legacy file found - will be encrypted on next save
|
||||
log.Info().Str("guest_id", guestID).Msg("Found unencrypted knowledge file, will encrypt on next save")
|
||||
} else if err != nil {
|
||||
return nil, fmt.Errorf("failed to read knowledge file: %w", err)
|
||||
}
|
||||
|
||||
// Decrypt if crypto is available and file is encrypted
|
||||
if s.crypto != nil && filepath.Ext(filePath) == ".enc" {
|
||||
decrypted, err := s.crypto.Decrypt(data)
|
||||
if err != nil {
|
||||
// Try as plain JSON (migration case)
|
||||
var knowledge GuestKnowledge
|
||||
if jsonErr := json.Unmarshal(data, &knowledge); jsonErr == nil {
|
||||
log.Info().Str("guest_id", guestID).Msg("Loaded unencrypted knowledge (will encrypt on next save)")
|
||||
s.cache[guestID] = &knowledge
|
||||
return &knowledge, nil
|
||||
}
|
||||
return nil, fmt.Errorf("failed to decrypt knowledge: %w", err)
|
||||
}
|
||||
data = decrypted
|
||||
}
|
||||
|
||||
var knowledge GuestKnowledge
|
||||
if err := json.Unmarshal(data, &knowledge); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse knowledge file: %w", err)
|
||||
}
|
||||
|
||||
s.cache[guestID] = &knowledge
|
||||
return &knowledge, nil
|
||||
}
|
||||
|
||||
// SaveNote adds or updates a note for a guest
|
||||
func (s *Store) SaveNote(guestID, guestName, guestType, category, title, content string) error {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
// Get or create knowledge
|
||||
knowledge, ok := s.cache[guestID]
|
||||
if !ok {
|
||||
// Try to load from disk first
|
||||
knowledge = &GuestKnowledge{
|
||||
GuestID: guestID,
|
||||
GuestName: guestName,
|
||||
GuestType: guestType,
|
||||
Notes: []Note{},
|
||||
}
|
||||
|
||||
// Check for existing file
|
||||
filePath := s.guestFilePath(guestID)
|
||||
if data, err := os.ReadFile(filePath); err == nil {
|
||||
// Decrypt if needed
|
||||
if s.crypto != nil && filepath.Ext(filePath) == ".enc" {
|
||||
if decrypted, err := s.crypto.Decrypt(data); err == nil {
|
||||
data = decrypted
|
||||
}
|
||||
}
|
||||
if err := json.Unmarshal(data, &knowledge); err != nil {
|
||||
log.Warn().Err(err).Str("guest_id", guestID).Msg("Failed to parse existing knowledge, starting fresh")
|
||||
}
|
||||
}
|
||||
s.cache[guestID] = knowledge
|
||||
}
|
||||
|
||||
// Update guest info if provided
|
||||
if guestName != "" {
|
||||
knowledge.GuestName = guestName
|
||||
}
|
||||
if guestType != "" {
|
||||
knowledge.GuestType = guestType
|
||||
}
|
||||
|
||||
now := time.Now()
|
||||
|
||||
// Check if note with same title exists in category
|
||||
found := false
|
||||
for i, note := range knowledge.Notes {
|
||||
if note.Category == category && note.Title == title {
|
||||
// Update existing note
|
||||
knowledge.Notes[i].Content = content
|
||||
knowledge.Notes[i].UpdatedAt = now
|
||||
found = true
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if !found {
|
||||
// Add new note
|
||||
note := Note{
|
||||
ID: fmt.Sprintf("%s-%d", category, len(knowledge.Notes)+1),
|
||||
Category: category,
|
||||
Title: title,
|
||||
Content: content,
|
||||
CreatedAt: now,
|
||||
UpdatedAt: now,
|
||||
}
|
||||
knowledge.Notes = append(knowledge.Notes, note)
|
||||
}
|
||||
|
||||
knowledge.UpdatedAt = now
|
||||
|
||||
// Save to disk (encrypted)
|
||||
return s.saveToFile(guestID, knowledge)
|
||||
}
|
||||
|
||||
// DeleteNote removes a note
|
||||
func (s *Store) DeleteNote(guestID, noteID string) error {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
knowledge, ok := s.cache[guestID]
|
||||
if !ok {
|
||||
return fmt.Errorf("guest not found: %s", guestID)
|
||||
}
|
||||
|
||||
// Find and remove note
|
||||
for i, note := range knowledge.Notes {
|
||||
if note.ID == noteID {
|
||||
knowledge.Notes = append(knowledge.Notes[:i], knowledge.Notes[i+1:]...)
|
||||
knowledge.UpdatedAt = time.Now()
|
||||
return s.saveToFile(guestID, knowledge)
|
||||
}
|
||||
}
|
||||
|
||||
return fmt.Errorf("note not found: %s", noteID)
|
||||
}
|
||||
|
||||
// GetNotesByCategory returns notes filtered by category
|
||||
func (s *Store) GetNotesByCategory(guestID, category string) ([]Note, error) {
|
||||
knowledge, err := s.GetKnowledge(guestID)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var notes []Note
|
||||
for _, note := range knowledge.Notes {
|
||||
if category == "" || note.Category == category {
|
||||
notes = append(notes, note)
|
||||
}
|
||||
}
|
||||
return notes, nil
|
||||
}
|
||||
|
||||
// FormatForContext formats knowledge for injection into AI context
|
||||
func (s *Store) FormatForContext(guestID string) string {
|
||||
knowledge, err := s.GetKnowledge(guestID)
|
||||
if err != nil {
|
||||
log.Warn().Err(err).Str("guest_id", guestID).Msg("Failed to load guest knowledge")
|
||||
return ""
|
||||
}
|
||||
|
||||
if len(knowledge.Notes) == 0 {
|
||||
return ""
|
||||
}
|
||||
|
||||
// Group notes by category
|
||||
byCategory := make(map[string][]Note)
|
||||
for _, note := range knowledge.Notes {
|
||||
byCategory[note.Category] = append(byCategory[note.Category], note)
|
||||
}
|
||||
|
||||
// Build formatted output with guidance on using this knowledge
|
||||
var result string
|
||||
result = fmt.Sprintf("\n## Previously Learned Information about %s\n", knowledge.GuestName)
|
||||
result += "**If relevant to the current task, use this saved information directly instead of rediscovering it.**\n"
|
||||
|
||||
categoryOrder := []string{"credential", "service", "path", "config", "learning", "history"}
|
||||
categoryNames := map[string]string{
|
||||
"credential": "Credentials",
|
||||
"service": "Services",
|
||||
"path": "Important Paths",
|
||||
"config": "Configuration",
|
||||
"learning": "Learnings",
|
||||
"history": "Session History",
|
||||
}
|
||||
|
||||
for _, cat := range categoryOrder {
|
||||
notes, ok := byCategory[cat]
|
||||
if !ok || len(notes) == 0 {
|
||||
continue
|
||||
}
|
||||
|
||||
result += fmt.Sprintf("\n### %s\n", categoryNames[cat])
|
||||
for _, note := range notes {
|
||||
result += fmt.Sprintf("- **%s**: %s\n", note.Title, note.Content)
|
||||
}
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// saveToFile persists knowledge to disk with encryption
|
||||
func (s *Store) saveToFile(guestID string, knowledge *GuestKnowledge) error {
|
||||
data, err := json.MarshalIndent(knowledge, "", " ")
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to marshal knowledge: %w", err)
|
||||
}
|
||||
|
||||
// Encrypt if crypto manager is available
|
||||
if s.crypto != nil {
|
||||
encrypted, err := s.crypto.Encrypt(data)
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to encrypt knowledge: %w", err)
|
||||
}
|
||||
data = encrypted
|
||||
}
|
||||
|
||||
filePath := s.guestFilePath(guestID)
|
||||
if err := os.WriteFile(filePath, data, 0600); err != nil {
|
||||
return fmt.Errorf("failed to write knowledge file: %w", err)
|
||||
}
|
||||
|
||||
// Remove legacy unencrypted file if it exists
|
||||
if s.crypto != nil {
|
||||
legacyPath := filepath.Join(s.dataDir, filepath.Base(guestID)+".json")
|
||||
if _, err := os.Stat(legacyPath); err == nil {
|
||||
os.Remove(legacyPath)
|
||||
log.Info().Str("guest_id", guestID).Msg("Removed legacy unencrypted knowledge file")
|
||||
}
|
||||
}
|
||||
|
||||
log.Debug().
|
||||
Str("guest_id", guestID).
|
||||
Int("notes", len(knowledge.Notes)).
|
||||
Bool("encrypted", s.crypto != nil).
|
||||
Msg("Saved guest knowledge")
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// ListGuests returns all guests that have knowledge stored
|
||||
func (s *Store) ListGuests() ([]string, error) {
|
||||
files, err := os.ReadDir(s.dataDir)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to read knowledge directory: %w", err)
|
||||
}
|
||||
|
||||
var guests []string
|
||||
for _, file := range files {
|
||||
ext := filepath.Ext(file.Name())
|
||||
if ext == ".json" || ext == ".enc" {
|
||||
guestID := file.Name()[:len(file.Name())-len(ext)]
|
||||
guests = append(guests, guestID)
|
||||
}
|
||||
}
|
||||
return guests, nil
|
||||
}
|
||||
|
||||
// FormatAllForContext returns a summary of all saved knowledge across all guests.
// This is used when no specific target is selected to give the AI full context.
func (s *Store) FormatAllForContext() string {
	guests, err := s.ListGuests()
	if err != nil || len(guests) == 0 {
		return ""
	}

	var sections []string
	totalNotes := 0

	for _, guestID := range guests {
		knowledge, err := s.GetKnowledge(guestID)
		if err != nil || len(knowledge.Notes) == 0 {
			continue
		}

		totalNotes += len(knowledge.Notes)

		// Build a summary for this guest
		guestName := knowledge.GuestName
		if guestName == "" {
			guestName = guestID
		}

		// Group notes by category
		byCategory := make(map[string][]Note)
		for _, note := range knowledge.Notes {
			byCategory[note.Category] = append(byCategory[note.Category], note)
		}

		guestSection := fmt.Sprintf("\n### %s (%s)", guestName, knowledge.GuestType)

		categoryOrder := []string{"credential", "service", "path", "config", "learning"}
		for _, cat := range categoryOrder {
			notes, ok := byCategory[cat]
			if !ok || len(notes) == 0 {
				continue
			}
			for _, note := range notes {
				// Mask credentials in the summary
				content := note.Content
				if cat == "credential" && len(content) > 6 {
					content = content[:2] + "****" + content[len(content)-2:]
				}
				guestSection += fmt.Sprintf("\n- **%s**: %s", note.Title, content)
			}
		}

		sections = append(sections, guestSection)
	}

	if len(sections) == 0 {
		return ""
	}

	result := fmt.Sprintf("\n\n## Saved Knowledge (%d notes across %d guests)\n", totalNotes, len(sections))
	result += "This is information learned from previous sessions. Use it to avoid rediscovery.\n"
	result += strings.Join(sections, "\n")

	return result
}
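The credential-masking rule used when summarizing notes (keep the first and last two characters, replace the middle with `****`, leave values of 6 characters or fewer untouched) can be sketched in isolation. The helper name `maskCredential` is my own for illustration, not an identifier from the Pulse codebase:

```go
package main

import "fmt"

// maskCredential mirrors the masking rule above: values longer than 6
// characters keep their first and last two characters with "****" in
// between; shorter values are returned unchanged (matching the len > 6 guard).
func maskCredential(content string) string {
	if len(content) > 6 {
		return content[:2] + "****" + content[len(content)-2:]
	}
	return content
}

func main() {
	fmt.Println(maskCredential("s3cretpassw0rd")) // s3****rd
	fmt.Println(maskCredential("abc"))            // abc (too short to mask)
}
```

Keeping the first and last two characters lets a user recognize which credential a note refers to without the summary ever exposing the full secret to the model.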
@@ -32,7 +32,8 @@ func NewAnthropicClient(apiKey, model string) *AnthropicClient {
 		apiKey: apiKey,
 		model:  model,
 		client: &http.Client{
-			Timeout: 120 * time.Second, // LLM responses can take a while
+			// 5 minutes - Opus and other large models can take a very long time
+			Timeout: 300 * time.Second,
 		},
 	}
 }
@@ -32,6 +32,13 @@ func NewFromConfig(cfg *config.AIConfig) (Provider, error) {
 	case config.AIProviderOllama:
 		return NewOllamaClient(cfg.GetModel(), cfg.GetBaseURL()), nil

+	case config.AIProviderDeepSeek:
+		if cfg.APIKey == "" {
+			return nil, fmt.Errorf("DeepSeek API key is required")
+		}
+		// DeepSeek uses OpenAI-compatible API
+		return NewOpenAIClient(cfg.APIKey, cfg.GetModel(), cfg.GetBaseURL()), nil
+
 	default:
 		return nil, fmt.Errorf("unknown provider: %s", cfg.Provider)
 	}
@@ -7,14 +7,20 @@ import (
 	"fmt"
 	"io"
 	"net/http"
+	"strings"
 	"time"

 	"github.com/rs/zerolog/log"
 )

 const (
-	openaiAPIURL = "https://api.openai.com/v1/chat/completions"
+	openaiAPIURL         = "https://api.openai.com/v1/chat/completions"
+	openaiMaxRetries     = 3
+	openaiInitialBackoff = 2 * time.Second
 )

 // OpenAIClient implements the Provider interface for OpenAI's API
+// Also works with OpenAI-compatible APIs like DeepSeek
 type OpenAIClient struct {
 	apiKey  string
 	model   string
@@ -32,7 +38,8 @@ func NewOpenAIClient(apiKey, model, baseURL string) *OpenAIClient {
 		model:   model,
 		baseURL: baseURL,
 		client: &http.Client{
-			Timeout: 120 * time.Second,
+			// 5 minutes timeout - DeepSeek reasoning models can take a long time
+			Timeout: 300 * time.Second,
 		},
 	}
 }
@@ -44,15 +51,52 @@ func (c *OpenAIClient) Name() string {

 // openaiRequest is the request body for the OpenAI API
 type openaiRequest struct {
-	Model       string          `json:"model"`
-	Messages    []openaiMessage `json:"messages"`
-	MaxTokens   int             `json:"max_tokens,omitempty"`
-	Temperature float64         `json:"temperature,omitempty"`
+	Model       string          `json:"model"`
+	Messages    []openaiMessage `json:"messages"`
+	MaxTokens   int             `json:"max_tokens,omitempty"`
+	Temperature float64         `json:"temperature,omitempty"`
+	Tools       []openaiTool    `json:"tools,omitempty"`
+	ToolChoice  interface{}     `json:"tool_choice,omitempty"` // "auto", "none", or specific tool
 }
+
+// deepseekRequest extends openaiRequest with DeepSeek-specific fields
+type deepseekRequest struct {
+	Model      string          `json:"model"`
+	Messages   []openaiMessage `json:"messages"`
+	MaxTokens  int             `json:"max_tokens,omitempty"`
+	Tools      []openaiTool    `json:"tools,omitempty"`
+	ToolChoice interface{}     `json:"tool_choice,omitempty"`
+}
+
+// openaiTool represents a function tool in OpenAI format
+type openaiTool struct {
+	Type     string         `json:"type"` // always "function"
+	Function openaiFunction `json:"function"`
+}
+
+type openaiFunction struct {
+	Name        string                 `json:"name"`
+	Description string                 `json:"description,omitempty"`
+	Parameters  map[string]interface{} `json:"parameters,omitempty"`
+}

 type openaiMessage struct {
-	Role    string `json:"role"`
-	Content string `json:"content"`
+	Role             string           `json:"role"`
+	Content          interface{}      `json:"content,omitempty"`           // string or null for tool calls
+	ReasoningContent string           `json:"reasoning_content,omitempty"` // DeepSeek thinking mode
+	ToolCalls        []openaiToolCall `json:"tool_calls,omitempty"`        // For assistant messages with tool calls
+	ToolCallID       string           `json:"tool_call_id,omitempty"`      // For tool response messages
 }
+
+type openaiToolCall struct {
+	ID       string             `json:"id"`
+	Type     string             `json:"type"` // always "function"
+	Function openaiToolFunction `json:"function"`
+}
+
+type openaiToolFunction struct {
+	Name      string `json:"name"`
+	Arguments string `json:"arguments"` // JSON string of arguments
+}

 // openaiResponse is the response from the OpenAI API
@@ -67,8 +111,15 @@ type openaiResponse struct {

 type openaiChoice struct {
 	Index        int           `json:"index"`
-	Message      openaiMessage `json:"message"`
-	FinishReason string        `json:"finish_reason"`
+	Message      openaiRespMsg `json:"message"`
+	FinishReason string        `json:"finish_reason"` // "stop", "tool_calls", etc.
 }
+
+type openaiRespMsg struct {
+	Role             string           `json:"role"`
+	Content          string           `json:"content,omitempty"`
+	ReasoningContent string           `json:"reasoning_content,omitempty"` // DeepSeek thinking mode
+	ToolCalls        []openaiToolCall `json:"tool_calls,omitempty"`
+}

 type openaiUsage struct {
@@ -87,6 +138,16 @@ type openaiErrorDetail struct {
 	Code    string `json:"code"`
 }

+// isDeepSeek returns true if this client is configured for DeepSeek
+func (c *OpenAIClient) isDeepSeek() bool {
+	return strings.Contains(c.baseURL, "deepseek.com")
+}
+
+// isDeepSeekReasoner returns true if using DeepSeek's reasoning model
+func (c *OpenAIClient) isDeepSeekReasoner() bool {
+	return c.isDeepSeek() && strings.Contains(c.model, "reasoner")
+}
+
 // Chat sends a chat request to the OpenAI API
 func (c *OpenAIClient) Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error) {
 	// Convert messages to OpenAI format
@@ -101,10 +162,45 @@ func (c *OpenAIClient) Chat(ctx context.Context, req ChatRequest) (*ChatResponse
 	}

 	for _, m := range req.Messages {
-		messages = append(messages, openaiMessage{
-			Role:    m.Role,
-			Content: m.Content,
-		})
+		msg := openaiMessage{
+			Role: m.Role,
+		}
+
+		// Handle tool calls in assistant messages
+		if len(m.ToolCalls) > 0 {
+			msg.Content = nil // Content is null when there are tool calls
+			if m.Content != "" {
+				msg.Content = m.Content
+			}
+			// For DeepSeek reasoner, include reasoning_content if present
+			if c.isDeepSeekReasoner() && m.ReasoningContent != "" {
+				msg.ReasoningContent = m.ReasoningContent
+			}
+			for _, tc := range m.ToolCalls {
+				argsJSON, _ := json.Marshal(tc.Input)
+				msg.ToolCalls = append(msg.ToolCalls, openaiToolCall{
+					ID:   tc.ID,
+					Type: "function",
+					Function: openaiToolFunction{
+						Name:      tc.Name,
+						Arguments: string(argsJSON),
+					},
+				})
+			}
+		} else if m.ToolResult != nil {
+			// This is a tool result message
+			msg.Role = "tool"
+			msg.Content = m.ToolResult.Content
+			msg.ToolCallID = m.ToolResult.ToolUseID
+		} else {
+			msg.Content = m.Content
+			// For assistant messages with reasoning content (DeepSeek)
+			if c.isDeepSeekReasoner() && m.ReasoningContent != "" {
+				msg.ReasoningContent = m.ReasoningContent
+			}
+		}
+
+		messages = append(messages, msg)
 	}

 	// Use provided model or fall back to client default
@@ -113,6 +209,7 @@ func (c *OpenAIClient) Chat(ctx context.Context, req ChatRequest) (*ChatResponse
 		model = c.model
 	}

+	// Build request
 	openaiReq := openaiRequest{
 		Model:    model,
 		Messages: messages,
@@ -122,40 +219,114 @@ func (c *OpenAIClient) Chat(ctx context.Context, req ChatRequest) (*ChatResponse
 		openaiReq.MaxTokens = req.MaxTokens
 	}

-	if req.Temperature > 0 {
+	// DeepSeek reasoner doesn't support temperature
+	if req.Temperature > 0 && !c.isDeepSeekReasoner() {
 		openaiReq.Temperature = req.Temperature
 	}

+	// Convert tools to OpenAI format
+	if len(req.Tools) > 0 {
+		for _, t := range req.Tools {
+			// Skip non-function tools (like web_search)
+			if t.Type != "" && t.Type != "function" {
+				continue
+			}
+			openaiReq.Tools = append(openaiReq.Tools, openaiTool{
+				Type: "function",
+				Function: openaiFunction{
+					Name:        t.Name,
+					Description: t.Description,
+					Parameters:  t.InputSchema,
+				},
+			})
+		}
+		if len(openaiReq.Tools) > 0 {
+			openaiReq.ToolChoice = "auto"
+		}
+	}
+
 	body, err := json.Marshal(openaiReq)
 	if err != nil {
 		return nil, fmt.Errorf("failed to marshal request: %w", err)
 	}

-	httpReq, err := http.NewRequestWithContext(ctx, "POST", c.baseURL, bytes.NewReader(body))
-	if err != nil {
-		return nil, fmt.Errorf("failed to create request: %w", err)
-	}
-
-	httpReq.Header.Set("Content-Type", "application/json")
-	httpReq.Header.Set("Authorization", "Bearer "+c.apiKey)
-
-	resp, err := c.client.Do(httpReq)
-	if err != nil {
-		return nil, fmt.Errorf("request failed: %w", err)
-	}
-	defer resp.Body.Close()
-
-	respBody, err := io.ReadAll(resp.Body)
-	if err != nil {
-		return nil, fmt.Errorf("failed to read response: %w", err)
-	}
-
-	if resp.StatusCode != http.StatusOK {
-		var errResp openaiError
-		if err := json.Unmarshal(respBody, &errResp); err == nil && errResp.Error.Message != "" {
-			return nil, fmt.Errorf("API error (%d): %s", resp.StatusCode, errResp.Error.Message)
-		}
-		return nil, fmt.Errorf("API error (%d): %s", resp.StatusCode, string(respBody))
-	}
+	// Retry loop for transient errors (connection resets, 429, 5xx)
+	var respBody []byte
+	var lastErr error
+
+	for attempt := 0; attempt <= openaiMaxRetries; attempt++ {
+		if attempt > 0 {
+			// Exponential backoff: 2s, 4s, 8s
+			backoff := openaiInitialBackoff * time.Duration(1<<(attempt-1))
+			log.Warn().
+				Int("attempt", attempt).
+				Dur("backoff", backoff).
+				Str("last_error", lastErr.Error()).
+				Msg("Retrying OpenAI/DeepSeek API request after transient error")
+
+			select {
+			case <-ctx.Done():
+				return nil, ctx.Err()
+			case <-time.After(backoff):
+			}
+		}
+
+		httpReq, err := http.NewRequestWithContext(ctx, "POST", c.baseURL, bytes.NewReader(body))
+		if err != nil {
+			return nil, fmt.Errorf("failed to create request: %w", err)
+		}
+
+		httpReq.Header.Set("Content-Type", "application/json")
+		httpReq.Header.Set("Authorization", "Bearer "+c.apiKey)
+
+		resp, err := c.client.Do(httpReq)
+		if err != nil {
+			// Check if this is a retryable connection error
+			errStr := err.Error()
+			if strings.Contains(errStr, "connection reset") ||
+				strings.Contains(errStr, "connection refused") ||
+				strings.Contains(errStr, "EOF") ||
+				strings.Contains(errStr, "timeout") {
+				lastErr = fmt.Errorf("connection error: %w", err)
+				continue
+			}
+			return nil, fmt.Errorf("request failed: %w", err)
+		}
+
+		respBody, err = io.ReadAll(resp.Body)
+		resp.Body.Close()
+		if err != nil {
+			lastErr = fmt.Errorf("failed to read response: %w", err)
+			continue
+		}
+
+		// Check for retryable HTTP errors
+		if resp.StatusCode == 429 || resp.StatusCode == 502 || resp.StatusCode == 503 || resp.StatusCode == 504 {
+			var errResp openaiError
+			errMsg := string(respBody)
+			if err := json.Unmarshal(respBody, &errResp); err == nil && errResp.Error.Message != "" {
+				errMsg = errResp.Error.Message
+			}
+			lastErr = fmt.Errorf("API error (%d): %s", resp.StatusCode, errMsg)
+			continue
+		}
+
+		// Non-retryable error
+		if resp.StatusCode != http.StatusOK {
+			var errResp openaiError
+			if err := json.Unmarshal(respBody, &errResp); err == nil && errResp.Error.Message != "" {
+				return nil, fmt.Errorf("API error (%d): %s", resp.StatusCode, errResp.Error.Message)
+			}
+			return nil, fmt.Errorf("API error (%d): %s", resp.StatusCode, string(respBody))
+		}
+
+		// Success - break out of retry loop
+		lastErr = nil
+		break
+	}
+
+	if lastErr != nil {
+		return nil, fmt.Errorf("request failed after %d retries: %w", openaiMaxRetries, lastErr)
+	}

 	var openaiResp openaiResponse
@@ -167,13 +338,33 @@ func (c *OpenAIClient) Chat(ctx context.Context, req ChatRequest) (*ChatResponse
 		return nil, fmt.Errorf("no response choices returned")
 	}

-	return &ChatResponse{
-		Content:      openaiResp.Choices[0].Message.Content,
-		Model:        openaiResp.Model,
-		StopReason:   openaiResp.Choices[0].FinishReason,
-		InputTokens:  openaiResp.Usage.PromptTokens,
-		OutputTokens: openaiResp.Usage.CompletionTokens,
-	}, nil
+	choice := openaiResp.Choices[0]
+	result := &ChatResponse{
+		Content:          choice.Message.Content,
+		ReasoningContent: choice.Message.ReasoningContent, // DeepSeek thinking mode
+		Model:            openaiResp.Model,
+		StopReason:       choice.FinishReason,
+		InputTokens:      openaiResp.Usage.PromptTokens,
+		OutputTokens:     openaiResp.Usage.CompletionTokens,
+	}
+
+	// Convert tool calls from OpenAI format to our format
+	if len(choice.Message.ToolCalls) > 0 {
+		result.StopReason = "tool_use" // Normalize to match Anthropic's format
+		for _, tc := range choice.Message.ToolCalls {
+			var input map[string]interface{}
+			if err := json.Unmarshal([]byte(tc.Function.Arguments), &input); err != nil {
+				input = map[string]interface{}{"raw": tc.Function.Arguments}
+			}
+			result.ToolCalls = append(result.ToolCalls, ToolCall{
+				ID:    tc.ID,
+				Name:  tc.Function.Name,
+				Input: input,
+			})
+		}
+	}
+
+	return result, nil
 }

 // TestConnection validates the API key by making a minimal request
@@ -7,10 +7,11 @@ import (

 // Message represents a chat message
 type Message struct {
-	Role       string      `json:"role"`                  // "user", "assistant", "system"
-	Content    string      `json:"content"`               // Text content (simple case)
-	ToolCalls  []ToolCall  `json:"tool_calls,omitempty"`  // For assistant messages with tool calls
-	ToolResult *ToolResult `json:"tool_result,omitempty"` // For user messages with tool results
+	Role             string      `json:"role"`                        // "user", "assistant", "system"
+	Content          string      `json:"content"`                     // Text content (simple case)
+	ReasoningContent string      `json:"reasoning_content,omitempty"` // DeepSeek thinking mode
+	ToolCalls        []ToolCall  `json:"tool_calls,omitempty"`        // For assistant messages with tool calls
+	ToolResult       *ToolResult `json:"tool_result,omitempty"`       // For user messages with tool results
 }

 // ToolCall represents a tool invocation from the AI
@@ -48,12 +49,13 @@ type ChatRequest struct {

 // ChatResponse represents a response from the AI provider
 type ChatResponse struct {
-	Content      string     `json:"content"`
-	Model        string     `json:"model"`
-	StopReason   string     `json:"stop_reason,omitempty"` // "end_turn", "tool_use"
-	ToolCalls    []ToolCall `json:"tool_calls,omitempty"`  // Tool invocations
-	InputTokens  int        `json:"input_tokens,omitempty"`
-	OutputTokens int        `json:"output_tokens,omitempty"`
+	Content          string     `json:"content"`
+	ReasoningContent string     `json:"reasoning_content,omitempty"` // DeepSeek thinking mode
+	Model            string     `json:"model"`
+	StopReason       string     `json:"stop_reason,omitempty"` // "end_turn", "tool_use"
+	ToolCalls        []ToolCall `json:"tool_calls,omitempty"`  // Tool invocations
+	InputTokens      int        `json:"input_tokens,omitempty"`
+	OutputTokens     int        `json:"output_tokens,omitempty"`
 }

 // Provider defines the interface for AI providers
354
internal/ai/routing.go
Normal file
@@ -0,0 +1,354 @@
// Package ai provides AI-powered diagnostic and command execution capabilities.
// This file contains the robust agent routing logic for executing commands on the correct host.
package ai

import (
	"fmt"
	"strconv"
	"strings"

	"github.com/rcourtman/pulse-go-rewrite/internal/agentexec"
	"github.com/rcourtman/pulse-go-rewrite/internal/config"
	"github.com/rs/zerolog/log"
)

// RoutingResult contains the result of agent routing
type RoutingResult struct {
	AgentID       string   // ID of the selected agent
	AgentHostname string   // Hostname of the selected agent
	TargetNode    string   // The node we're trying to reach
	TargetVMID    string   // The VMID (for container/VM targets)
	RoutingMethod string   // How we determined the route (for debugging)
	ClusterPeer   bool     // True if routing via a cluster peer
	Warnings      []string // Any warnings encountered during routing
}

// RoutingError represents a routing failure with actionable information
type RoutingError struct {
	TargetNode      string
	TargetVMID      int
	AvailableAgents []string
	Reason          string
	Suggestion      string
}

func (e *RoutingError) Error() string {
	if e.Suggestion != "" {
		return fmt.Sprintf("%s. %s", e.Reason, e.Suggestion)
	}
	return e.Reason
}

// routeToAgent determines which agent should execute a command.
// This is the authoritative routing function that should be used for all command execution.
//
// Routing priority:
//  1. VMID lookup from state (most reliable for pct/qm commands)
//  2. Explicit "node" field in context
//  3. Explicit "guest_node" field in context
//  4. "hostname" field for host targets
//  5. VMID extracted from target ID (last resort)
//
// Agent matching is EXACT only - no substring matching to prevent false positives.
// If no direct match, cluster peer routing is attempted.
// If all else fails, returns an explicit error rather than silently using the wrong agent.
func (s *Service) routeToAgent(req ExecuteRequest, command string, agents []agentexec.ConnectedAgent) (*RoutingResult, error) {
	result := &RoutingResult{}

	if len(agents) == 0 {
		return nil, &RoutingError{
			Reason:     "No agents are connected to Pulse",
			Suggestion: "Install pulse-agent on at least one host",
		}
	}

	// Build a map of available agents for quick lookup and error messages
	agentMap := make(map[string]agentexec.ConnectedAgent) // lowercase hostname -> agent
	var agentHostnames []string
	for _, agent := range agents {
		hostname := strings.TrimSpace(strings.ToLower(agent.Hostname))
		agentMap[hostname] = agent
		agentHostnames = append(agentHostnames, agent.Hostname)
	}

	// Step 1: Try VMID-based routing (most authoritative for pct/qm commands)
	if vmid, requiresOwnerNode, found := extractVMIDFromCommand(command); found && requiresOwnerNode {
		targetInstance := ""
		if inst, ok := req.Context["instance"].(string); ok {
			targetInstance = inst
		}

		guests := s.lookupGuestsByVMID(vmid, targetInstance)

		if len(guests) == 0 {
			result.Warnings = append(result.Warnings,
				fmt.Sprintf("VMID %d not found in Pulse state - routing based on context", vmid))
		} else if len(guests) == 1 {
			result.TargetNode = strings.ToLower(guests[0].Node)
			result.RoutingMethod = "vmid_lookup"
			log.Info().
				Int("vmid", vmid).
				Str("node", guests[0].Node).
				Str("guest_name", guests[0].Name).
				Msg("Routed command via VMID state lookup")
		} else {
			// Multiple matches - try to disambiguate
			if targetInstance != "" {
				for _, g := range guests {
					if strings.EqualFold(g.Instance, targetInstance) {
						result.TargetNode = strings.ToLower(g.Node)
						result.RoutingMethod = "vmid_lookup_with_instance"
						log.Info().
							Int("vmid", vmid).
							Str("node", g.Node).
							Str("instance", g.Instance).
							Msg("Resolved VMID collision using instance")
						break
					}
				}
			}
			if result.TargetNode == "" {
				// Return explicit error for VMID collision
				var locations []string
				for _, g := range guests {
					locations = append(locations, fmt.Sprintf("%s on %s (%s)", g.Name, g.Node, g.Instance))
				}
				return nil, &RoutingError{
					TargetVMID:      vmid,
					AvailableAgents: agentHostnames,
					Reason: fmt.Sprintf("VMID %d exists on multiple nodes: %s",
						vmid, strings.Join(locations, ", ")),
					Suggestion: "Specify the instance/cluster in your query to disambiguate",
				}
			}
		}
	}

	// Step 2: Try context-based routing (explicit node information)
	if result.TargetNode == "" {
		if node, ok := req.Context["node"].(string); ok && node != "" {
			result.TargetNode = strings.ToLower(node)
			result.RoutingMethod = "context_node"
			log.Debug().
				Str("node", node).
				Str("command", command).
				Msg("Routing via explicit 'node' in context")
		} else if node, ok := req.Context["guest_node"].(string); ok && node != "" {
			result.TargetNode = strings.ToLower(node)
			result.RoutingMethod = "context_guest_node"
			log.Debug().
				Str("guest_node", node).
				Str("command", command).
				Msg("Routing via 'guest_node' in context")
		} else if req.TargetType == "host" {
			if hostname, ok := req.Context["hostname"].(string); ok && hostname != "" {
				result.TargetNode = strings.ToLower(hostname)
				result.RoutingMethod = "context_hostname"
				log.Debug().
					Str("hostname", hostname).
					Str("command", command).
					Msg("Routing via 'hostname' in context")
			} else {
				// For host target type with no node info, log a warning
				// This is a common source of routing issues
				log.Warn().
					Str("target_type", req.TargetType).
					Str("target_id", req.TargetID).
					Str("command", command).
					Msg("Host command with no node/hostname in context - may route to wrong agent")
				result.Warnings = append(result.Warnings,
					"No target host specified in context. Use target_host parameter for reliable routing.")
			}
		}
	}

	// Step 3: Extract VMID from target ID and look up in state
	if result.TargetNode == "" && req.TargetID != "" {
		if vmid := extractVMIDFromTargetID(req.TargetID); vmid > 0 {
			result.TargetVMID = strconv.Itoa(vmid)

			// Try instance from context
			targetInstance := ""
			if inst, ok := req.Context["instance"].(string); ok {
				targetInstance = inst
			}

			guests := s.lookupGuestsByVMID(vmid, targetInstance)
			if len(guests) == 1 {
				result.TargetNode = strings.ToLower(guests[0].Node)
				result.RoutingMethod = "target_id_vmid_lookup"
				log.Debug().
					Int("vmid", vmid).
					Str("node", guests[0].Node).
					Str("target_id", req.TargetID).
					Msg("Resolved node from target ID VMID lookup")
			}
		}
	}

	// Step 4: Try to find exact matching agent
	if result.TargetNode != "" {
		targetNodeClean := strings.TrimSpace(strings.ToLower(result.TargetNode))

		// EXACT match only - no substring matching
		if agent, exists := agentMap[targetNodeClean]; exists {
			result.AgentID = agent.AgentID
			result.AgentHostname = agent.Hostname
			log.Debug().
				Str("target_node", result.TargetNode).
				Str("agent", agent.Hostname).
				Str("method", result.RoutingMethod).
				Msg("Exact agent match found")
			return result, nil
		}

		// Try cluster peer routing
		if peerAgentID := s.findClusterPeerAgent(targetNodeClean, agents); peerAgentID != "" {
			for _, agent := range agents {
				if agent.AgentID == peerAgentID {
					result.AgentID = peerAgentID
					result.AgentHostname = agent.Hostname
					result.ClusterPeer = true
					log.Info().
						Str("target_node", result.TargetNode).
						Str("peer_agent", agent.Hostname).
						Msg("Routing via cluster peer agent")
					return result, nil
				}
			}
		}

		// No agent available for this node
		return nil, &RoutingError{
			TargetNode:      result.TargetNode,
			AvailableAgents: agentHostnames,
			Reason:          fmt.Sprintf("No agent connected to node %q", result.TargetNode),
			Suggestion: fmt.Sprintf("Install pulse-agent on %q, or ensure it's in a cluster with %s",
				result.TargetNode, strings.Join(agentHostnames, ", ")),
		}
	}

	// Step 5: No target node determined - for host commands with no context, use first agent
	if req.TargetType == "host" && len(agents) == 1 {
		result.AgentID = agents[0].AgentID
		result.AgentHostname = agents[0].Hostname
		result.RoutingMethod = "single_agent_fallback"
		result.Warnings = append(result.Warnings,
			fmt.Sprintf("No target node specified, using the only connected agent (%s). For multi-agent setups, specify target_host.", agents[0].Hostname))
		log.Info().
			Str("agent", agents[0].Hostname).
			Str("target_type", req.TargetType).
			Msg("Routing via single-agent fallback")
		return result, nil
	}

	// Cannot determine where to route
	// Provide actionable error with available agents listed
	log.Error().
		Str("target_type", req.TargetType).
		Str("target_id", req.TargetID).
		Strs("available_agents", agentHostnames).
		Msg("Routing failed - cannot determine target agent")

	return nil, &RoutingError{
		AvailableAgents: agentHostnames,
		Reason:          "Cannot determine which agent should execute this command",
		Suggestion: fmt.Sprintf("Use target_host parameter with one of: %s. Or specify VMID in the command for pct/qm commands.",
			strings.Join(agentHostnames, ", ")),
	}
}

// extractVMIDFromTargetID extracts a numeric VMID from various target ID formats.
// Handles formats like:
//   - "delly-minipc-106" -> 106
//   - "minipc-106"       -> 106
//   - "106"              -> 106
//   - "lxc-106"          -> 106
//   - "vm-106"           -> 106
func extractVMIDFromTargetID(targetID string) int {
	if targetID == "" {
		return 0
	}

	// Try parsing the whole thing as a number first
	if vmid, err := strconv.Atoi(targetID); err == nil && vmid > 0 {
		return vmid
	}

	// Split by hyphen and take the last numeric part
	parts := strings.Split(targetID, "-")
	for i := len(parts) - 1; i >= 0; i-- {
		if vmid, err := strconv.Atoi(parts[i]); err == nil && vmid > 0 {
			return vmid
		}
	}

	return 0
}

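The extraction heuristic above — first try the whole ID as a number, then scan hyphen-separated parts from the right for the first positive integer — can be exercised standalone. A minimal sketch; the helper name `lastNumericPart` is illustrative, not an identifier from the package:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// lastNumericPart mirrors the heuristic: a fully numeric ID is returned
// as-is; otherwise hyphen-separated parts are scanned right-to-left and
// the first positive integer wins, so trailing VMIDs beat embedded numbers.
func lastNumericPart(targetID string) int {
	if vmid, err := strconv.Atoi(targetID); err == nil && vmid > 0 {
		return vmid
	}
	parts := strings.Split(targetID, "-")
	for i := len(parts) - 1; i >= 0; i-- {
		if vmid, err := strconv.Atoi(parts[i]); err == nil && vmid > 0 {
			return vmid
		}
	}
	return 0
}

func main() {
	fmt.Println(lastNumericPart("delly-minipc-106")) // 106
	fmt.Println(lastNumericPart("pve-node-01-106"))  // 106 (rightmost number wins)
	fmt.Println(lastNumericPart("mycontainer"))      // 0
}
```

Scanning from the right is what makes hyphenated node names like `pve-node-01-106` resolve to 106 rather than 1.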
// findClusterPeerAgent finds an agent that can execute commands for a node in the same cluster.
// For PVE clusters, any node can execute pvesh/vzdump commands, but pct exec/qm guest exec
// require the agent to be on the specific node.
func (s *Service) findClusterPeerAgent(targetNode string, agents []agentexec.ConnectedAgent) string {
	// Check for nil persistence
	if s.persistence == nil {
		return ""
	}

	// Load nodes config to check cluster membership
	nodesConfig, err := s.persistence.LoadNodesConfig()
	if err != nil || nodesConfig == nil {
		return ""
	}

	// Find which cluster the target node belongs to
	var targetCluster string
	var clusterEndpoints []config.ClusterEndpoint

	for _, pve := range nodesConfig.PVEInstances {
		if strings.EqualFold(pve.Name, targetNode) {
			if pve.IsCluster && pve.ClusterName != "" {
				targetCluster = pve.ClusterName
				clusterEndpoints = pve.ClusterEndpoints
			}
			break
		}
		// Also check cluster endpoints
		for _, ep := range pve.ClusterEndpoints {
			if strings.EqualFold(ep.NodeName, targetNode) {
				if pve.IsCluster && pve.ClusterName != "" {
					targetCluster = pve.ClusterName
					clusterEndpoints = pve.ClusterEndpoints
				}
				break
			}
		}
	}

	if targetCluster == "" {
		return ""
	}

	// Build list of cluster member nodes
	clusterNodes := make(map[string]bool)
	for _, ep := range clusterEndpoints {
		clusterNodes[strings.ToLower(ep.NodeName)] = true
	}

	// Find an agent on any cluster member
	for _, agent := range agents {
		agentHostname := strings.ToLower(agent.Hostname)
		if clusterNodes[agentHostname] {
			log.Debug().
				Str("target_node", targetNode).
				Str("cluster", targetCluster).
				Str("peer_agent", agent.Hostname).
				Msg("Found cluster peer agent")
			return agent.AgentID
		}
	}

	return ""
}
276
internal/ai/routing_test.go
Normal file
@@ -0,0 +1,276 @@
package ai

import (
	"testing"

	"github.com/rcourtman/pulse-go-rewrite/internal/agentexec"
)

func TestExtractVMIDFromTargetID(t *testing.T) {
	tests := []struct {
		name     string
		targetID string
		want     int
	}{
		// Standard formats
		{"plain vmid", "106", 106},
		{"node-vmid", "minipc-106", 106},
		{"instance-node-vmid", "delly-minipc-106", 106},

		// Edge cases with hyphenated names
		{"hyphenated-node-vmid", "pve-node-01-106", 106},
		{"hyphenated-instance-node-vmid", "my-cluster-pve-node-01-106", 106},

		// Type prefixes
		{"lxc prefix", "lxc-106", 106},
		{"vm prefix", "vm-106", 106},
		{"ct prefix", "ct-106", 106},

		// Non-numeric - should return 0
		{"non-numeric", "mycontainer", 0},
		{"no-vmid", "node-name", 0},
		{"empty", "", 0},

		// Large VMIDs (Proxmox uses up to 999999999)
		{"large vmid", "node-999999", 999999},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := extractVMIDFromTargetID(tt.targetID)
			if got != tt.want {
				t.Errorf("extractVMIDFromTargetID(%q) = %d, want %d", tt.targetID, got, tt.want)
			}
		})
	}
}

func TestRoutingError(t *testing.T) {
	t.Run("with suggestion", func(t *testing.T) {
		err := &RoutingError{
			TargetNode:      "minipc",
			AvailableAgents: []string{"delly", "pimox"},
			Reason:          "No agent connected to node \"minipc\"",
			Suggestion:      "Install pulse-agent on minipc",
		}

		want := "No agent connected to node \"minipc\". Install pulse-agent on minipc"
		if err.Error() != want {
			t.Errorf("Error() = %q, want %q", err.Error(), want)
		}
	})

	t.Run("without suggestion", func(t *testing.T) {
		err := &RoutingError{
			Reason: "No agents connected",
		}

		want := "No agents connected"
		if err.Error() != want {
			t.Errorf("Error() = %q, want %q", err.Error(), want)
		}
	})
}

func TestRouteToAgent_NoAgents(t *testing.T) {
	s := &Service{}

	req := ExecuteRequest{
		TargetType: "container",
		TargetID:   "minipc-106",
	}

	_, err := s.routeToAgent(req, "pct exec 106 -- hostname", nil)
	if err == nil {
		t.Error("expected error for no agents, got nil")
	}

	routingErr, ok := err.(*RoutingError)
	if !ok {
		t.Fatalf("expected RoutingError, got %T", err)
	}

	if routingErr.Suggestion == "" {
		t.Error("expected suggestion in error")
	}
}

func TestRouteToAgent_ExactMatch(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
		{AgentID: "agent-2", Hostname: "minipc"},
		{AgentID: "agent-3", Hostname: "pimox"},
	}

	tests := []struct {
		name         string
		req          ExecuteRequest
		command      string
		wantAgentID  string
		wantHostname string
	}{
		{
			name: "route by context node",
			req: ExecuteRequest{
				TargetType: "container",
				TargetID:   "delly-minipc-106",
				Context:    map[string]interface{}{"node": "minipc"},
			},
			command:      "hostname",
			wantAgentID:  "agent-2",
			wantHostname: "minipc",
		},
		{
			name: "route by context hostname for host target",
			req: ExecuteRequest{
				TargetType: "host",
				Context:    map[string]interface{}{"hostname": "delly"},
			},
			command:      "uptime",
			wantAgentID:  "agent-1",
			wantHostname: "delly",
		},
		{
			name: "route by guest_node context",
			req: ExecuteRequest{
				TargetType: "vm",
				TargetID:   "100",
				Context:    map[string]interface{}{"guest_node": "pimox"},
			},
			command:      "hostname",
			wantAgentID:  "agent-3",
			wantHostname: "pimox",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			result, err := s.routeToAgent(tt.req, tt.command, agents)
			if err != nil {
				t.Fatalf("unexpected error: %v", err)
			}

			if result.AgentID != tt.wantAgentID {
				t.Errorf("AgentID = %q, want %q", result.AgentID, tt.wantAgentID)
			}

			if result.AgentHostname != tt.wantHostname {
				t.Errorf("AgentHostname = %q, want %q", result.AgentHostname, tt.wantHostname)
			}
		})
	}
}

func TestRouteToAgent_NoSubstringMatching(t *testing.T) {
	s := &Service{}

	// Agent "mini" should NOT match node "minipc"
	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "mini"},
		{AgentID: "agent-2", Hostname: "pc"},
	}

	req := ExecuteRequest{
		TargetType: "container",
		Context:    map[string]interface{}{"node": "minipc"},
	}

	_, err := s.routeToAgent(req, "hostname", agents)
	if err == nil {
		t.Error("expected error when no exact match, got nil (substring matching may be occurring)")
	}

	routingErr, ok := err.(*RoutingError)
	if !ok {
		t.Fatalf("expected RoutingError, got %T", err)
	}

	if routingErr.TargetNode != "minipc" {
		t.Errorf("TargetNode = %q, want %q", routingErr.TargetNode, "minipc")
	}
}

func TestRouteToAgent_CaseInsensitive(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "MiniPC"},
	}

	req := ExecuteRequest{
		TargetType: "container",
		Context:    map[string]interface{}{"node": "minipc"}, // lowercase
	}

	result, err := s.routeToAgent(req, "hostname", agents)
	if err != nil {
		t.Fatalf("expected case-insensitive match, got error: %v", err)
	}

	if result.AgentID != "agent-1" {
		t.Errorf("AgentID = %q, want %q", result.AgentID, "agent-1")
	}
}

func TestRouteToAgent_HyphenatedNodeNames(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "pve-node-01"},
		{AgentID: "agent-2", Hostname: "pve-node-02"},
	}

	req := ExecuteRequest{
		TargetType: "container",
		Context:    map[string]interface{}{"node": "pve-node-02"},
	}

	result, err := s.routeToAgent(req, "hostname", agents)
	if err != nil {
		t.Fatalf("unexpected error for hyphenated node names: %v", err)
	}

	if result.AgentID != "agent-2" {
		t.Errorf("AgentID = %q, want %q", result.AgentID, "agent-2")
	}
}

func TestRouteToAgent_ActionableErrorMessages(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
	}

	req := ExecuteRequest{
		TargetType: "container",
		Context:    map[string]interface{}{"node": "minipc"},
	}

	_, err := s.routeToAgent(req, "hostname", agents)
	if err == nil {
		t.Fatal("expected error, got nil")
	}

	routingErr, ok := err.(*RoutingError)
	if !ok {
		t.Fatalf("expected RoutingError, got %T", err)
	}

	// Error should mention the target node
	if routingErr.TargetNode != "minipc" {
		t.Errorf("TargetNode = %q, want %q", routingErr.TargetNode, "minipc")
	}

	// Error should list available agents
	if len(routingErr.AvailableAgents) == 0 {
		t.Error("expected available agents in error")
	}

	// Error should have actionable suggestion
	if routingErr.Suggestion == "" {
		t.Error("expected suggestion in error message")
	}
}
File diff suppressed because it is too large
187	internal/ai/target_host_test.go	Normal file
@@ -0,0 +1,187 @@
package ai

import (
	"testing"

	"github.com/rcourtman/pulse-go-rewrite/internal/agentexec"
)

// TestRouteToAgent_TargetHostExplicit tests that explicit target_host in context
// takes priority for routing decisions
func TestRouteToAgent_TargetHostExplicit(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
		{AgentID: "agent-2", Hostname: "minipc"},
		{AgentID: "agent-3", Hostname: "pimox"},
	}

	tests := []struct {
		name         string
		req          ExecuteRequest
		command      string
		wantAgentID  string
		wantHostname string
		wantMethod   string
	}{
		{
			name: "explicit node in context routes correctly",
			req: ExecuteRequest{
				TargetType: "host", // run_on_host=true sets this
				TargetID:   "",     // run_on_host clears this
				Context:    map[string]interface{}{"node": "minipc"},
			},
			command:      "pct exec 106 -- hostname",
			wantAgentID:  "agent-2",
			wantHostname: "minipc",
			wantMethod:   "context_node",
		},
		{
			name: "guest_node also routes correctly for host commands",
			req: ExecuteRequest{
				TargetType: "host",
				TargetID:   "",
				Context:    map[string]interface{}{"guest_node": "pimox"},
			},
			command:      "qm guest exec 100 hostname",
			wantAgentID:  "agent-3",
			wantHostname: "pimox",
			wantMethod:   "context_guest_node",
		},
		{
			name: "node takes priority over guest_node",
			req: ExecuteRequest{
				TargetType: "host",
				TargetID:   "",
				Context: map[string]interface{}{
					"node":       "delly",
					"guest_node": "minipc", // Should be ignored when node is set
				},
			},
			command:      "uptime",
			wantAgentID:  "agent-1",
			wantHostname: "delly",
			wantMethod:   "context_node",
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			result, err := s.routeToAgent(tt.req, tt.command, agents)
			if err != nil {
				t.Fatalf("unexpected error: %v", err)
			}

			if result.AgentID != tt.wantAgentID {
				t.Errorf("AgentID = %q, want %q", result.AgentID, tt.wantAgentID)
			}

			if result.AgentHostname != tt.wantHostname {
				t.Errorf("AgentHostname = %q, want %q", result.AgentHostname, tt.wantHostname)
			}

			if result.RoutingMethod != tt.wantMethod {
				t.Errorf("RoutingMethod = %q, want %q", result.RoutingMethod, tt.wantMethod)
			}
		})
	}
}

// TestRouteToAgent_SingleAgentFallback tests that with only one agent,
// we fall back to it with a warning
func TestRouteToAgent_SingleAgentFallback(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
	}

	req := ExecuteRequest{
		TargetType: "host",
		TargetID:   "",
		Context:    nil, // No context at all
	}

	result, err := s.routeToAgent(req, "uptime", agents)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}

	if result.AgentID != "agent-1" {
		t.Errorf("AgentID = %q, want %q", result.AgentID, "agent-1")
	}

	if result.RoutingMethod != "single_agent_fallback" {
		t.Errorf("RoutingMethod = %q, want %q", result.RoutingMethod, "single_agent_fallback")
	}

	// Should have a warning about the fallback
	if len(result.Warnings) == 0 {
		t.Error("expected warning about fallback routing")
	}
}

// TestRouteToAgent_MultiAgentNoContext tests that with multiple agents
// and no context, we get a clear error
func TestRouteToAgent_MultiAgentNoContext(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
		{AgentID: "agent-2", Hostname: "minipc"},
	}

	req := ExecuteRequest{
		TargetType: "host",
		TargetID:   "",
		Context:    nil, // No context
	}

	_, err := s.routeToAgent(req, "uptime", agents)
	if err == nil {
		t.Fatal("expected error when no context with multiple agents")
	}

	routingErr, ok := err.(*RoutingError)
	if !ok {
		t.Fatalf("expected RoutingError, got %T", err)
	}

	// Should mention target_host in the suggestion
	if routingErr.Suggestion == "" {
		t.Error("expected suggestion in error")
	}

	// Should list available agents
	if len(routingErr.AvailableAgents) != 2 {
		t.Errorf("expected 2 available agents, got %d", len(routingErr.AvailableAgents))
	}
}

// TestRouteToAgent_VMIDInCommandWithContext tests that node context takes
// priority even when the command itself contains a VMID
func TestRouteToAgent_VMIDInCommandWithContext(t *testing.T) {
	s := &Service{}

	agents := []agentexec.ConnectedAgent{
		{AgentID: "agent-1", Hostname: "delly"},
		{AgentID: "agent-2", Hostname: "minipc"},
	}

	// Even with a VMID in the command, if we have node context, use it
	req := ExecuteRequest{
		TargetType: "host",
		TargetID:   "",
		Context:    map[string]interface{}{"node": "minipc"},
	}

	result, err := s.routeToAgent(req, "pct exec 106 -- hostname", agents)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}

	if result.AgentHostname != "minipc" {
		t.Errorf("AgentHostname = %q, want %q", result.AgentHostname, "minipc")
	}
}
@@ -3,6 +3,7 @@ package api
 import (
 	"context"
 	"encoding/json"
+	"io"
 	"net/http"
 	"strings"
 	"time"
@@ -158,10 +159,10 @@ func (h *AISettingsHandler) HandleUpdateAISettings(w http.ResponseWriter, r *htt
 	if req.Provider != nil {
 		provider := strings.ToLower(strings.TrimSpace(*req.Provider))
 		switch provider {
-		case config.AIProviderAnthropic, config.AIProviderOpenAI, config.AIProviderOllama:
+		case config.AIProviderAnthropic, config.AIProviderOpenAI, config.AIProviderOllama, config.AIProviderDeepSeek:
 			settings.Provider = provider
 		default:
-			http.Error(w, "Invalid provider. Must be 'anthropic', 'openai', or 'ollama'", http.StatusBadRequest)
+			http.Error(w, "Invalid provider. Must be 'anthropic', 'openai', 'ollama', or 'deepseek'", http.StatusBadRequest)
 			return
 		}
 	}
@@ -191,7 +192,7 @@ func (h *AISettingsHandler) HandleUpdateAISettings(w http.ResponseWriter, r *htt
 	// Only allow enabling if properly configured
 	if *req.Enabled {
 		switch settings.Provider {
-		case config.AIProviderAnthropic, config.AIProviderOpenAI:
+		case config.AIProviderAnthropic, config.AIProviderOpenAI, config.AIProviderDeepSeek:
 			if settings.APIKey == "" {
 				http.Error(w, "Cannot enable AI: API key is required for "+settings.Provider, http.StatusBadRequest)
 				return
@@ -438,7 +439,10 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 	rc := http.NewResponseController(w)
 	if err := rc.SetWriteDeadline(time.Time{}); err != nil {
 		log.Warn().Err(err).Msg("Failed to disable write deadline for SSE")
 		// Continue anyway - heartbeats should help keep connection alive
 	}
+	// Also disable read deadline
+	if err := rc.SetReadDeadline(time.Time{}); err != nil {
+		log.Warn().Err(err).Msg("Failed to disable read deadline for SSE")
+	}

 	// Flush headers immediately
@@ -453,17 +457,22 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 	// NOTE: We don't check r.Context().Done() because Vite proxy may close
 	// the request context prematurely. We detect real disconnection via write failures.
 	heartbeatDone := make(chan struct{})
+	var clientDisconnected bool
 	go func() {
 		ticker := time.NewTicker(5 * time.Second)
 		defer ticker.Stop()
 		for {
 			select {
 			case <-ticker.C:
+				// Extend write deadline before heartbeat
+				_ = rc.SetWriteDeadline(time.Now().Add(10 * time.Second))
 				// Send SSE comment as heartbeat
 				_, err := w.Write([]byte(": heartbeat\n\n"))
 				if err != nil {
-					log.Debug().Err(err).Msg("Heartbeat write failed, client disconnected")
-					cancel() // Cancel the AI request
+					log.Debug().Err(err).Msg("Heartbeat write failed, stopping heartbeat (AI continues)")
+					clientDisconnected = true
+					// Don't cancel the AI request - let it complete with its own timeout
+					// The SSE connection may have issues but the AI work can still finish
 					return
 				}
 				flusher.Flush()
@@ -475,8 +484,31 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 	}()
 	defer close(heartbeatDone)

+	// Helper to safely write SSE events, tracking if client disconnected
+	safeWrite := func(data []byte) bool {
+		if clientDisconnected {
+			return false
+		}
+		_ = rc.SetWriteDeadline(time.Now().Add(10 * time.Second))
+		_, err := w.Write(data)
+		if err != nil {
+			log.Debug().Err(err).Msg("Failed to write SSE event (client may have disconnected)")
+			clientDisconnected = true
+			return false
+		}
+		flusher.Flush()
+		return true
+	}
+
 	// Stream callback - write SSE events
 	callback := func(event ai.StreamEvent) {
+		// Skip the 'done' event from service - we'll send our own at the end
+		// This ensures 'complete' comes before 'done'
+		if event.Type == "done" {
+			log.Debug().Msg("Skipping service 'done' event - will send final 'done' after 'complete'")
+			return
+		}
+
 		data, err := json.Marshal(event)
 		if err != nil {
 			log.Error().Err(err).Msg("Failed to marshal stream event")
@@ -488,12 +520,7 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 			Msg("Streaming AI event")

 		// SSE format: data: <json>\n\n
-		_, writeErr := w.Write([]byte("data: " + string(data) + "\n\n"))
-		if writeErr != nil {
-			log.Debug().Err(writeErr).Msg("Failed to write SSE event (client may have disconnected)")
-			return
-		}
-		flusher.Flush()
+		safeWrite([]byte("data: " + string(data) + "\n\n"))
 	}

 	// Convert history from API type to service type
@@ -505,6 +532,16 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 		})
 	}

+	// Ensure we always send a final 'done' event
+	defer func() {
+		if !clientDisconnected {
+			doneEvent := ai.StreamEvent{Type: "done"}
+			data, _ := json.Marshal(doneEvent)
+			safeWrite([]byte("data: " + string(data) + "\n\n"))
+			log.Debug().Msg("Sent final 'done' event")
+		}
+	}()
+
 	// Execute with streaming
 	resp, err := h.aiService.ExecuteStream(ctx, ai.ExecuteRequest{
 		Prompt: req.Prompt,
@@ -519,8 +556,7 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 		// Send error event
 		errEvent := ai.StreamEvent{Type: "error", Data: err.Error()}
 		data, _ := json.Marshal(errEvent)
-		_, _ = w.Write([]byte("data: " + string(data) + "\n\n"))
-		flusher.Flush()
+		safeWrite([]byte("data: " + string(data) + "\n\n"))
 		return
 	}
@@ -531,7 +567,7 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 		Int("tool_calls", len(resp.ToolCalls)).
 		Msg("AI streaming request completed")

-	// Send final response with metadata
+	// Send final response with metadata (before 'done')
 	finalEvent := struct {
 		Type  string `json:"type"`
 		Model string `json:"model"`
@@ -546,8 +582,8 @@ func (h *AISettingsHandler) HandleExecuteStream(w http.ResponseWriter, r *http.R
 		ToolCalls: resp.ToolCalls,
 	}
 	data, _ := json.Marshal(finalEvent)
-	_, _ = w.Write([]byte("data: " + string(data) + "\n\n"))
-	flusher.Flush()
+	safeWrite([]byte("data: " + string(data) + "\n\n"))
+	// 'done' event is sent by the defer above
 }

 // AIRunCommandRequest is the request body for POST /api/ai/run-command
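Every event in the handler above goes out in the same Server-Sent Events wire framing. A small sketch of that `data: <json>\n\n` frame construction (the anonymous event struct here is a stand-in for the actual `ai.StreamEvent` type):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// sseFrame renders one Server-Sent Events data frame in the
// "data: <json>\n\n" format written by safeWrite in the handler.
func sseFrame(eventType, data string) string {
	payload, _ := json.Marshal(struct {
		Type string `json:"type"`
		Data string `json:"data,omitempty"`
	}{Type: eventType, Data: data})
	return fmt.Sprintf("data: %s\n\n", payload)
}
```

The blank line (second `\n`) is what terminates an SSE event, which is also why the heartbeat comment frame `: heartbeat\n\n` keeps the connection alive without emitting a client-visible event.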
@@ -557,8 +593,10 @@ type AIRunCommandRequest struct {
 	TargetID   string `json:"target_id"`
 	RunOnHost  bool   `json:"run_on_host"`
 	VMID       string `json:"vmid,omitempty"`
+	TargetHost string `json:"target_host,omitempty"` // Explicit host for routing
 }

 // HandleRunCommand executes a single approved command (POST /api/ai/run-command)
 func (h *AISettingsHandler) HandleRunCommand(w http.ResponseWriter, r *http.Request) {
 	if r.Method != http.MethodPost {
@@ -573,8 +611,17 @@ func (h *AISettingsHandler) HandleRunCommand(w http.ResponseWriter, r *http.Requ

 	// Parse request
 	r.Body = http.MaxBytesReader(w, r.Body, 16*1024)
+	bodyBytes, readErr := io.ReadAll(r.Body)
+	if readErr != nil {
+		log.Error().Err(readErr).Msg("Failed to read request body")
+		http.Error(w, "Invalid request body", http.StatusBadRequest)
+		return
+	}
+	log.Debug().Str("body", string(bodyBytes)).Msg("run-command request body")
+
 	var req AIRunCommandRequest
-	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
+	if err := json.Unmarshal(bodyBytes, &req); err != nil {
+		log.Error().Err(err).Str("body", string(bodyBytes)).Msg("Failed to decode JSON body")
 		http.Error(w, "Invalid request body", http.StatusBadRequest)
 		return
 	}
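The hunk above swaps streaming `json.NewDecoder` for read-all-then-`json.Unmarshal` so the raw bytes stay available for logging when decoding fails. A reduced sketch of that pattern, with a hypothetical struct mirroring the visible JSON tags of `AIRunCommandRequest` (the `command` field and the required-field rule are illustrative assumptions):

```go
package main

import (
	"encoding/json"
	"errors"
)

// runCommandReq is an illustrative stand-in for AIRunCommandRequest.
type runCommandReq struct {
	Command    string `json:"command"`
	TargetType string `json:"target_type"`
	TargetID   string `json:"target_id"`
	RunOnHost  bool   `json:"run_on_host"`
	TargetHost string `json:"target_host,omitempty"`
}

// parseRunCommand unmarshals already-read body bytes, so the caller can
// still log the raw payload alongside any decode error.
func parseRunCommand(body []byte) (runCommandReq, error) {
	var req runCommandReq
	if err := json.Unmarshal(body, &req); err != nil {
		return req, err
	}
	if req.Command == "" {
		return req, errors.New("command is required")
	}
	return req, nil
}
```

With a streaming decoder the body is consumed on failure; buffering it first trades a bounded allocation (capped by the 16 KB `MaxBytesReader` above) for a debuggable error log.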
@@ -589,6 +636,7 @@ func (h *AISettingsHandler) HandleRunCommand(w http.ResponseWriter, r *http.Requ
 		Str("target_type", req.TargetType).
 		Str("target_id", req.TargetID).
 		Bool("run_on_host", req.RunOnHost).
+		Str("target_host", req.TargetHost).
 		Msg("Executing approved command")

 	// Execute with timeout
@@ -601,7 +649,9 @@ func (h *AISettingsHandler) HandleRunCommand(w http.ResponseWriter, r *http.Requ
 		TargetID:   req.TargetID,
 		RunOnHost:  req.RunOnHost,
 		VMID:       req.VMID,
+		TargetHost: req.TargetHost,
 	})

 	if err != nil {
 		log.Error().Err(err).Msg("Failed to execute command")
 		http.Error(w, "Failed to execute command: "+err.Error(), http.StatusInternalServerError)
@@ -612,3 +662,515 @@ func (h *AISettingsHandler) HandleRunCommand(w http.ResponseWriter, r *http.Requ
|
||||
log.Error().Err(err).Msg("Failed to write run command response")
|
||||
}
|
||||
}
|
||||
|
||||
// HandleGetGuestKnowledge returns all notes for a guest
|
||||
func (h *AISettingsHandler) HandleGetGuestKnowledge(w http.ResponseWriter, r *http.Request) {
|
||||
guestID := r.URL.Query().Get("guest_id")
|
||||
if guestID == "" {
|
||||
http.Error(w, "guest_id is required", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
knowledge, err := h.aiService.GetGuestKnowledge(guestID)
|
||||
if err != nil {
|
||||
http.Error(w, "Failed to get knowledge: "+err.Error(), http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
if err := utils.WriteJSONResponse(w, knowledge); err != nil {
|
||||
log.Error().Err(err).Msg("Failed to write knowledge response")
|
||||
}
|
||||
}
|
||||
|
||||
// HandleSaveGuestNote saves a note for a guest
|
||||
func (h *AISettingsHandler) HandleSaveGuestNote(w http.ResponseWriter, r *http.Request) {
|
||||
var req struct {
|
||||
GuestID string `json:"guest_id"`
|
||||
GuestName string `json:"guest_name"`
|
||||
GuestType string `json:"guest_type"`
|
||||
Category string `json:"category"`
|
||||
Title string `json:"title"`
|
||||
Content string `json:"content"`
|
||||
}
|
||||
|
||||
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||
http.Error(w, "Invalid request body", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if req.GuestID == "" || req.Category == "" || req.Title == "" || req.Content == "" {
|
||||
http.Error(w, "guest_id, category, title, and content are required", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if err := h.aiService.SaveGuestNote(req.GuestID, req.GuestName, req.GuestType, req.Category, req.Title, req.Content); err != nil {
|
||||
http.Error(w, "Failed to save note: "+err.Error(), http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
w.WriteHeader(http.StatusOK)
|
||||
w.Write([]byte(`{"success": true}`))
|
||||
}
|
||||
|
||||
// HandleDeleteGuestNote deletes a note from a guest
|
||||
func (h *AISettingsHandler) HandleDeleteGuestNote(w http.ResponseWriter, r *http.Request) {
|
||||
var req struct {
|
||||
GuestID string `json:"guest_id"`
|
||||
NoteID string `json:"note_id"`
|
||||
}
|
||||
|
||||
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||
http.Error(w, "Invalid request body", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if req.GuestID == "" || req.NoteID == "" {
|
||||
http.Error(w, "guest_id and note_id are required", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if err := h.aiService.DeleteGuestNote(req.GuestID, req.NoteID); err != nil {
|
||||
http.Error(w, "Failed to delete note: "+err.Error(), http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
w.WriteHeader(http.StatusOK)
|
||||
w.Write([]byte(`{"success": true}`))
|
||||
}
|
||||
|
||||
// HandleExportGuestKnowledge exports all knowledge for a guest as JSON
|
||||
func (h *AISettingsHandler) HandleExportGuestKnowledge(w http.ResponseWriter, r *http.Request) {
|
||||
guestID := r.URL.Query().Get("guest_id")
|
||||
if guestID == "" {
|
||||
http.Error(w, "guest_id is required", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
knowledge, err := h.aiService.GetGuestKnowledge(guestID)
|
||||
if err != nil {
|
||||
http.Error(w, "Failed to get knowledge: "+err.Error(), http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
// Set headers for file download
|
||||
w.Header().Set("Content-Type", "application/json")
|
||||
w.Header().Set("Content-Disposition", "attachment; filename=\"pulse-notes-"+guestID+".json\"")
|
||||
|
||||
if err := json.NewEncoder(w).Encode(knowledge); err != nil {
|
||||
log.Error().Err(err).Msg("Failed to encode knowledge export")
|
||||
}
|
||||
}
|
||||
|
||||
// HandleImportGuestKnowledge imports knowledge from a JSON export
|
||||
func (h *AISettingsHandler) HandleImportGuestKnowledge(w http.ResponseWriter, r *http.Request) {
|
||||
if r.Method != http.MethodPost {
|
||||
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
|
||||
return
|
||||
}
|
||||
|
||||
// Limit request body size to 1MB
|
||||
r.Body = http.MaxBytesReader(w, r.Body, 1024*1024)
|
||||
|
||||
var importData struct {
|
||||
GuestID string `json:"guest_id"`
|
||||
GuestName string `json:"guest_name"`
|
||||
GuestType string `json:"guest_type"`
|
||||
Notes []struct {
|
||||
Category string `json:"category"`
|
||||
Title string `json:"title"`
|
||||
Content string `json:"content"`
|
||||
} `json:"notes"`
|
||||
Merge bool `json:"merge"` // If true, add to existing notes; if false, replace
|
||||
}
|
||||
|
||||
if err := json.NewDecoder(r.Body).Decode(&importData); err != nil {
|
||||
http.Error(w, "Invalid import data: "+err.Error(), http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if importData.GuestID == "" {
|
||||
http.Error(w, "guest_id is required in import data", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if len(importData.Notes) == 0 {
|
||||
http.Error(w, "No notes to import", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
// If not merging, we need to delete existing notes first
|
||||
if !importData.Merge {
|
||||
existing, err := h.aiService.GetGuestKnowledge(importData.GuestID)
|
||||
if err == nil && existing != nil {
|
||||
for _, note := range existing.Notes {
|
||||
_ = h.aiService.DeleteGuestNote(importData.GuestID, note.ID)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Import each note
|
||||
imported := 0
|
||||
for _, note := range importData.Notes {
|
||||
if note.Category == "" || note.Title == "" || note.Content == "" {
|
||||
continue
|
||||
}
|
||||
if err := h.aiService.SaveGuestNote(
|
||||
importData.GuestID,
|
||||
importData.GuestName,
|
||||
importData.GuestType,
|
||||
note.Category,
|
||||
note.Title,
|
||||
note.Content,
|
||||
); err != nil {
|
||||
log.Warn().Err(err).Str("title", note.Title).Msg("Failed to import note")
|
||||
continue
|
||||
}
|
||||
imported++
|
||||
}
|
||||
|
||||
w.Header().Set("Content-Type", "application/json")
|
||||
json.NewEncoder(w).Encode(map[string]interface{}{
|
||||
"success": true,
|
||||
"imported": imported,
|
||||
"total": len(importData.Notes),
|
||||
})
|
||||
}
|
||||
|
||||
// HandleClearGuestKnowledge deletes all notes for a guest
|
||||
func (h *AISettingsHandler) HandleClearGuestKnowledge(w http.ResponseWriter, r *http.Request) {
|
||||
if r.Method != http.MethodPost {
|
||||
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
|
||||
return
|
||||
}
|
||||
|
||||
var req struct {
|
||||
GuestID string `json:"guest_id"`
|
||||
Confirm bool `json:"confirm"`
|
||||
}
|
||||
|
||||
if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
|
||||
http.Error(w, "Invalid request body", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if req.GuestID == "" {
|
||||
http.Error(w, "guest_id is required", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
if !req.Confirm {
|
||||
http.Error(w, "confirm must be true to clear all notes", http.StatusBadRequest)
|
||||
return
|
||||
}
|
||||
|
||||
// Get existing knowledge and delete all notes
|
||||
existing, err := h.aiService.GetGuestKnowledge(req.GuestID)
|
||||
if err != nil {
|
||||
http.Error(w, "Failed to get knowledge: "+err.Error(), http.StatusInternalServerError)
|
||||
return
|
||||
}
|
||||
|
||||
deleted := 0
|
||||
for _, note := range existing.Notes {
|
||||
if err := h.aiService.DeleteGuestNote(req.GuestID, note.ID); err != nil {
|
||||
log.Warn().Err(err).Str("note_id", note.ID).Msg("Failed to delete note")
|
||||
continue
|
||||
}
|
||||
deleted++
|
||||
}
|
||||
|
||||
w.Header().Set("Content-Type", "application/json")
|
||||
json.NewEncoder(w).Encode(map[string]interface{}{
|
||||
"success": true,
|
||||
"deleted": deleted,
|
||||
})
|
||||
}
|
||||
|
||||
// HandleDebugContext returns the system prompt and context that would be sent to the AI
|
||||
// This is useful for debugging when the AI gives incorrect information
|
||||
func (h *AISettingsHandler) HandleDebugContext(w http.ResponseWriter, r *http.Request) {
|
||||
if r.Method != http.MethodGet {
|
||||
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
|
||||
return
|
||||
}
|
||||
|
||||
// Build a sample request to see what context would be sent
|
||||
req := ai.ExecuteRequest{
|
||||
Prompt: "Debug context request",
|
||||
TargetType: r.URL.Query().Get("target_type"),
|
||||
TargetID: r.URL.Query().Get("target_id"),
|
||||
}
|
||||
|
||||
// Get the debug context from the service
|
||||
debugInfo := h.aiService.GetDebugContext(req)
|
||||
|
||||
w.Header().Set("Content-Type", "application/json")
|
||||
json.NewEncoder(w).Encode(debugInfo)
|
||||
}
|
||||
|
||||
// HandleGetConnectedAgents returns the list of agents currently connected via WebSocket
// This is useful for debugging when AI can't reach certain hosts
func (h *AISettingsHandler) HandleGetConnectedAgents(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodGet {
		http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
		return
	}

	type agentInfo struct {
		AgentID     string `json:"agent_id"`
		Hostname    string `json:"hostname"`
		Version     string `json:"version"`
		Platform    string `json:"platform"`
		ConnectedAt string `json:"connected_at"`
	}

	var agents []agentInfo
	if h.agentServer != nil {
		for _, a := range h.agentServer.GetConnectedAgents() {
			agents = append(agents, agentInfo{
				AgentID:     a.AgentID,
				Hostname:    a.Hostname,
				Version:     a.Version,
				Platform:    a.Platform,
				ConnectedAt: a.ConnectedAt.Format(time.RFC3339),
			})
		}
	}

	response := map[string]interface{}{
		"count":  len(agents),
		"agents": agents,
		"note":   "Agents connect via WebSocket to /api/agent/ws. If a host is missing, check that pulse-agent is installed and can reach the Pulse server.",
	}

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(response)
}

// AIInvestigateAlertRequest is the request body for POST /api/ai/investigate-alert
type AIInvestigateAlertRequest struct {
	AlertID      string  `json:"alert_id"`
	ResourceID   string  `json:"resource_id"`
	ResourceName string  `json:"resource_name"`
	ResourceType string  `json:"resource_type"` // guest, node, storage, docker
	AlertType    string  `json:"alert_type"`    // cpu, memory, disk, offline, etc.
	Level        string  `json:"level"`         // warning, critical
	Value        float64 `json:"value"`
	Threshold    float64 `json:"threshold"`
	Message      string  `json:"message"`
	Duration     string  `json:"duration"` // How long the alert has been active
	Node         string  `json:"node,omitempty"`
	VMID         int     `json:"vmid,omitempty"`
}

// HandleInvestigateAlert investigates an alert using AI (POST /api/ai/investigate-alert)
// This is a dedicated endpoint for one-click alert investigation from the UI
func (h *AISettingsHandler) HandleInvestigateAlert(w http.ResponseWriter, r *http.Request) {
	// Handle CORS
	origin := r.Header.Get("Origin")
	if origin != "" {
		w.Header().Set("Access-Control-Allow-Origin", origin)
		w.Header().Set("Access-Control-Allow-Credentials", "true")
		w.Header().Set("Access-Control-Allow-Methods", "POST, OPTIONS")
		w.Header().Set("Access-Control-Allow-Headers", "Content-Type, Accept, Cookie")
		w.Header().Set("Vary", "Origin")
	}

	if r.Method == http.MethodOptions {
		w.WriteHeader(http.StatusOK)
		return
	}

	if r.Method != http.MethodPost {
		http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
		return
	}

	// Require authentication
	if !CheckAuth(h.config, w, r) {
		return
	}

	// Check if AI is enabled
	if !h.aiService.IsEnabled() {
		http.Error(w, "AI is not enabled or configured", http.StatusBadRequest)
		return
	}

	// Parse request
	r.Body = http.MaxBytesReader(w, r.Body, 16*1024)
	var req AIInvestigateAlertRequest
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, "Invalid request body", http.StatusBadRequest)
		return
	}

	// Build investigation prompt
	investigationPrompt := ai.GenerateAlertInvestigationPrompt(ai.AlertInvestigationRequest{
		AlertID:      req.AlertID,
		ResourceID:   req.ResourceID,
		ResourceName: req.ResourceName,
		ResourceType: req.ResourceType,
		AlertType:    req.AlertType,
		Level:        req.Level,
		Value:        req.Value,
		Threshold:    req.Threshold,
		Message:      req.Message,
		Duration:     req.Duration,
		Node:         req.Node,
		VMID:         req.VMID,
	})

	log.Info().
		Str("alert_id", req.AlertID).
		Str("resource", req.ResourceName).
		Str("type", req.AlertType).
		Msg("AI alert investigation started")

	// Set up SSE streaming
	w.Header().Set("Content-Type", "text/event-stream")
	w.Header().Set("Cache-Control", "no-cache")
	w.Header().Set("Connection", "keep-alive")
	w.Header().Set("X-Accel-Buffering", "no")
	w.Header().Set("Transfer-Encoding", "identity")

	flusher, ok := w.(http.Flusher)
	if !ok {
		http.Error(w, "Streaming not supported", http.StatusInternalServerError)
		return
	}

	// Disable write/read deadlines for SSE
	rc := http.NewResponseController(w)
	_ = rc.SetWriteDeadline(time.Time{})
	_ = rc.SetReadDeadline(time.Time{})

	flusher.Flush()

	// Create context with timeout
	ctx, cancel := context.WithTimeout(context.Background(), 300*time.Second)
	defer cancel()

	// Heartbeat routine
	heartbeatDone := make(chan struct{})
	var clientDisconnected bool
	go func() {
		ticker := time.NewTicker(5 * time.Second)
		defer ticker.Stop()
		for {
			select {
			case <-ticker.C:
				_ = rc.SetWriteDeadline(time.Now().Add(10 * time.Second))
				_, err := w.Write([]byte(": heartbeat\n\n"))
				if err != nil {
					clientDisconnected = true
					return
				}
				flusher.Flush()
			case <-heartbeatDone:
				return
			}
		}
	}()
	defer close(heartbeatDone)

	safeWrite := func(data []byte) bool {
		if clientDisconnected {
			return false
		}
		_ = rc.SetWriteDeadline(time.Now().Add(10 * time.Second))
		_, err := w.Write(data)
		if err != nil {
			clientDisconnected = true
			return false
		}
		flusher.Flush()
		return true
	}

	// Determine target type and ID from alert info
	targetType := req.ResourceType
	targetID := req.ResourceID

	// Map resource type to expected target type format
	switch req.ResourceType {
	case "guest":
		// Could be VM or container - try to determine from VMID
		if req.VMID > 0 {
			targetType = "container" // Default to container, AI will figure it out
		}
	case "docker":
		targetType = "docker_container"
	}

	// Stream callback
	callback := func(event ai.StreamEvent) {
		if event.Type == "done" {
			return
		}
		data, err := json.Marshal(event)
		if err != nil {
			return
		}
		safeWrite([]byte("data: " + string(data) + "\n\n"))
	}

	// Execute with streaming
	defer func() {
		if !clientDisconnected {
			doneEvent := ai.StreamEvent{Type: "done"}
			data, _ := json.Marshal(doneEvent)
			safeWrite([]byte("data: " + string(data) + "\n\n"))
		}
	}()

	resp, err := h.aiService.ExecuteStream(ctx, ai.ExecuteRequest{
		Prompt:     investigationPrompt,
		TargetType: targetType,
		TargetID:   targetID,
		Context: map[string]interface{}{
			"alertId":      req.AlertID,
			"alertType":    req.AlertType,
			"alertLevel":   req.Level,
			"alertMessage": req.Message,
			"guestName":    req.ResourceName,
			"node":         req.Node,
		},
	}, callback)

	if err != nil {
		log.Error().Err(err).Msg("AI alert investigation failed")
		errEvent := ai.StreamEvent{Type: "error", Data: err.Error()}
		data, _ := json.Marshal(errEvent)
		safeWrite([]byte("data: " + string(data) + "\n\n"))
		return
	}

	// Send completion event
	finalEvent := struct {
		Type         string             `json:"type"`
		Model        string             `json:"model"`
		InputTokens  int                `json:"input_tokens"`
		OutputTokens int                `json:"output_tokens"`
		ToolCalls    []ai.ToolExecution `json:"tool_calls,omitempty"`
	}{
		Type:         "complete",
		Model:        resp.Model,
		InputTokens:  resp.InputTokens,
		OutputTokens: resp.OutputTokens,
		ToolCalls:    resp.ToolCalls,
	}
	data, _ := json.Marshal(finalEvent)
	safeWrite([]byte("data: " + string(data) + "\n\n"))

	log.Info().
		Str("alert_id", req.AlertID).
		Str("model", resp.Model).
		Int("tool_calls", len(resp.ToolCalls)).
		Msg("AI alert investigation completed")
}

// SetAlertProvider sets the alert provider for AI context
func (h *AISettingsHandler) SetAlertProvider(ap ai.AlertProvider) {
	h.aiService.SetAlertProvider(ap)
}

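The streaming endpoints above frame each event as `data: {json}\n\n` and interleave `: heartbeat\n\n` comment frames to keep proxies from closing the connection. As a hedged illustration (not part of the repo), a client-side sketch of that framing could look like the following; the `streamEvent` field set is an assumption standing in for the real `ai.StreamEvent` type:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// streamEvent mirrors the JSON shape the handler marshals per SSE frame.
// The exact fields of ai.StreamEvent are assumed here for illustration.
type streamEvent struct {
	Type string `json:"type"`
	Data string `json:"data,omitempty"`
}

// parseSSEFrames splits a raw SSE byte stream into decoded events,
// skipping comment frames such as the ": heartbeat" keepalives.
func parseSSEFrames(raw string) []streamEvent {
	var events []streamEvent
	for _, frame := range strings.Split(raw, "\n\n") {
		frame = strings.TrimSpace(frame)
		if frame == "" || strings.HasPrefix(frame, ":") {
			continue // comment/heartbeat frame, carries no event
		}
		payload := strings.TrimPrefix(frame, "data: ")
		var ev streamEvent
		if err := json.Unmarshal([]byte(payload), &ev); err == nil {
			events = append(events, ev)
		}
	}
	return events
}

func main() {
	raw := ": heartbeat\n\n" +
		"data: {\"type\":\"text\",\"data\":\"Checking CPU...\"}\n\n" +
		"data: {\"type\":\"done\"}\n\n"
	for _, ev := range parseSSEFrames(raw) {
		fmt.Println(ev.Type)
	}
}
```

A real consumer would read the response body incrementally instead of splitting a complete string, but the frame boundaries and the comment-skip rule are the same.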
@@ -27,6 +27,7 @@ import (

	"github.com/rcourtman/pulse-go-rewrite/internal/agentbinaries"
	"github.com/rcourtman/pulse-go-rewrite/internal/agentexec"
+	"github.com/rcourtman/pulse-go-rewrite/internal/ai"
	"github.com/rcourtman/pulse-go-rewrite/internal/auth"
	"github.com/rcourtman/pulse-go-rewrite/internal/config"
	"github.com/rcourtman/pulse-go-rewrite/internal/models"
@@ -201,6 +202,7 @@ func (r *Router) setupRoutes() {
	r.mux.HandleFunc("/api/storage/", RequireAuth(r.config, RequireScope(config.ScopeMonitoringRead, r.handleStorage)))
	r.mux.HandleFunc("/api/storage-charts", RequireAuth(r.config, RequireScope(config.ScopeMonitoringRead, r.handleStorageCharts)))
	r.mux.HandleFunc("/api/charts", RequireAuth(r.config, RequireScope(config.ScopeMonitoringRead, r.handleCharts)))
+	r.mux.HandleFunc("/api/metrics-store/stats", RequireAuth(r.config, RequireScope(config.ScopeMonitoringRead, r.handleMetricsStoreStats)))
	r.mux.HandleFunc("/api/diagnostics", RequireAuth(r.config, r.handleDiagnostics))
	r.mux.HandleFunc("/api/diagnostics/temperature-proxy/register-nodes", RequireAdmin(r.config, RequireScope(config.ScopeSettingsWrite, r.handleDiagnosticsRegisterProxyNodes)))
	r.mux.HandleFunc("/api/diagnostics/docker/prepare-token", RequireAdmin(r.config, RequireScope(config.ScopeSettingsWrite, r.handleDiagnosticsDockerPrepareToken)))
@@ -1022,13 +1024,26 @@ func (r *Router) setupRoutes() {
	// Inject state provider so AI has access to full infrastructure context (VMs, containers, IPs)
	if r.monitor != nil {
		r.aiSettingsHandler.SetStateProvider(r.monitor)
		// Inject alert provider so AI has awareness of current alerts
		if alertManager := r.monitor.GetAlertManager(); alertManager != nil {
			r.aiSettingsHandler.SetAlertProvider(ai.NewAlertManagerAdapter(alertManager))
		}
	}
	r.mux.HandleFunc("/api/settings/ai", RequireAdmin(r.config, RequireScope(config.ScopeSettingsRead, r.aiSettingsHandler.HandleGetAISettings)))
	r.mux.HandleFunc("/api/settings/ai/update", RequireAdmin(r.config, RequireScope(config.ScopeSettingsWrite, r.aiSettingsHandler.HandleUpdateAISettings)))
	r.mux.HandleFunc("/api/ai/test", RequireAdmin(r.config, RequireScope(config.ScopeSettingsWrite, r.aiSettingsHandler.HandleTestAIConnection)))
	r.mux.HandleFunc("/api/ai/execute", RequireAuth(r.config, r.aiSettingsHandler.HandleExecute))
	r.mux.HandleFunc("/api/ai/execute/stream", RequireAuth(r.config, r.aiSettingsHandler.HandleExecuteStream))
	r.mux.HandleFunc("/api/ai/investigate-alert", RequireAuth(r.config, r.aiSettingsHandler.HandleInvestigateAlert))
	r.mux.HandleFunc("/api/ai/run-command", RequireAuth(r.config, r.aiSettingsHandler.HandleRunCommand))
	r.mux.HandleFunc("/api/ai/knowledge", RequireAuth(r.config, r.aiSettingsHandler.HandleGetGuestKnowledge))
	r.mux.HandleFunc("/api/ai/knowledge/save", RequireAuth(r.config, r.aiSettingsHandler.HandleSaveGuestNote))
	r.mux.HandleFunc("/api/ai/knowledge/delete", RequireAuth(r.config, r.aiSettingsHandler.HandleDeleteGuestNote))
	r.mux.HandleFunc("/api/ai/knowledge/export", RequireAuth(r.config, r.aiSettingsHandler.HandleExportGuestKnowledge))
	r.mux.HandleFunc("/api/ai/knowledge/import", RequireAuth(r.config, r.aiSettingsHandler.HandleImportGuestKnowledge))
	r.mux.HandleFunc("/api/ai/knowledge/clear", RequireAuth(r.config, r.aiSettingsHandler.HandleClearGuestKnowledge))
	r.mux.HandleFunc("/api/ai/debug/context", RequireAdmin(r.config, r.aiSettingsHandler.HandleDebugContext))
	r.mux.HandleFunc("/api/ai/agents", RequireAuth(r.config, r.aiSettingsHandler.HandleGetConnectedAgents))

	// Agent WebSocket for AI command execution
	r.mux.HandleFunc("/api/agent/ws", r.handleAgentWebSocket)
@@ -2909,6 +2924,44 @@ func (r *Router) handleStorageCharts(w http.ResponseWriter, req *http.Request) {
	}
}

// handleMetricsStoreStats returns statistics about the persistent metrics store
func (r *Router) handleMetricsStoreStats(w http.ResponseWriter, req *http.Request) {
	if req.Method != http.MethodGet {
		http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
		return
	}

	store := r.monitor.GetMetricsStore()
	if store == nil {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(map[string]interface{}{
			"enabled": false,
			"error":   "Persistent metrics store not initialized",
		})
		return
	}

	stats := store.GetStats()
	w.Header().Set("Content-Type", "application/json")
	if err := json.NewEncoder(w).Encode(map[string]interface{}{
		"enabled":       true,
		"dbPath":        stats.DBPath,
		"dbSize":        stats.DBSize,
		"rawCount":      stats.RawCount,
		"minuteCount":   stats.MinuteCount,
		"hourlyCount":   stats.HourlyCount,
		"dailyCount":    stats.DailyCount,
		"totalWrites":   stats.TotalWrites,
		"bufferSize":    stats.BufferSize,
		"lastFlush":     stats.LastFlush,
		"lastRollup":    stats.LastRollup,
		"lastRetention": stats.LastRetention,
	}); err != nil {
		log.Error().Err(err).Msg("Failed to encode metrics store stats")
		http.Error(w, "Internal server error", http.StatusInternalServerError)
	}
}

// handleConfig handles configuration requests
func (r *Router) handleConfig(w http.ResponseWriter, req *http.Request) {
	if req.Method != http.MethodGet {

@@ -4,7 +4,7 @@ package config
// This is stored in ai.enc (encrypted) in the config directory
type AIConfig struct {
	Enabled  bool   `json:"enabled"`
-	Provider string `json:"provider"` // "anthropic", "openai", "ollama"
+	Provider string `json:"provider"` // "anthropic", "openai", "ollama", "deepseek"
	APIKey   string `json:"api_key"`  // encrypted at rest (not needed for ollama)
	Model    string `json:"model"`    // e.g., "claude-opus-4-5-20250514", "gpt-4o", "llama3"
	BaseURL  string `json:"base_url"` // custom endpoint (required for ollama, optional for openai)
@@ -17,6 +17,7 @@ const (
	AIProviderAnthropic = "anthropic"
	AIProviderOpenAI    = "openai"
	AIProviderOllama    = "ollama"
+	AIProviderDeepSeek  = "deepseek"
)

// Default models per provider
@@ -24,7 +25,9 @@ const (
	DefaultAIModelAnthropic = "claude-opus-4-5-20251101"
	DefaultAIModelOpenAI    = "gpt-4o"
	DefaultAIModelOllama    = "llama3"
+	DefaultAIModelDeepSeek  = "deepseek-reasoner"
	DefaultOllamaBaseURL    = "http://localhost:11434"
+	DefaultDeepSeekBaseURL  = "https://api.deepseek.com/chat/completions"
)

// NewDefaultAIConfig returns an AIConfig with sensible defaults
@@ -43,7 +46,7 @@ func (c *AIConfig) IsConfigured() bool {
	}

	switch c.Provider {
-	case AIProviderAnthropic, AIProviderOpenAI:
+	case AIProviderAnthropic, AIProviderOpenAI, AIProviderDeepSeek:
		return c.APIKey != ""
	case AIProviderOllama:
		// Ollama doesn't need an API key
@@ -58,8 +61,11 @@ func (c *AIConfig) GetBaseURL() string {
	if c.BaseURL != "" {
		return c.BaseURL
	}
-	if c.Provider == AIProviderOllama {
+	switch c.Provider {
+	case AIProviderOllama:
		return DefaultOllamaBaseURL
+	case AIProviderDeepSeek:
+		return DefaultDeepSeekBaseURL
	}
	return ""
}
@@ -76,6 +82,8 @@ func (c *AIConfig) GetModel() string {
		return DefaultAIModelOpenAI
	case AIProviderOllama:
		return DefaultAIModelOllama
+	case AIProviderDeepSeek:
+		return DefaultAIModelDeepSeek
	default:
		return ""
	}

@@ -88,6 +88,11 @@ func newConfigPersistence(configDir string) (*ConfigPersistence, error) {
	return cp, nil
}

+// DataDir returns the configuration directory path
+func (c *ConfigPersistence) DataDir() string {
+	return c.configDir
+}
+
// EnsureConfigDir ensures the configuration directory exists
func (c *ConfigPersistence) EnsureConfigDir() error {
	return os.MkdirAll(c.configDir, 0700)

575 internal/metrics/store.go Normal file
@@ -0,0 +1,575 @@
// Package metrics provides persistent storage for time-series metrics data
// using SQLite for durability across restarts.
package metrics

import (
	"database/sql"
	"fmt"
	"os"
	"path/filepath"
	"sync"
	"time"

	"github.com/rs/zerolog/log"
	_ "modernc.org/sqlite"
)

// Tier represents the granularity of stored metrics
type Tier string

const (
	TierRaw    Tier = "raw"    // Raw data, ~5s intervals
	TierMinute Tier = "minute" // 1-minute averages
	TierHourly Tier = "hourly" // 1-hour averages
	TierDaily  Tier = "daily"  // 1-day averages
)

// MetricPoint represents a single metric data point
type MetricPoint struct {
	Timestamp time.Time
	Value     float64
	Min       float64 // For aggregated data
	Max       float64 // For aggregated data
}

// StoreConfig holds configuration for the metrics store
type StoreConfig struct {
	DBPath          string
	WriteBufferSize int           // Number of records to buffer before batch write
	FlushInterval   time.Duration // Max time between flushes
	RetentionRaw    time.Duration // How long to keep raw data
	RetentionMinute time.Duration // How long to keep minute data
	RetentionHourly time.Duration // How long to keep hourly data
	RetentionDaily  time.Duration // How long to keep daily data
}

// DefaultConfig returns sensible defaults for metrics storage
func DefaultConfig(dataDir string) StoreConfig {
	return StoreConfig{
		DBPath:          filepath.Join(dataDir, "metrics.db"),
		WriteBufferSize: 100,
		FlushInterval:   5 * time.Second,
		RetentionRaw:    2 * time.Hour,
		RetentionMinute: 24 * time.Hour,
		RetentionHourly: 7 * 24 * time.Hour,
		RetentionDaily:  90 * 24 * time.Hour,
	}
}

// bufferedMetric holds a metric waiting to be written
type bufferedMetric struct {
	resourceType string
	resourceID   string
	metricType   string
	value        float64
	timestamp    time.Time
}

// Store provides persistent metrics storage
type Store struct {
	db     *sql.DB
	config StoreConfig

	// Write buffer
	bufferMu sync.Mutex
	buffer   []bufferedMetric

	// Background workers
	stopCh   chan struct{}
	doneCh   chan struct{}
	stopOnce sync.Once
}

// NewStore creates a new metrics store with the given configuration
func NewStore(config StoreConfig) (*Store, error) {
	// Ensure directory exists
	dir := filepath.Dir(config.DBPath)
	if err := os.MkdirAll(dir, 0755); err != nil {
		return nil, fmt.Errorf("failed to create metrics directory: %w", err)
	}

	// Open database with WAL mode for better concurrent access
	db, err := sql.Open("sqlite", config.DBPath+"?_journal_mode=WAL&_busy_timeout=5000")
	if err != nil {
		return nil, fmt.Errorf("failed to open metrics database: %w", err)
	}

	// Configure connection pool (SQLite works best with single writer)
	db.SetMaxOpenConns(1)
	db.SetMaxIdleConns(1)
	db.SetConnMaxLifetime(0)

	store := &Store{
		db:     db,
		config: config,
		buffer: make([]bufferedMetric, 0, config.WriteBufferSize),
		stopCh: make(chan struct{}),
		doneCh: make(chan struct{}),
	}

	// Initialize schema
	if err := store.initSchema(); err != nil {
		db.Close()
		return nil, fmt.Errorf("failed to initialize schema: %w", err)
	}

	// Start background workers
	go store.backgroundWorker()

	log.Info().
		Str("path", config.DBPath).
		Int("bufferSize", config.WriteBufferSize).
		Msg("Metrics store initialized")

	return store, nil
}

// initSchema creates the database schema if it doesn't exist
func (s *Store) initSchema() error {
	schema := `
	-- Main metrics table
	CREATE TABLE IF NOT EXISTS metrics (
		id INTEGER PRIMARY KEY AUTOINCREMENT,
		resource_type TEXT NOT NULL,
		resource_id TEXT NOT NULL,
		metric_type TEXT NOT NULL,
		value REAL NOT NULL,
		min_value REAL,
		max_value REAL,
		timestamp INTEGER NOT NULL,
		tier TEXT NOT NULL DEFAULT 'raw'
	);

	-- Index for efficient queries by resource and time
	CREATE INDEX IF NOT EXISTS idx_metrics_lookup
		ON metrics(resource_type, resource_id, metric_type, tier, timestamp);

	-- Index for retention pruning
	CREATE INDEX IF NOT EXISTS idx_metrics_tier_time
		ON metrics(tier, timestamp);

	-- Metadata table for tracking rollup state
	CREATE TABLE IF NOT EXISTS metrics_meta (
		key TEXT PRIMARY KEY,
		value TEXT NOT NULL
	);
	`

	_, err := s.db.Exec(schema)
	if err != nil {
		return fmt.Errorf("failed to create schema: %w", err)
	}

	log.Debug().Msg("Metrics schema initialized")
	return nil
}

// Write adds a metric to the write buffer
func (s *Store) Write(resourceType, resourceID, metricType string, value float64, timestamp time.Time) {
	s.bufferMu.Lock()
	defer s.bufferMu.Unlock()

	s.buffer = append(s.buffer, bufferedMetric{
		resourceType: resourceType,
		resourceID:   resourceID,
		metricType:   metricType,
		value:        value,
		timestamp:    timestamp,
	})

	// Flush if buffer is full
	if len(s.buffer) >= s.config.WriteBufferSize {
		s.flushLocked()
	}
}

// flushLocked writes buffered metrics to the database (caller must hold bufferMu)
func (s *Store) flushLocked() {
	if len(s.buffer) == 0 {
		return
	}

	// Copy buffer for writing
	toWrite := make([]bufferedMetric, len(s.buffer))
	copy(toWrite, s.buffer)
	s.buffer = s.buffer[:0]

	// Write in background to not block callers
	go s.writeBatch(toWrite)
}

// writeBatch writes a batch of metrics to the database
func (s *Store) writeBatch(metrics []bufferedMetric) {
	if len(metrics) == 0 {
		return
	}

	tx, err := s.db.Begin()
	if err != nil {
		log.Error().Err(err).Msg("Failed to begin metrics transaction")
		return
	}

	stmt, err := tx.Prepare(`
		INSERT INTO metrics (resource_type, resource_id, metric_type, value, timestamp, tier)
		VALUES (?, ?, ?, ?, ?, 'raw')
	`)
	if err != nil {
		tx.Rollback()
		log.Error().Err(err).Msg("Failed to prepare metrics insert")
		return
	}
	defer stmt.Close()

	for _, m := range metrics {
		_, err := stmt.Exec(m.resourceType, m.resourceID, m.metricType, m.value, m.timestamp.Unix())
		if err != nil {
			log.Warn().Err(err).
				Str("resource", m.resourceID).
				Str("metric", m.metricType).
				Msg("Failed to insert metric")
		}
	}

	if err := tx.Commit(); err != nil {
		log.Error().Err(err).Msg("Failed to commit metrics batch")
		return
	}

	log.Debug().Int("count", len(metrics)).Msg("Wrote metrics batch")
}

// Query retrieves metrics for a resource within a time range
func (s *Store) Query(resourceType, resourceID, metricType string, start, end time.Time) ([]MetricPoint, error) {
	// Select appropriate tier based on time range
	tier := s.selectTier(end.Sub(start))

	rows, err := s.db.Query(`
		SELECT timestamp, value, COALESCE(min_value, value), COALESCE(max_value, value)
		FROM metrics
		WHERE resource_type = ? AND resource_id = ? AND metric_type = ? AND tier = ?
			AND timestamp >= ? AND timestamp <= ?
		ORDER BY timestamp ASC
	`, resourceType, resourceID, metricType, string(tier), start.Unix(), end.Unix())
	if err != nil {
		return nil, fmt.Errorf("failed to query metrics: %w", err)
	}
	defer rows.Close()

	var points []MetricPoint
	for rows.Next() {
		var ts int64
		var p MetricPoint
		if err := rows.Scan(&ts, &p.Value, &p.Min, &p.Max); err != nil {
			log.Warn().Err(err).Msg("Failed to scan metric row")
			continue
		}
		p.Timestamp = time.Unix(ts, 0)
		points = append(points, p)
	}

	return points, rows.Err()
}

// QueryAll retrieves all metric types for a resource within a time range
func (s *Store) QueryAll(resourceType, resourceID string, start, end time.Time) (map[string][]MetricPoint, error) {
	tier := s.selectTier(end.Sub(start))

	rows, err := s.db.Query(`
		SELECT metric_type, timestamp, value, COALESCE(min_value, value), COALESCE(max_value, value)
		FROM metrics
		WHERE resource_type = ? AND resource_id = ? AND tier = ?
			AND timestamp >= ? AND timestamp <= ?
		ORDER BY metric_type, timestamp ASC
	`, resourceType, resourceID, string(tier), start.Unix(), end.Unix())
	if err != nil {
		return nil, fmt.Errorf("failed to query all metrics: %w", err)
	}
	defer rows.Close()

	result := make(map[string][]MetricPoint)
	for rows.Next() {
		var metricType string
		var ts int64
		var p MetricPoint
		if err := rows.Scan(&metricType, &ts, &p.Value, &p.Min, &p.Max); err != nil {
			log.Warn().Err(err).Msg("Failed to scan metric row")
			continue
		}
		p.Timestamp = time.Unix(ts, 0)
		result[metricType] = append(result[metricType], p)
	}

	return result, rows.Err()
}

// selectTier chooses the appropriate data tier based on time range
func (s *Store) selectTier(duration time.Duration) Tier {
	switch {
	case duration <= s.config.RetentionRaw:
		return TierRaw
	case duration <= s.config.RetentionMinute:
		return TierMinute
	case duration <= s.config.RetentionHourly:
		return TierHourly
	default:
		return TierDaily
	}
}

// backgroundWorker runs periodic tasks
func (s *Store) backgroundWorker() {
	defer close(s.doneCh)

	flushTicker := time.NewTicker(s.config.FlushInterval)
	rollupTicker := time.NewTicker(5 * time.Minute)
	retentionTicker := time.NewTicker(1 * time.Hour)

	defer flushTicker.Stop()
	defer rollupTicker.Stop()
	defer retentionTicker.Stop()

	for {
		select {
		case <-s.stopCh:
			// Final flush before stopping
			s.Flush()
			return

		case <-flushTicker.C:
			s.Flush()

		case <-rollupTicker.C:
			s.runRollup()

		case <-retentionTicker.C:
			s.runRetention()
		}
	}
}

// Flush writes any buffered metrics to the database
func (s *Store) Flush() {
	s.bufferMu.Lock()
	defer s.bufferMu.Unlock()
	s.flushLocked()
}

// runRollup aggregates raw data into higher tiers
func (s *Store) runRollup() {
	start := time.Now()

	// Rollup raw -> minute (for data older than 5 minutes)
	s.rollupTier(TierRaw, TierMinute, time.Minute, 5*time.Minute)

	// Rollup minute -> hourly (for data older than 1 hour)
	s.rollupTier(TierMinute, TierHourly, time.Hour, time.Hour)

	// Rollup hourly -> daily (for data older than 24 hours)
	s.rollupTier(TierHourly, TierDaily, 24*time.Hour, 24*time.Hour)

	log.Debug().Dur("duration", time.Since(start)).Msg("Metrics rollup completed")
}

// rollupTier aggregates data from one tier to another
func (s *Store) rollupTier(fromTier, toTier Tier, bucketSize, minAge time.Duration) {
	cutoff := time.Now().Add(-minAge).Unix()
	bucketSecs := int64(bucketSize.Seconds())

	// Find distinct resource/metric combinations that need rollup
	rows, err := s.db.Query(`
		SELECT DISTINCT resource_type, resource_id, metric_type
		FROM metrics
		WHERE tier = ? AND timestamp < ?
	`, string(fromTier), cutoff)
	if err != nil {
		log.Error().Err(err).Str("tier", string(fromTier)).Msg("Failed to find rollup candidates")
		return
	}

	var candidates []struct {
		resourceType string
		resourceID   string
		metricType   string
	}

	for rows.Next() {
		var c struct {
			resourceType string
			resourceID   string
			metricType   string
		}
		if err := rows.Scan(&c.resourceType, &c.resourceID, &c.metricType); err == nil {
			candidates = append(candidates, c)
		}
	}
	rows.Close()

	if len(candidates) == 0 {
		return
	}

	// Process each candidate
	for _, c := range candidates {
		s.rollupCandidate(c.resourceType, c.resourceID, c.metricType, fromTier, toTier, bucketSecs, cutoff)
	}
}

// rollupCandidate aggregates a single resource/metric from one tier to another
|
||||
func (s *Store) rollupCandidate(resourceType, resourceID, metricType string, fromTier, toTier Tier, bucketSecs, cutoff int64) {
|
||||
tx, err := s.db.Begin()
|
||||
if err != nil {
|
||||
return
|
||||
}
|
||||
defer tx.Rollback()
|
||||
|
||||
// Aggregate data into buckets
|
||||
_, err = tx.Exec(`
|
||||
INSERT INTO metrics (resource_type, resource_id, metric_type, value, min_value, max_value, timestamp, tier)
|
||||
SELECT
|
||||
resource_type,
|
||||
resource_id,
|
||||
metric_type,
|
||||
AVG(value) as value,
|
||||
MIN(value) as min_value,
|
||||
MAX(value) as max_value,
|
||||
(timestamp / ?) * ? as bucket_ts,
|
||||
?
|
||||
FROM metrics
|
||||
WHERE resource_type = ? AND resource_id = ? AND metric_type = ?
|
||||
AND tier = ? AND timestamp < ?
|
||||
GROUP BY resource_type, resource_id, metric_type, bucket_ts
|
||||
`, bucketSecs, bucketSecs, string(toTier), resourceType, resourceID, metricType, string(fromTier), cutoff)
|
||||
|
||||
if err != nil {
|
||||
log.Warn().Err(err).
|
||||
Str("resource", resourceID).
|
||||
Str("from", string(fromTier)).
|
||||
Str("to", string(toTier)).
|
||||
Msg("Failed to rollup metrics")
|
||||
return
|
||||
}
|
||||
|
||||
// Delete rolled-up raw data
|
||||
_, err = tx.Exec(`
|
||||
DELETE FROM metrics
|
||||
WHERE resource_type = ? AND resource_id = ? AND metric_type = ?
|
||||
AND tier = ? AND timestamp < ?
|
||||
`, resourceType, resourceID, metricType, string(fromTier), cutoff)
|
||||
|
||||
if err != nil {
|
||||
log.Warn().Err(err).Msg("Failed to delete rolled-up metrics")
|
||||
return
|
||||
}
|
||||
|
||||
tx.Commit()
|
||||
}
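The bucket expression in the INSERT above, `(timestamp / ?) * ?`, relies on integer division to floor each Unix timestamp to the start of its bucket. A minimal sketch of the same arithmetic (`bucketTs` is an illustrative helper, not part of the store):

```go
package main

import "fmt"

// bucketTs floors a Unix timestamp to the start of its bucket,
// mirroring the SQL expression (timestamp / ?) * ? via integer division.
func bucketTs(ts, bucketSecs int64) int64 {
	return (ts / bucketSecs) * bucketSecs
}

func main() {
	// Samples inside the same 60-second bucket collapse to one timestamp,
	// which is what the GROUP BY on bucket_ts aggregates over.
	fmt.Println(bucketTs(965, 60))  // 960
	fmt.Println(bucketTs(1019, 60)) // 960
	fmt.Println(bucketTs(1020, 60)) // 1020 (next bucket)
}
```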

// runRetention deletes data older than retention period
func (s *Store) runRetention() {
	start := time.Now()
	now := time.Now()

	// Delete old data for each tier
	tiers := []struct {
		tier      Tier
		retention time.Duration
	}{
		{TierRaw, s.config.RetentionRaw},
		{TierMinute, s.config.RetentionMinute},
		{TierHourly, s.config.RetentionHourly},
		{TierDaily, s.config.RetentionDaily},
	}

	var totalDeleted int64
	for _, t := range tiers {
		cutoff := now.Add(-t.retention).Unix()
		result, err := s.db.Exec(`DELETE FROM metrics WHERE tier = ? AND timestamp < ?`, string(t.tier), cutoff)
		if err != nil {
			log.Warn().Err(err).Str("tier", string(t.tier)).Msg("Failed to prune metrics")
			continue
		}
		if affected, _ := result.RowsAffected(); affected > 0 {
			totalDeleted += affected
		}
	}

	if totalDeleted > 0 {
		log.Info().
			Int64("deleted", totalDeleted).
			Dur("duration", time.Since(start)).
			Msg("Metrics retention cleanup completed")
	}
}

// Close shuts down the store gracefully
func (s *Store) Close() error {
	s.stopOnce.Do(func() {
		close(s.stopCh)
	})

	// Wait for background worker to finish
	select {
	case <-s.doneCh:
	case <-time.After(5 * time.Second):
		log.Warn().Msg("Metrics store shutdown timed out")
	}

	return s.db.Close()
}
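Close combines two idiomatic Go shutdown pieces: `sync.Once` makes the stop signal safe to send more than once, and the done-channel `select` bounds how long Close waits for the background worker. A self-contained sketch of that pattern, assuming a single worker goroutine (the `worker` type is illustrative, not the actual Store):

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// worker illustrates the stopCh/doneCh/stopOnce shutdown pattern.
type worker struct {
	stopCh   chan struct{}
	doneCh   chan struct{}
	stopOnce sync.Once
}

func newWorker() *worker {
	w := &worker{stopCh: make(chan struct{}), doneCh: make(chan struct{})}
	go func() {
		defer close(w.doneCh) // signals Close that the worker has exited
		<-w.stopCh            // stand-in for the store's flush/rollup loop
	}()
	return w
}

// Close is safe to call more than once: only the first call closes stopCh.
// It returns false if the worker failed to stop within the timeout.
func (w *worker) Close() bool {
	w.stopOnce.Do(func() { close(w.stopCh) })
	select {
	case <-w.doneCh:
		return true
	case <-time.After(5 * time.Second):
		return false
	}
}

func main() {
	w := newWorker()
	fmt.Println(w.Close()) // true
	fmt.Println(w.Close()) // true: the second call does not panic
}
```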

// Stats holds metrics store statistics
type Stats struct {
	DBPath        string    `json:"dbPath"`
	DBSize        int64     `json:"dbSize"`
	RawCount      int64     `json:"rawCount"`
	MinuteCount   int64     `json:"minuteCount"`
	HourlyCount   int64     `json:"hourlyCount"`
	DailyCount    int64     `json:"dailyCount"`
	TotalWrites   int64     `json:"totalWrites"`
	BufferSize    int       `json:"bufferSize"`
	LastFlush     time.Time `json:"lastFlush"`
	LastRollup    time.Time `json:"lastRollup"`
	LastRetention time.Time `json:"lastRetention"`
}

// GetStats returns storage statistics
func (s *Store) GetStats() Stats {
	stats := Stats{
		DBPath: s.config.DBPath,
	}

	// Count by tier
	rows, err := s.db.Query(`SELECT tier, COUNT(*) FROM metrics GROUP BY tier`)
	if err == nil {
		defer rows.Close()
		for rows.Next() {
			var tier string
			var count int64
			if err := rows.Scan(&tier, &count); err == nil {
				switch tier {
				case "raw":
					stats.RawCount = count
				case "minute":
					stats.MinuteCount = count
				case "hourly":
					stats.HourlyCount = count
				case "daily":
					stats.DailyCount = count
				}
			}
		}
	}

	// Get database size
	if fi, err := os.Stat(s.config.DBPath); err == nil {
		stats.DBSize = fi.Size()
	}

	// Get buffer size
	s.bufferMu.Lock()
	stats.BufferSize = len(s.buffer)
	s.bufferMu.Unlock()

	return stats
}
@@ -24,6 +24,7 @@ import (
 	"github.com/rcourtman/pulse-go-rewrite/internal/discovery"
 	"github.com/rcourtman/pulse-go-rewrite/internal/errors"
 	"github.com/rcourtman/pulse-go-rewrite/internal/logging"
+	"github.com/rcourtman/pulse-go-rewrite/internal/metrics"
 	"github.com/rcourtman/pulse-go-rewrite/internal/mock"
 	"github.com/rcourtman/pulse-go-rewrite/internal/models"
 	"github.com/rcourtman/pulse-go-rewrite/internal/notifications"

@@ -553,6 +554,7 @@ type Monitor struct
 	startTime       time.Time
 	rateTracker     *RateTracker
 	metricsHistory  *MetricsHistory
+	metricsStore    *metrics.Store // Persistent SQLite metrics storage
 	alertManager    *alerts.Manager
 	notificationMgr *notifications.NotificationManager
 	configPersist   *config.ConfigPersistence
@@ -2554,7 +2556,7 @@ func checkContainerizedTempMonitoring() {

 	// Log warning
 	log.Warn().
-		Msg("🔐 SECURITY NOTICE: Pulse is running in a container with SSH-based temperature monitoring enabled. " +
+		Msg("SECURITY NOTICE: Pulse is running in a container with SSH-based temperature monitoring enabled. " +
 			"SSH private keys are stored inside the container, which could be a security risk if the container is compromised. " +
 			"Future versions will use agent-based architecture for better security. " +
 			"See documentation for hardening recommendations.")
@@ -2638,6 +2640,17 @@ func New(cfg *config.Config) (*Monitor, error) {
 	guestAgentVersionTimeout := parseDurationEnv("GUEST_AGENT_VERSION_TIMEOUT", defaultGuestAgentVersionTimeout)
 	guestAgentRetries := parseIntEnv("GUEST_AGENT_RETRIES", defaultGuestAgentRetries)

+	// Initialize persistent metrics store (SQLite)
+	var metricsStore *metrics.Store
+	metricsStoreConfig := metrics.DefaultConfig(cfg.DataPath)
+	ms, err := metrics.NewStore(metricsStoreConfig)
+	if err != nil {
+		log.Error().Err(err).Msg("Failed to initialize persistent metrics store - continuing with in-memory only")
+	} else {
+		metricsStore = ms
+		log.Info().Str("path", metricsStoreConfig.DBPath).Msg("Persistent metrics store initialized")
+	}
+
 	m := &Monitor{
 		config: cfg,
 		state:  models.NewState(),

@@ -2662,6 +2675,7 @@ func New(cfg *config.Config) (*Monitor, error) {
 		startTime:       time.Now(),
 		rateTracker:     NewRateTracker(),
 		metricsHistory:  NewMetricsHistory(1000, 24*time.Hour), // Keep up to 1000 points or 24 hours
+		metricsStore:    metricsStore, // Persistent SQLite storage
 		alertManager:    alerts.NewManager(),
 		notificationMgr: notifications.NewNotificationManager(cfg.PublicURL),
 		configPersist:   config.NewConfigPersistence(cfg.DataPath),
@@ -4880,6 +4894,12 @@ func (m *Monitor) pollPVEInstance(ctx context.Context, instanceName string, clie
 		m.metricsHistory.AddNodeMetric(modelNodes[i].ID, "cpu", modelNodes[i].CPU*100, now)
 		m.metricsHistory.AddNodeMetric(modelNodes[i].ID, "memory", modelNodes[i].Memory.Usage, now)
 		m.metricsHistory.AddNodeMetric(modelNodes[i].ID, "disk", modelNodes[i].Disk.Usage, now)
+		// Also write to persistent store
+		if m.metricsStore != nil {
+			m.metricsStore.Write("node", modelNodes[i].ID, "cpu", modelNodes[i].CPU*100, now)
+			m.metricsStore.Write("node", modelNodes[i].ID, "memory", modelNodes[i].Memory.Usage, now)
+			m.metricsStore.Write("node", modelNodes[i].ID, "disk", modelNodes[i].Disk.Usage, now)
+		}
 	}

 	// Check thresholds for alerts
@@ -5933,6 +5953,43 @@ func (m *Monitor) pollVMsAndContainersEfficient(ctx context.Context, instanceNam
 	m.state.UpdateVMsForInstance(instanceName, allVMs)
 	m.state.UpdateContainersForInstance(instanceName, allContainers)

+	// Record guest metrics history for running guests (enables sparkline/trends view)
+	now := time.Now()
+	for _, vm := range allVMs {
+		if vm.Status == "running" {
+			m.metricsHistory.AddGuestMetric(vm.ID, "cpu", vm.CPU*100, now)
+			m.metricsHistory.AddGuestMetric(vm.ID, "memory", vm.Memory.Usage, now)
+			if vm.Disk.Usage >= 0 {
+				m.metricsHistory.AddGuestMetric(vm.ID, "disk", vm.Disk.Usage, now)
+			}
+			// Also write to persistent store
+			if m.metricsStore != nil {
+				m.metricsStore.Write("vm", vm.ID, "cpu", vm.CPU*100, now)
+				m.metricsStore.Write("vm", vm.ID, "memory", vm.Memory.Usage, now)
+				if vm.Disk.Usage >= 0 {
+					m.metricsStore.Write("vm", vm.ID, "disk", vm.Disk.Usage, now)
+				}
+			}
+		}
+	}
+	for _, ct := range allContainers {
+		if ct.Status == "running" {
+			m.metricsHistory.AddGuestMetric(ct.ID, "cpu", ct.CPU*100, now)
+			m.metricsHistory.AddGuestMetric(ct.ID, "memory", ct.Memory.Usage, now)
+			if ct.Disk.Usage >= 0 {
+				m.metricsHistory.AddGuestMetric(ct.ID, "disk", ct.Disk.Usage, now)
+			}
+			// Also write to persistent store
+			if m.metricsStore != nil {
+				m.metricsStore.Write("container", ct.ID, "cpu", ct.CPU*100, now)
+				m.metricsStore.Write("container", ct.ID, "memory", ct.Memory.Usage, now)
+				if ct.Disk.Usage >= 0 {
+					m.metricsStore.Write("container", ct.ID, "disk", ct.Disk.Usage, now)
+				}
+			}
+		}
+	}
+
 	m.pollReplicationStatus(ctx, instanceName, client, allVMs)

 	log.Info().
@@ -6943,6 +7000,11 @@ func (m *Monitor) GetConfigPersistence() *config.ConfigPersistence {
 	return m.configPersist
 }

+// GetMetricsStore returns the persistent metrics store
+func (m *Monitor) GetMetricsStore() *metrics.Store {
+	return m.metricsStore
+}
+
 // pollStorageBackupsWithNodes polls backups using a provided nodes list to avoid duplicate GetNodes calls
 func (m *Monitor) pollStorageBackupsWithNodes(ctx context.Context, instanceName string, client PVEClientInterface, nodes []proxmox.Node, nodeEffectiveStatus map[string]string) {

@@ -7781,6 +7843,15 @@ func (m *Monitor) Stop() {
 		m.notificationMgr.Stop()
 	}

+	// Close persistent metrics store (flushes buffered data)
+	if m.metricsStore != nil {
+		if err := m.metricsStore.Close(); err != nil {
+			log.Error().Err(err).Msg("Failed to close metrics store")
+		} else {
+			log.Info().Msg("Metrics store closed successfully")
+		}
+	}
+
 	log.Info().Msg("Monitor stopped")
 }

@@ -842,6 +842,26 @@ func (m *Monitor) pollVMsWithNodes(ctx context.Context, instanceName string, cli
 	// Update state with all VMs
 	m.state.UpdateVMsForInstance(instanceName, allVMs)

+	// Record guest metrics history for running VMs (enables sparkline/trends view)
+	now := time.Now()
+	for _, vm := range allVMs {
+		if vm.Status == "running" {
+			m.metricsHistory.AddGuestMetric(vm.ID, "cpu", vm.CPU*100, now)
+			m.metricsHistory.AddGuestMetric(vm.ID, "memory", vm.Memory.Usage, now)
+			if vm.Disk.Usage >= 0 {
+				m.metricsHistory.AddGuestMetric(vm.ID, "disk", vm.Disk.Usage, now)
+			}
+			// Also write to persistent store
+			if m.metricsStore != nil {
+				m.metricsStore.Write("vm", vm.ID, "cpu", vm.CPU*100, now)
+				m.metricsStore.Write("vm", vm.ID, "memory", vm.Memory.Usage, now)
+				if vm.Disk.Usage >= 0 {
+					m.metricsStore.Write("vm", vm.ID, "disk", vm.Disk.Usage, now)
+				}
+			}
+		}
+	}
+
 	duration := time.Since(startTime)
 	log.Info().
 		Str("instance", instanceName).
@@ -1109,6 +1129,26 @@ func (m *Monitor) pollContainersWithNodes(ctx context.Context, instanceName stri
 	// Update state with all containers
 	m.state.UpdateContainersForInstance(instanceName, allContainers)

+	// Record guest metrics history for running containers (enables sparkline/trends view)
+	now := time.Now()
+	for _, ct := range allContainers {
+		if ct.Status == "running" {
+			m.metricsHistory.AddGuestMetric(ct.ID, "cpu", ct.CPU*100, now)
+			m.metricsHistory.AddGuestMetric(ct.ID, "memory", ct.Memory.Usage, now)
+			if ct.Disk.Usage >= 0 {
+				m.metricsHistory.AddGuestMetric(ct.ID, "disk", ct.Disk.Usage, now)
+			}
+			// Also write to persistent store
+			if m.metricsStore != nil {
+				m.metricsStore.Write("container", ct.ID, "cpu", ct.CPU*100, now)
+				m.metricsStore.Write("container", ct.ID, "memory", ct.Memory.Usage, now)
+				if ct.Disk.Usage >= 0 {
+					m.metricsStore.Write("container", ct.ID, "disk", ct.Disk.Usage, now)
+				}
+			}
+		}
+	}
+
 	duration := time.Since(startTime)
 	log.Info().
 		Str("instance", instanceName).

@@ -42,11 +42,22 @@ func GetenvTrim(key string) string {
 	return strings.TrimSpace(os.Getenv(key))
 }

-// NormalizeVersion strips the "v" prefix from version strings for comparison.
-// This normalizes versions like "v4.33.1" to "4.33.1" so that version strings
-// from different sources (agent vs server) can be compared consistently.
+// NormalizeVersion normalizes version strings for comparison by:
+// 1. Stripping whitespace
+// 2. Removing the "v" prefix (e.g., "v4.33.1" -> "4.33.1")
+// 3. Stripping build metadata after "+" (e.g., "4.36.2+git.14.dirty" -> "4.36.2")
+//
+// Per semver spec, build metadata MUST be ignored when determining version precedence.
+// This fixes issues where dirty builds like "4.36.2+git.14.g469307d6.dirty" would
+// incorrectly be treated as newer than "4.36.2", causing infinite update loops.
 func NormalizeVersion(version string) string {
-	return strings.TrimPrefix(strings.TrimSpace(version), "v")
+	v := strings.TrimPrefix(strings.TrimSpace(version), "v")
+	// Strip build metadata (everything after +)
+	// Per semver: build metadata MUST be ignored when determining version precedence
+	if idx := strings.Index(v, "+"); idx != -1 {
+		v = v[:idx]
+	}
+	return v
 }

 // CompareVersions compares two semver-like version strings.

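The normalization rules can be exercised on their own; the sketch below mirrors the new `NormalizeVersion` body (as a local copy for illustration, not an import of the real package):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeVersion mirrors NormalizeVersion from the diff: trim whitespace,
// strip a single leading "v", and drop semver build metadata (everything
// after "+"), which must be ignored when comparing versions.
func normalizeVersion(version string) string {
	v := strings.TrimPrefix(strings.TrimSpace(version), "v")
	if idx := strings.Index(v, "+"); idx != -1 {
		v = v[:idx]
	}
	return v
}

func main() {
	// A dirty build now compares equal to its clean counterpart,
	// which is what breaks the infinite agent update loop.
	fmt.Println(normalizeVersion("v4.36.2+git.14.g469307d6.dirty")) // 4.36.2
	fmt.Println(normalizeVersion(" v1.0.0-rc1+build.456 "))         // 1.0.0-rc1
}
```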
@@ -307,6 +307,12 @@ func TestNormalizeVersion(t *testing.T) {
 		{"v", ""},
 		{" ", ""},
 		{"vv4.33.1", "v4.33.1"}, // Only removes one v
+
+		// Build metadata (semver +suffix should be stripped)
+		{"4.36.2+git.14.g469307d6.dirty", "4.36.2"},
+		{"v4.36.2+build123", "4.36.2"},
+		{"1.0.0+20231215", "1.0.0"},
+		{"v1.0.0-rc1+build.456", "1.0.0-rc1"},
 	}

 	for _, tc := range tests {
@@ -353,6 +359,14 @@ func TestCompareVersions(t *testing.T) {
 		{"0.0.1", "0.0.0", 1},
 		{"0.0.0", "0.0.1", -1},
 		{"1.0", "0.9.9", 1},
+
+		// Build metadata should be ignored (semver +suffix)
+		// This is the critical fix for the infinite agent update loop bug
+		{"4.36.2+git.14.g469307d6.dirty", "4.36.2", 0},  // Dirty == clean
+		{"4.36.2", "4.36.2+git.14.g469307d6.dirty", 0},  // Clean == dirty
+		{"v4.36.2+build123", "v4.36.2", 0},              // With v prefix
+		{"4.36.3", "4.36.2+git.14.g469307d6.dirty", 1},  // Newer beats dirty
+		{"4.36.2+git.14.g469307d6.dirty", "4.36.3", -1}, // Dirty older than newer
 	}

 	for _, tc := range tests {