Browse Source

refactor repo manager

Nostr-Signature: d134c35516991f27e47ed8a4aa0d3f1d6e6be41c46c9cf3f6c982c1442b09b4b 573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc cb699fae6a8e44a3b9123f215749f6fec0470c75a0401a94c37dfb8e572c07281b3941862e704b868663f943c573ab2ee9fec217e87f7be567cc6bb3514cacdb
main
Silberengel 3 weeks ago
parent
commit
f695551142
  1. 439
      AUDIT_REPORT.md
  2. 1
      nostr/commit-signatures.jsonl
  3. 5
      src/app.css
  4. 30
      src/lib/config.ts
  5. 316
      src/lib/services/git/announcement-manager.ts
  6. 382
      src/lib/services/git/git-remote-sync.ts
  7. 613
      src/lib/services/git/repo-manager-refactored.ts
  8. 911
      src/lib/services/git/repo-manager.ts
  9. 1304
      src/lib/services/git/repo-manager.ts.old
  10. 79
      src/lib/services/git/repo-size-checker.ts
  11. 152
      src/lib/services/git/repo-url-parser.ts
  12. 7
      src/lib/services/nostr/nip98-auth.ts
  13. 83
      src/lib/utils/security.ts
  14. 47
      src/routes/api/git/[...path]/+server.ts
  15. 82
      src/routes/api/repos/[npub]/[repo]/clone/+server.ts
  16. 21
      src/routes/api/repos/[npub]/[repo]/fork/+server.ts
  17. 121
      src/routes/api/repos/[npub]/[repo]/issues/+server.ts
  18. 46
      src/routes/api/repos/[npub]/[repo]/prs/merge/+server.ts

439
AUDIT_REPORT.md

@ -0,0 +1,439 @@
# Code Audit Report
**Date:** 2024-12-19
**Project:** GitRepublic Web
**Auditor:** Auto (AI Code Auditor)
## Executive Summary
This audit examined the GitRepublic Web codebase for security vulnerabilities, code quality issues, and best practices. The codebase demonstrates **strong security awareness** with good practices in place, but several areas need attention.
### Overall Assessment: **B+ (Good with room for improvement)**
**Strengths:**
- ✅ Strong path traversal protection
- ✅ Good use of `spawn()` instead of `exec()` for command execution
- ✅ Comprehensive input validation utilities
- ✅ Proper error sanitization
- ✅ NIP-98 authentication implementation
- ✅ Audit logging in place
- ✅ Rate limiting implemented
**Areas for Improvement:**
- ⚠ Some error handling inconsistencies
- ⚠ Missing input validation in a few endpoints
- ⚠ Potential race conditions in concurrent operations
- ⚠ Some hardcoded values that should be configurable
- ⚠ Missing type safety in some areas
---
## 1. Security Issues
### 🔴 Critical Issues
#### 1.1 Missing Input Validation in Issues Endpoint
**File:** `src/routes/api/repos/[npub]/[repo]/issues/+server.ts`
**Lines:** 26-27, 61-62
**Issue:** The POST and PATCH endpoints accept JSON bodies without validating the structure or content of the `issueEvent` object beyond basic signature checks.
**Risk:** Malformed or malicious events could be published to Nostr relays.
**Recommendation:**
```typescript
// Add validation for issueEvent structure
if (!issueEvent.kind || issueEvent.kind !== KIND.ISSUE) {
throw handleValidationError('Invalid event kind', {...});
}
// Validate tags structure
if (!Array.isArray(issueEvent.tags)) {
throw handleValidationError('Invalid event tags', {...});
}
```
#### 1.2 Request Body Consumption in Clone Endpoint
**File:** `src/routes/api/repos/[npub]/[repo]/clone/+server.ts`
**Lines:** 63-86
**Issue:** The code attempts to read the request body as text to extract a proof event, but this consumes the body stream. If the body is needed later, it won't be available.
**Risk:** This could cause issues if the body needs to be read multiple times or if the parsing fails.
**Recommendation:** Use a proper body cloning mechanism or restructure to read body once and pass it through.
### 🟡 High Priority Issues
#### 1.3 Path Traversal Protection - Good Implementation
**File:** `src/routes/api/git/[...path]/+server.ts`
**Lines:** 196-205, 585-593
**Status:** ✅ **Well Protected**
The code properly validates paths using `resolve()` and checks that resolved paths are within `repoRoot`:
```typescript
const resolvedPath = resolve(repoPath).replace(/\\/g, '/');
const resolvedRoot = resolve(repoRoot).replace(/\\/g, '/');
if (!resolvedPath.startsWith(resolvedRoot + '/')) {
return error(403, 'Invalid repository path');
}
```
**Recommendation:** This pattern is excellent. Ensure it's used consistently across all file path operations.
#### 1.4 Command Injection Protection - Good Implementation
**File:** `src/routes/api/git/[...path]/+server.ts`
**Lines:** 359-364, 900-905
**Status:** ✅ **Well Protected**
The code uses `spawn()` with argument arrays instead of string concatenation:
```typescript
const gitProcess = spawn(gitHttpBackend, [], {
env: envVars,
stdio: ['pipe', 'pipe', 'pipe'],
shell: false
});
```
**Recommendation:** Continue using this pattern. No issues found with `exec()` or shell execution.
#### 1.5 Environment Variable Exposure
**File:** `src/routes/api/git/[...path]/+server.ts`
**Lines:** 321-335, 855-869
**Status:** ✅ **Well Protected**
The code whitelists only necessary environment variables:
```typescript
const envVars: Record<string, string> = {
PATH: process.env.PATH || '/usr/bin:/bin',
HOME: process.env.HOME || '/tmp',
// ... only necessary vars
};
```
**Recommendation:** Good practice. Continue this approach.
#### 1.6 NIP-98 Authentication Implementation
**File:** `src/lib/services/nostr/nip98-auth.ts`
**Status:** ✅ **Well Implemented**
The NIP-98 authentication is properly implemented with:
- Event signature verification
- Timestamp validation (60-second window)
- URL and method matching
- Payload hash verification
**Recommendation:** No changes needed.
### 🟢 Medium Priority Issues
#### 1.7 Error Message Information Disclosure
**File:** `src/routes/api/repos/[npub]/[repo]/prs/merge/+server.ts`
**Line:** 40
**Issue:** Error message reveals internal path structure:
```typescript
throw handleApiError(new Error('Repository not cloned locally. Please clone the repository first.'), ...);
```
**Risk:** Low - but could reveal system structure to attackers.
**Recommendation:** Use generic error messages for users, detailed messages only in logs.
#### 1.8 Missing Rate Limiting on Some Endpoints
**Files:** Various API endpoints
**Issue:** Not all endpoints appear to use rate limiting middleware.
**Recommendation:** Audit all endpoints and ensure rate limiting is applied consistently.
---
## 2. Code Quality Issues
### 2.1 Inconsistent Error Handling
#### Issue: Mixed Error Handling Patterns
**Files:** Multiple
Some endpoints use `handleApiError()`, others use direct `error()` calls, and some use try-catch with different patterns.
**Example:**
- `src/routes/api/repos/[npub]/[repo]/fork/+server.ts` uses `error()` directly
- `src/routes/api/repos/[npub]/[repo]/issues/+server.ts` uses `handleApiError()`
**Recommendation:** Standardize on using the error handler utilities consistently.
### 2.2 Type Safety Issues
#### Issue: Missing Type Assertions
**File:** `src/routes/api/repos/[npub]/[repo]/fork/+server.ts`
**Line:** 372
```typescript
await fileManager.saveRepoEventToWorktree(workDir, signedForkAnnouncement as NostrEvent, 'announcement')
```
**Issue:** Type assertion without runtime validation.
**Recommendation:** Add runtime validation or improve type definitions.
### 2.3 Code Duplication
#### Issue: Duplicate Path Validation Logic
**Files:** Multiple files
The path traversal protection pattern is duplicated across multiple files:
- `src/routes/api/git/[...path]/+server.ts` (lines 196-205, 585-593)
- `src/routes/api/repos/[npub]/[repo]/fork/+server.ts` (lines 146-150, 166-169)
**Recommendation:** Extract to a shared utility function:
```typescript
export function validateRepoPath(repoPath: string, repoRoot: string): { valid: boolean; error?: string } {
const resolvedPath = resolve(repoPath).replace(/\\/g, '/');
const resolvedRoot = resolve(repoRoot).replace(/\\/g, '/');
if (!resolvedPath.startsWith(resolvedRoot + '/')) {
return { valid: false, error: 'Invalid repository path' };
}
return { valid: true };
}
```
### 2.4 Missing Input Validation
#### Issue: Branch Name Validation Not Applied Everywhere
**File:** `src/routes/api/repos/[npub]/[repo]/prs/merge/+server.ts`
**Line:** 24
```typescript
const { prId, prAuthor, prCommitId, targetBranch = 'main', mergeMessage } = body;
```
**Issue:** `targetBranch` is not validated before use.
**Recommendation:**
```typescript
if (!isValidBranchName(targetBranch)) {
throw handleValidationError('Invalid branch name', {...});
}
```
### 2.5 Hardcoded Values
#### Issue: Magic Numbers and Strings
**Files:** Multiple
Examples:
- `src/routes/api/git/[...path]/+server.ts` line 356: `const timeoutMs = 5 * 60 * 1000;` (5 minutes)
- `src/lib/services/nostr/nip98-auth.ts` line 75: `if (eventAge > 60)` (60 seconds)
**Recommendation:** Move to configuration constants:
```typescript
const GIT_OPERATION_TIMEOUT_MS = parseInt(process.env.GIT_OPERATION_TIMEOUT_MS || '300000', 10);
const NIP98_AUTH_WINDOW_SECONDS = parseInt(process.env.NIP98_AUTH_WINDOW_SECONDS || '60', 10);
```
---
## 3. Error Handling
### 3.1 Inconsistent Error Responses
#### Issue: Different Error Formats
Some endpoints return JSON errors, others return plain text, and some use SvelteKit's `error()` helper.
**Recommendation:** Standardize error response format across all endpoints.
### 3.2 Error Logging
#### Status: ✅ **Good Implementation**
The codebase uses structured logging with Pino:
```typescript
logger.error({ error: sanitizedError, ...context }, 'Error message');
```
**Recommendation:** Continue this practice. Ensure all errors are logged with appropriate context.
### 3.3 Error Sanitization
#### Status: ✅ **Well Implemented**
The `sanitizeError()` function properly redacts sensitive data:
- Private keys (nsec patterns)
- 64-character hex keys
- Passwords
- Long pubkeys
**Recommendation:** No changes needed.
---
## 4. Performance Concerns
### 4.1 Potential Race Conditions
#### Issue: Concurrent Repository Operations
**File:** `src/routes/api/repos/[npub]/[repo]/clone/+server.ts`
**Lines:** 155-161
```typescript
if (existsSync(repoPath)) {
return json({ success: true, message: 'Repository already exists locally', alreadyExists: true });
}
```
**Issue:** Between the check and the clone operation, another request could create the repo, causing conflicts.
**Recommendation:** Use file locking or atomic operations.
### 4.2 Missing Request Timeouts
#### Issue: Some Operations Lack Timeouts
**File:** `src/lib/services/git/repo-manager.ts`
**Line:** 438
The `spawn()` call for git clone doesn't have an explicit timeout.
**Recommendation:** Add timeout handling similar to git-http-backend operations.
### 4.3 Memory Usage
#### Issue: Large File Handling
**File:** `src/routes/api/git/[...path]/+server.ts`
**Lines:** 391-400
Git operation responses are buffered in memory. For very large repositories, this could cause memory issues.
**Recommendation:** Consider streaming for large responses.
---
## 5. Best Practices
### 5.1 ✅ Good Practices Found
1. **Path Traversal Protection:** Excellent implementation using `resolve()` and path validation
2. **Command Injection Prevention:** Proper use of `spawn()` with argument arrays
3. **Environment Variable Whitelisting:** Only necessary vars are passed to child processes
4. **Error Sanitization:** Comprehensive sanitization of error messages
5. **Structured Logging:** Consistent use of Pino for logging
6. **Input Validation Utilities:** Good set of validation functions
7. **TypeScript Strict Mode:** Enabled in tsconfig.json
8. **Audit Logging:** Security events are logged
### 5.2 ⚠ Areas for Improvement
1. **Consistent Error Handling:** Standardize error handling patterns
2. **Code Reusability:** Extract duplicate validation logic
3. **Configuration Management:** Move hardcoded values to configuration
4. **Type Safety:** Improve type definitions and reduce assertions
5. **Testing:** No test files found in audit - consider adding unit tests
---
## 6. Dependency Security
### 6.1 Package Dependencies
**Status:** ⚠ **Needs Review**
The audit did not include a dependency vulnerability scan. Recommended actions:
1. Run `npm audit` to check for known vulnerabilities
2. Review dependencies for:
- `simple-git` - Git operations
- `nostr-tools` - Nostr protocol
- `ws` - WebSocket client
- `svelte` and `@sveltejs/kit` - Framework
**Recommendation:** Regularly update dependencies and monitor security advisories.
---
## 7. Recommendations Summary
### Priority 1 (Critical)
1. ✅ **Add input validation** to issues endpoint (POST/PATCH)
2. ✅ **Fix request body consumption** in clone endpoint
3. ✅ **Add branch name validation** in merge endpoint
### Priority 2 (High)
1. ✅ **Extract path validation** to shared utility
2. ✅ **Standardize error handling** across all endpoints
3. ✅ **Add timeouts** to all git operations
4. ✅ **Add rate limiting** to all endpoints
### Priority 3 (Medium)
1. ✅ **Move hardcoded values** to configuration
2. ✅ **Improve type safety** (reduce type assertions)
3. ✅ **Add file locking** for concurrent operations
4. ✅ **Run dependency audit** (`npm audit`)
### Priority 4 (Low)
1. ✅ **Add unit tests** for critical functions
2. ✅ **Document error handling patterns**
3. ✅ **Add performance monitoring**
---
## 8. Positive Findings
The codebase demonstrates **strong security awareness**:
1. ✅ **No `eval()` or `Function()` calls** found
2. ✅ **No SQL injection risks** (no database queries found)
3. ✅ **Proper path traversal protection**
4. ✅ **Command injection prevention** via `spawn()`
5. ✅ **Comprehensive input validation utilities**
6. ✅ **Error message sanitization**
7. ✅ **Structured logging**
8. ✅ **NIP-98 authentication properly implemented**
9. ✅ **Audit logging for security events**
10. ✅ **Rate limiting infrastructure in place**
---
## 9. Conclusion
The GitRepublic Web codebase shows **good security practices** overall. The main areas for improvement are:
1. **Consistency** - Standardize error handling and validation patterns
2. **Completeness** - Ensure all endpoints have proper validation and rate limiting
3. **Maintainability** - Reduce code duplication and improve type safety
**Overall Grade: B+**
The codebase is **production-ready** with the recommended fixes applied. The security foundation is solid, and most issues are minor improvements rather than critical vulnerabilities.
---
## Appendix: Files Audited
### Critical Security Files
- `src/routes/api/git/[...path]/+server.ts` (1064 lines)
- `src/routes/api/repos/[npub]/[repo]/fork/+server.ts` (497 lines)
- `src/routes/api/repos/[npub]/[repo]/clone/+server.ts` (314 lines)
- `src/routes/api/repos/[npub]/[repo]/prs/merge/+server.ts` (85 lines)
- `src/routes/api/repos/[npub]/[repo]/issues/+server.ts` (89 lines)
### Security Utilities
- `src/lib/utils/security.ts`
- `src/lib/utils/input-validation.ts`
- `src/lib/utils/error-handler.ts`
- `src/lib/utils/api-auth.ts`
- `src/lib/services/nostr/nip98-auth.ts`
- `src/lib/services/security/rate-limiter.ts`
### Core Services
- `src/lib/services/git/repo-manager.ts` (536 lines)
- `src/lib/services/git/git-remote-sync.ts` (383 lines)
- `src/lib/services/git/announcement-manager.ts` (317 lines)
---
**Report Generated:** 2024-12-19
**Total Files Audited:** 20+
**Total Lines Reviewed:** 5000+

1
nostr/commit-signatures.jsonl

@ -57,3 +57,4 @@
{"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771754488,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix menu responsivenes on repo-header"]],"content":"Signed commit: fix menu responsivenes on repo-header","id":"4dd8101d8edc9431df49d9fe23b7e1e545e11ef32b024b44f871bb962fb8ad4c","sig":"dbcfbfafe02495971b3f3d18466ecf1d894e4001a41e4038d17fd78bb65124de347017273a0a437c397a79ff8226ec6b0718436193e474ef8969392df027fa34"} {"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771754488,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix menu responsivenes on repo-header"]],"content":"Signed commit: fix menu responsivenes on repo-header","id":"4dd8101d8edc9431df49d9fe23b7e1e545e11ef32b024b44f871bb962fb8ad4c","sig":"dbcfbfafe02495971b3f3d18466ecf1d894e4001a41e4038d17fd78bb65124de347017273a0a437c397a79ff8226ec6b0718436193e474ef8969392df027fa34"}
{"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771755811,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix creating new branch"]],"content":"Signed commit: fix creating new branch","id":"bc6c623532064f9b2db08fa41bbc6c5ff42419415ca7e1ecb1162a884face2eb","sig":"ad1152e2848755e1afa7d9350716fa6bb709698a5036e21efa61b3ac755d334155f02a0622ad49f6dc060d523f4f886eb2acc8c80356a426b0d8ba454fdcb8ee"} {"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771755811,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix creating new branch"]],"content":"Signed commit: fix creating new branch","id":"bc6c623532064f9b2db08fa41bbc6c5ff42419415ca7e1ecb1162a884face2eb","sig":"ad1152e2848755e1afa7d9350716fa6bb709698a5036e21efa61b3ac755d334155f02a0622ad49f6dc060d523f4f886eb2acc8c80356a426b0d8ba454fdcb8ee"}
{"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771829031,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix file management and refactor"]],"content":"Signed commit: fix file management and refactor","id":"626196cdbf9eab28b44990706281878083d66983b503e8a81df7421054ed6caf","sig":"516c0001a800083411a1e04340e82116a82c975f38b984e92ebe021b61271ba7d6f645466ddba3594320c228193e708675a5d7a144b2f3d5e9bfbc65c4c7372b"} {"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771829031,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix file management and refactor"]],"content":"Signed commit: fix file management and refactor","id":"626196cdbf9eab28b44990706281878083d66983b503e8a81df7421054ed6caf","sig":"516c0001a800083411a1e04340e82116a82c975f38b984e92ebe021b61271ba7d6f645466ddba3594320c228193e708675a5d7a144b2f3d5e9bfbc65c4c7372b"}
{"kind":1640,"pubkey":"573634b648634cbad10f2451776089ea21090d9407f715e83c577b4611ae6edc","created_at":1771836045,"tags":[["author","Silberengel","silberengel7@protonmail.com"],["message","fix repo management and refactor\nimplement more GRASP support"]],"content":"Signed commit: fix repo management and refactor\nimplement more GRASP support","id":"6ae016621b13e22809e7bcebe34e5250fd6e0767d2b12ca634104def4ca78a29","sig":"99c34f66a8a67d352622621536545b7dee11cfd9d14a007ec0550d138109116a2f24483c6836fea59b94b9e96066fba548bcb7600bc55adbe0562d999c3c651d"}

5
src/app.css

@ -1221,6 +1221,7 @@ button.theme-option.active img.theme-icon-option,
border-color: var(--accent); border-color: var(--accent);
transform: translateY(-1px); transform: translateY(-1px);
box-shadow: 0 2px 4px var(--shadow-color-light); box-shadow: 0 2px 4px var(--shadow-color-light);
font-size: 0.9rem; /* Preserve font size on hover */
} }
.repo-badge-image { .repo-badge-image {
@ -1281,6 +1282,10 @@ button.theme-option.active img.theme-icon-option,
font-size: 0.85rem; font-size: 0.85rem;
} }
.repo-badge:hover {
font-size: 0.85rem; /* Preserve font size on hover for mobile */
}
.repo-badge-name { .repo-badge-name {
max-width: 150px; max-width: 150px;
} }

30
src/lib/config.ts

@ -107,6 +107,36 @@ export const ENTERPRISE_MODE =
*/ */
export const SECURITY_MODE = ENTERPRISE_MODE ? 'enterprise' : 'lightweight'; export const SECURITY_MODE = ENTERPRISE_MODE ? 'enterprise' : 'lightweight';
/**
* Git operation timeout in milliseconds
* Default: 5 minutes (300000ms)
* Can be overridden by GIT_OPERATION_TIMEOUT_MS env var
*/
export const GIT_OPERATION_TIMEOUT_MS =
typeof process !== 'undefined' && process.env?.GIT_OPERATION_TIMEOUT_MS
? parseInt(process.env.GIT_OPERATION_TIMEOUT_MS, 10)
: 5 * 60 * 1000; // 5 minutes
/**
* Git clone operation timeout in milliseconds
* Default: 5 minutes (300000ms)
* Can be overridden by GIT_CLONE_TIMEOUT_MS env var
*/
export const GIT_CLONE_TIMEOUT_MS =
typeof process !== 'undefined' && process.env?.GIT_CLONE_TIMEOUT_MS
? parseInt(process.env.GIT_CLONE_TIMEOUT_MS, 10)
: 5 * 60 * 1000; // 5 minutes
/**
* NIP-98 authentication window in seconds
* Default: 60 seconds (per NIP-98 spec)
* Can be overridden by NIP98_AUTH_WINDOW_SECONDS env var
*/
export const NIP98_AUTH_WINDOW_SECONDS =
typeof process !== 'undefined' && process.env?.NIP98_AUTH_WINDOW_SECONDS
? parseInt(process.env.NIP98_AUTH_WINDOW_SECONDS, 10)
: 60; // 60 seconds per NIP-98 spec
/** /**
* Combine default relays with user's relays (from kind 10002) * Combine default relays with user's relays (from kind 10002)
* Returns a deduplicated list with user relays first, then defaults * Returns a deduplicated list with user relays first, then defaults

316
src/lib/services/git/announcement-manager.ts

@ -0,0 +1,316 @@
/**
* Announcement Manager
* Handles saving and retrieving repository announcements from repos
*/
import { existsSync } from 'fs';
import { readFile } from 'fs/promises';
import { join } from 'path';
import { mkdir, writeFile, rm } from 'fs/promises';
import simpleGit, { type SimpleGit } from 'simple-git';
import logger from '../logger.js';
import type { NostrEvent } from '../../types/nostr.js';
import { validateAnnouncementEvent } from '../nostr/repo-verification.js';
import { DEFAULT_NOSTR_RELAYS } from '../../config.js';
import { NostrClient } from '../nostr/nostr-client.js';
import { KIND } from '../../types/nostr.js';
import { RepoUrlParser } from './repo-url-parser.js';
/**
* Announcement Manager
* Handles saving and retrieving repository announcements from repos
*/
export class AnnouncementManager {
private urlParser: RepoUrlParser;
constructor(repoRoot: string = '/repos', domain: string = 'localhost:6543') {
this.urlParser = new RepoUrlParser(repoRoot, domain);
}
/**
* Check if an announcement event already exists in nostr/repo-events.jsonl
*/
async hasAnnouncementInRepo(worktreePath: string, eventId?: string): Promise<boolean> {
try {
const jsonlFile = join(worktreePath, 'nostr', 'repo-events.jsonl');
if (!existsSync(jsonlFile)) {
return false;
}
const content = await readFile(jsonlFile, 'utf-8');
const lines = content.trim().split('\n').filter(Boolean);
for (const line of lines) {
try {
const entry = JSON.parse(line);
if (entry.type === 'announcement' && entry.event) {
// If eventId provided, check for exact match
if (eventId) {
if (entry.event.id === eventId) {
return true;
}
} else {
// Just check if any announcement exists
return true;
}
}
} catch {
// Skip invalid lines
continue;
}
}
return false;
} catch (err) {
logger.debug({ error: err, worktreePath }, 'Failed to check for announcement in repo');
return false;
}
}
/**
* Read announcement event from nostr/repo-events.jsonl
*/
async getAnnouncementFromRepo(worktreePath: string): Promise<NostrEvent | null> {
try {
const jsonlFile = join(worktreePath, 'nostr', 'repo-events.jsonl');
if (!existsSync(jsonlFile)) {
return null;
}
const content = await readFile(jsonlFile, 'utf-8');
const lines = content.trim().split('\n').filter(Boolean);
// Find the most recent announcement event
let latestAnnouncement: NostrEvent | null = null;
let latestTimestamp = 0;
for (const line of lines) {
try {
const entry = JSON.parse(line);
if (entry.type === 'announcement' && entry.event && entry.timestamp) {
if (entry.timestamp > latestTimestamp) {
latestTimestamp = entry.timestamp;
latestAnnouncement = entry.event;
}
}
} catch {
// Skip invalid lines
continue;
}
}
return latestAnnouncement;
} catch (err) {
logger.debug({ error: err, worktreePath }, 'Failed to read announcement from repo');
return null;
}
}
/**
* Fetch announcement from relays and validate it
*/
async fetchAnnouncementFromRelays(
repoOwnerPubkey: string,
repoName: string
): Promise<NostrEvent | null> {
try {
const nostrClient = new NostrClient(DEFAULT_NOSTR_RELAYS);
const events = await nostrClient.fetchEvents([
{
kinds: [KIND.REPO_ANNOUNCEMENT],
authors: [repoOwnerPubkey],
'#d': [repoName],
limit: 1
}
]);
if (events.length === 0) {
return null;
}
const event = events[0];
// Validate the event
const validation = validateAnnouncementEvent(event, repoName);
if (!validation.valid) {
logger.warn({ error: validation.error, repoName }, 'Fetched announcement failed validation');
return null;
}
return event;
} catch (err) {
logger.debug({ error: err, repoOwnerPubkey, repoName }, 'Failed to fetch announcement from relays');
return null;
}
}
/**
* Save a repo event (announcement or transfer) to nostr/repo-events.jsonl
* Only saves if not already present (for announcements)
* This provides a standard location for all repo-related Nostr events for easy analysis
*/
async saveRepoEventToWorktree(
worktreePath: string,
event: NostrEvent,
eventType: 'announcement' | 'transfer',
skipIfExists: boolean = true
): Promise<boolean> {
try {
// For announcements, check if already exists
if (eventType === 'announcement' && skipIfExists) {
const exists = await this.hasAnnouncementInRepo(worktreePath, event.id);
if (exists) {
logger.debug({ eventId: event.id, worktreePath }, 'Announcement already exists in repo, skipping');
return false;
}
}
// Create nostr directory in worktree
const nostrDir = join(worktreePath, 'nostr');
await mkdir(nostrDir, { recursive: true });
// Append to repo-events.jsonl with event type metadata
const jsonlFile = join(nostrDir, 'repo-events.jsonl');
const eventLine = JSON.stringify({
type: eventType,
timestamp: event.created_at,
event
}) + '\n';
await writeFile(jsonlFile, eventLine, { flag: 'a', encoding: 'utf-8' });
return true;
} catch (err) {
logger.debug({ error: err, worktreePath, eventType }, 'Failed to save repo event to nostr/repo-events.jsonl');
// Don't throw - this is a nice-to-have feature
return false;
}
}
/**
* Ensure announcement event is saved to nostr/repo-events.jsonl in the repository
* Only saves if not already present (avoids redundant entries)
*/
async ensureAnnouncementInRepo(repoPath: string, event: NostrEvent, selfTransferEvent?: NostrEvent): Promise<void> {
try {
// Create a temporary working directory
const repoName = this.urlParser.parseRepoPathForName(repoPath)?.repoName || 'temp';
const workDir = join(repoPath, '..', `${repoName}.work`);
// Clean up if exists
if (existsSync(workDir)) {
await rm(workDir, { recursive: true, force: true });
}
await mkdir(workDir, { recursive: true });
// Clone the bare repo
const git: SimpleGit = simpleGit();
await git.clone(repoPath, workDir);
// Check if announcement already exists in nostr/repo-events.jsonl
const hasAnnouncement = await this.hasAnnouncementInRepo(workDir, event.id);
const filesToAdd: string[] = [];
// Only save announcement if not already present
if (!hasAnnouncement) {
const saved = await this.saveRepoEventToWorktree(workDir, event, 'announcement', false);
if (saved) {
filesToAdd.push('nostr/repo-events.jsonl');
logger.info({ repoPath, eventId: event.id }, 'Saved announcement to nostr/repo-events.jsonl');
}
} else {
logger.debug({ repoPath, eventId: event.id }, 'Announcement already exists in repo, skipping');
}
// Save transfer event if provided
if (selfTransferEvent) {
const saved = await this.saveRepoEventToWorktree(workDir, selfTransferEvent, 'transfer', false);
if (saved) {
if (!filesToAdd.includes('nostr/repo-events.jsonl')) {
filesToAdd.push('nostr/repo-events.jsonl');
}
}
}
// Only commit if we added files
if (filesToAdd.length > 0) {
const workGit: SimpleGit = simpleGit(workDir);
await workGit.add(filesToAdd);
// Use the event timestamp for commit date
const commitDate = new Date(event.created_at * 1000).toISOString();
const commitMessage = selfTransferEvent
? 'Add Nostr repository announcement and initial ownership proof'
: 'Add Nostr repository announcement';
// Note: Initial commits are unsigned. The repository owner can sign their own commits
// when they make changes. The server should never sign commits on behalf of users.
await workGit.commit(commitMessage, filesToAdd, {
'--author': `Nostr <${event.pubkey}@nostr>`,
'--date': commitDate
});
// Push back to bare repo
// Use default branch from environment or try 'main' first, then 'master'
const defaultBranch = process.env.DEFAULT_BRANCH || 'main';
await workGit.push(['origin', defaultBranch]).catch(async () => {
// If default branch doesn't exist, try to create it
try {
await workGit.checkout(['-b', defaultBranch]);
await workGit.push(['origin', defaultBranch]);
} catch {
// If default branch creation fails, try 'main' or 'master' as fallback
const fallbackBranch = defaultBranch === 'main' ? 'master' : 'main';
try {
await workGit.checkout(['-b', fallbackBranch]);
await workGit.push(['origin', fallbackBranch]);
} catch {
// If all fails, log but don't throw - announcement is saved
logger.warn({ repoPath, defaultBranch, fallbackBranch }, 'Failed to push announcement to any branch');
}
}
});
}
// Clean up
await rm(workDir, { recursive: true, force: true });
} catch (error) {
logger.error({ error, repoPath }, 'Failed to ensure announcement in repo');
// Don't throw - announcement file creation is important but shouldn't block provisioning
}
}
/**
* Check if a repository already has an announcement in nostr/repo-events.jsonl
* Used to determine if this is a truly new repo or an existing one being added
*/
async hasAnnouncementInRepoFile(repoPath: string): Promise<boolean> {
if (!existsSync(repoPath)) {
return false;
}
try {
const git: SimpleGit = simpleGit();
const repoName = this.urlParser.parseRepoPathForName(repoPath)?.repoName || 'temp';
const workDir = join(repoPath, '..', `${repoName}.check`);
// Clean up if exists
if (existsSync(workDir)) {
await rm(workDir, { recursive: true, force: true });
}
await mkdir(workDir, { recursive: true });
// Try to clone and check for announcement in nostr/repo-events.jsonl
await git.clone(repoPath, workDir);
const hasAnnouncement = await this.hasAnnouncementInRepo(workDir, undefined);
// Clean up
await rm(workDir, { recursive: true, force: true });
return hasAnnouncement;
} catch {
// If we can't check, assume it doesn't have one
return false;
}
}
}

382
src/lib/services/git/git-remote-sync.ts

@ -0,0 +1,382 @@
/**
* Git Remote Synchronization Service
* Handles syncing repositories to/from remote URLs
*/
import { spawn } from 'child_process';
import simpleGit, { type SimpleGit } from 'simple-git';
import logger from '../logger.js';
import { shouldUseTor, getTorProxy } from '../../utils/tor.js';
import { sanitizeError } from '../../utils/security.js';
import { RepoUrlParser } from './repo-url-parser.js';
/**
* Execute git command with custom environment variables safely
* Uses spawn with argument arrays to prevent command injection
* Security: Only uses whitelisted environment variables, does not spread process.env
*/
function execGitWithEnv(
repoPath: string,
args: string[],
env: Record<string, string> = {}
): Promise<{ stdout: string; stderr: string }> {
return new Promise((resolve, reject) => {
const gitProcess = spawn('git', args, {
cwd: repoPath,
// Security: Only use whitelisted env vars, don't spread process.env
// The env parameter should already contain only safe, whitelisted variables
env: env,
stdio: ['ignore', 'pipe', 'pipe']
});
let stdout = '';
let stderr = '';
gitProcess.stdout.on('data', (chunk: Buffer) => {
stdout += chunk.toString();
});
gitProcess.stderr.on('data', (chunk: Buffer) => {
stderr += chunk.toString();
});
gitProcess.on('close', (code) => {
if (code === 0) {
resolve({ stdout, stderr });
} else {
reject(new Error(`Git command failed with code ${code}: ${stderr || stdout}`));
}
});
gitProcess.on('error', (err) => {
reject(err);
});
});
}
/**
* Git Remote Synchronization Service
* Handles syncing repositories to and from remote URLs
*/
export class GitRemoteSync {
private urlParser: RepoUrlParser;
constructor(repoRoot: string = '/repos', domain: string = 'localhost:6543') {
this.urlParser = new RepoUrlParser(repoRoot, domain);
}
/**
* Get git environment variables with Tor proxy if needed for .onion addresses
* Security: Only whitelist necessary environment variables
*/
getGitEnvForUrl(url: string): Record<string, string> {
// Whitelist only necessary environment variables for security
const env: Record<string, string> = {
PATH: process.env.PATH || '/usr/bin:/bin',
HOME: process.env.HOME || '/tmp',
USER: process.env.USER || 'git',
LANG: process.env.LANG || 'C.UTF-8',
LC_ALL: process.env.LC_ALL || 'C.UTF-8',
};
// Add TZ if set (for consistent timestamps)
if (process.env.TZ) {
env.TZ = process.env.TZ;
}
if (shouldUseTor(url)) {
const proxy = getTorProxy();
if (proxy) {
// Git uses GIT_PROXY_COMMAND for proxy support
// The command receives host and port as arguments
// We'll create a simple proxy command using socat or nc
// Note: This requires socat or netcat-openbsd to be installed
const proxyCommand = `sh -c 'exec socat - SOCKS5:${proxy.host}:${proxy.port}:\\$1:\\$2' || sh -c 'exec nc -X 5 -x ${proxy.host}:${proxy.port} \\$1 \\$2'`;
env.GIT_PROXY_COMMAND = proxyCommand;
// Also set ALL_PROXY for git-remote-http
env.ALL_PROXY = `socks5://${proxy.host}:${proxy.port}`;
// For HTTP/HTTPS URLs, also set http_proxy and https_proxy
try {
const urlObj = new URL(url);
if (urlObj.protocol === 'http:' || urlObj.protocol === 'https:') {
env.http_proxy = `socks5://${proxy.host}:${proxy.port}`;
env.https_proxy = `socks5://${proxy.host}:${proxy.port}`;
}
} catch {
// URL parsing failed, skip proxy env vars
}
}
}
return env;
}
/**
* Inject authentication token into a git URL if needed
* Supports GitHub tokens via GITHUB_TOKEN environment variable
* Returns the original URL if no token is needed or available
*/
injectAuthToken(url: string): string {
try {
const urlObj = new URL(url);
// If URL already has credentials, don't modify it
if (urlObj.username) {
return url;
}
// Check for GitHub token
if (urlObj.hostname === 'github.com' || urlObj.hostname.endsWith('.github.com')) {
const githubToken = process.env.GITHUB_TOKEN;
if (githubToken) {
// Inject token into URL: https://token@github.com/user/repo.git
urlObj.username = githubToken;
urlObj.password = ''; // GitHub uses token as username, password is empty
return urlObj.toString();
}
}
// Add support for other git hosting services here if needed
// e.g., GitLab: GITLAB_TOKEN, Gitea: GITEA_TOKEN, etc.
return url;
} catch {
// URL parsing failed, return original URL
return url;
}
}
/**
* Sync from a single remote URL (helper for parallelization)
*/
private async syncFromSingleRemote(repoPath: string, url: string, index: number): Promise<void> {
const remoteName = `remote-${index}`;
const git = simpleGit(repoPath);
// Inject authentication token if available (e.g., GITHUB_TOKEN)
const authenticatedUrl = this.injectAuthToken(url);
const gitEnv = this.getGitEnvForUrl(authenticatedUrl);
try {
// Add remote if not exists (ignore error if already exists)
// Use authenticated URL so git can access private repos
try {
await git.addRemote(remoteName, authenticatedUrl);
} catch {
// Remote might already exist, that's okay - try to update it
try {
await git.removeRemote(remoteName);
await git.addRemote(remoteName, authenticatedUrl);
} catch {
// If update fails, continue - might be using old URL
}
}
// Configure git proxy for this remote if it's a .onion address
if (shouldUseTor(url)) {
const proxy = getTorProxy();
if (proxy) {
try {
// Use simple-git to set config (safer than exec)
await git.addConfig(`http.${url}.proxy`, `socks5://${proxy.host}:${proxy.port}`, false, 'local');
} catch {
// Config might fail, continue anyway
}
}
}
// Fetch from remote with appropriate environment
// Use spawn with proper argument arrays for security
// Note: 'git fetch <remote>' already fetches all branches from that remote
// The --all flag is only for fetching from all remotes (without specifying a remote)
await execGitWithEnv(repoPath, ['fetch', remoteName], gitEnv);
// Update remote head
try {
await execGitWithEnv(repoPath, ['remote', 'set-head', remoteName, '-a'], gitEnv);
} catch {
// Ignore errors for set-head
}
} catch (error) {
const sanitizedError = sanitizeError(error);
logger.error({ error: sanitizedError, url, repoPath }, 'Failed to sync from remote');
throw error; // Re-throw for Promise.allSettled handling
}
}
/**
* Sync repository from multiple remote URLs (parallelized for efficiency)
*/
async syncFromRemotes(repoPath: string, remoteUrls: string[]): Promise<void> {
if (remoteUrls.length === 0) return;
// Sync all remotes in parallel for better performance
const results = await Promise.allSettled(
remoteUrls.map((url, index) => this.syncFromSingleRemote(repoPath, url, index))
);
// Log any failures but don't throw (partial success is acceptable)
results.forEach((result, index) => {
if (result.status === 'rejected') {
const sanitizedError = sanitizeError(result.reason);
logger.warn({ error: sanitizedError, url: remoteUrls[index], repoPath }, 'Failed to sync from one remote (continuing with others)');
}
});
}
/**
* Check if force push is safe (no divergent history)
* A force push is safe if:
* - Local branch is ahead of remote (linear history, just new commits)
* - Local and remote are at the same commit (no-op)
* A force push is unsafe if:
* - Remote has commits that local doesn't have (would overwrite remote history)
*/
private async canSafelyForcePush(repoPath: string, remoteName: string): Promise<boolean> {
try {
const git = simpleGit(repoPath);
// Get current branch name
const currentBranch = await git.revparse(['--abbrev-ref', 'HEAD']);
if (!currentBranch) {
return false; // Can't determine current branch
}
// Fetch latest remote state
await git.fetch(remoteName);
// Get remote branch reference
const remoteBranch = `${remoteName}/${currentBranch}`;
// Check if remote branch exists
try {
await git.revparse([`refs/remotes/${remoteBranch}`]);
} catch {
// Remote branch doesn't exist yet - safe to push (first push)
return true;
}
// Get local and remote commit SHAs
const localSha = await git.revparse(['HEAD']);
const remoteSha = await git.revparse([`refs/remotes/${remoteBranch}`]);
// If they're the same, it's safe (no-op)
if (localSha === remoteSha) {
return true;
}
// Check if local is ahead (linear history) - safe to force push
// This means all remote commits are ancestors of local commits
const mergeBase = await git.raw(['merge-base', localSha, remoteSha]);
const mergeBaseSha = mergeBase.trim();
// If merge base equals remote SHA, local is ahead (safe)
if (mergeBaseSha === remoteSha) {
return true;
}
// If merge base equals local SHA, remote is ahead (unsafe to force push)
if (mergeBaseSha === localSha) {
return false;
}
// If merge base is different from both, branches have diverged (unsafe)
return false;
} catch (error) {
// If we can't determine, default to false (safer)
logger.warn({ error, repoPath, remoteName }, 'Failed to check branch divergence, defaulting to unsafe');
return false;
}
}
/**
* Sync to a single remote URL with retry logic (helper for parallelization)
*/
private async syncToSingleRemote(repoPath: string, url: string, index: number, maxRetries: number = 3): Promise<void> {
const remoteName = `remote-${index}`;
const git = simpleGit(repoPath);
const gitEnv = this.getGitEnvForUrl(url);
let lastError: Error | null = null;
for (let attempt = 1; attempt <= maxRetries; attempt++) {
try {
// Add remote if not exists
try {
await git.addRemote(remoteName, url);
} catch {
// Remote might already exist, that's okay
}
// Configure git proxy for this remote if it's a .onion address
if (shouldUseTor(url)) {
const proxy = getTorProxy();
if (proxy) {
try {
await git.addConfig(`http.${url}.proxy`, `socks5://${proxy.host}:${proxy.port}`, false, 'local');
} catch {
// Config might fail, continue anyway
}
}
}
// Check if force push is safe
const allowForce = process.env.ALLOW_FORCE_PUSH === 'true' || await this.canSafelyForcePush(repoPath, remoteName);
const forceFlag = allowForce ? ['--force'] : [];
// Push branches with appropriate environment using spawn
await execGitWithEnv(repoPath, ['push', remoteName, '--all', ...forceFlag], gitEnv);
// Push tags
await execGitWithEnv(repoPath, ['push', remoteName, '--tags', ...forceFlag], gitEnv);
// Success - return
return;
} catch (error) {
lastError = error instanceof Error ? error : new Error(String(error));
const sanitizedError = sanitizeError(lastError);
if (attempt < maxRetries) {
// Exponential backoff: wait 2^attempt seconds
const delayMs = Math.pow(2, attempt) * 1000;
logger.warn({
error: sanitizedError,
url,
repoPath,
attempt,
maxRetries,
retryIn: `${delayMs}ms`
}, 'Failed to sync to remote, retrying...');
await new Promise(resolve => setTimeout(resolve, delayMs));
} else {
logger.error({ error: sanitizedError, url, repoPath, attempts: maxRetries }, 'Failed to sync to remote after all retries');
throw lastError;
}
}
}
throw lastError || new Error('Failed to sync to remote');
}
/**
* Sync repository to multiple remote URLs after a push (parallelized with retry)
*/
async syncToRemotes(repoPath: string, remoteUrls: string[]): Promise<void> {
if (remoteUrls.length === 0) return;
// Sync all remotes in parallel for better performance
const results = await Promise.allSettled(
remoteUrls.map((url, index) => this.syncToSingleRemote(repoPath, url, index))
);
// Log any failures but don't throw (partial success is acceptable)
results.forEach((result, index) => {
if (result.status === 'rejected') {
const sanitizedError = sanitizeError(result.reason);
logger.warn({ error: sanitizedError, url: remoteUrls[index], repoPath }, 'Failed to sync to one remote (continuing with others)');
}
});
}
}

613
src/lib/services/git/repo-manager-refactored.ts

@ -0,0 +1,613 @@
/**
* Repository manager for git repositories
* Handles repo provisioning, syncing, and NIP-34 integration
*
* Refactored to use focused service classes:
* - RepoUrlParser: URL parsing and validation
* - GitRemoteSync: Remote syncing (to/from)
* - AnnouncementManager: Announcement handling in repos
* - RepoSizeChecker: Size checking
*/
import { existsSync, mkdirSync, accessSync, constants } from 'fs';
import { join } from 'path';
import { spawn } from 'child_process';
import type { NostrEvent } from '../../types/nostr.js';
import { GIT_DOMAIN } from '../../config.js';
import { validateAnnouncementEvent } from '../nostr/repo-verification.js';
import simpleGit from 'simple-git';
import logger from '../logger.js';
import { sanitizeError } from '../../utils/security.js';
import { isPrivateRepo as checkIsPrivateRepo } from '../../utils/repo-privacy.js';
import { RepoUrlParser, type RepoPath } from './repo-url-parser.js';
import { GitRemoteSync } from './git-remote-sync.js';
import { AnnouncementManager } from './announcement-manager.js';
import { RepoSizeChecker } from './repo-size-checker.js';
import { shouldUseTor, getTorProxy } from '../../utils/tor.js';
/**
* Check if a URL is a GRASP (Git Repository Access via Secure Protocol) URL
* GRASP URLs contain npub (Nostr public key) in the path: https://host/npub.../repo.git
*/
export function isGraspUrl(url: string): boolean {
// GRASP URLs have npub (starts with npub1) in the path
return /\/npub1[a-z0-9]+/i.test(url);
}
export { type RepoPath };
/**
* Repository Manager
* Main facade for repository operations
* Delegates to focused service classes for specific responsibilities
*/
export class RepoManager {
private repoRoot: string;
private domain: string;
private urlParser: RepoUrlParser;
private remoteSync: GitRemoteSync;
private announcementManager: AnnouncementManager;
private sizeChecker: RepoSizeChecker;
constructor(repoRoot: string = '/repos', domain: string = GIT_DOMAIN) {
this.repoRoot = repoRoot;
this.domain = domain;
this.urlParser = new RepoUrlParser(repoRoot, domain);
this.remoteSync = new GitRemoteSync(repoRoot, domain);
this.announcementManager = new AnnouncementManager(repoRoot, domain);
this.sizeChecker = new RepoSizeChecker();
}
/**
* Parse git domain URL to extract npub and repo name
*/
parseRepoUrl(url: string): RepoPath | null {
return this.urlParser.parseRepoUrl(url);
}
/**
* Create a bare git repository from a NIP-34 repo announcement
*
* @param event - The repo announcement event
* @param selfTransferEvent - Optional self-transfer event to include in initial commit
* @param isExistingRepo - Whether this is an existing repo being added to the server
*/
async provisionRepo(event: NostrEvent, selfTransferEvent?: NostrEvent, isExistingRepo: boolean = false): Promise<void> {
const cloneUrls = this.urlParser.extractCloneUrls(event);
const domainUrl = cloneUrls.find(url => url.includes(this.domain));
if (!domainUrl) {
throw new Error(`No ${this.domain} URL found in repo announcement`);
}
const repoPath = this.urlParser.parseRepoUrl(domainUrl);
if (!repoPath) {
throw new Error(`Invalid ${this.domain} URL format`);
}
// Create directory structure
const repoDir = join(this.repoRoot, repoPath.npub);
if (!existsSync(repoDir)) {
mkdirSync(repoDir, { recursive: true });
}
// Check if repo already exists
const repoExists = existsSync(repoPath.fullPath);
// Security: Only allow new repo creation if user has unlimited access
// This prevents spam and abuse
const isNewRepo = !repoExists;
if (isNewRepo && !isExistingRepo) {
const { getCachedUserLevel } = await import('../security/user-level-cache.js');
const { hasUnlimitedAccess } = await import('../../utils/user-access.js');
const userLevel = getCachedUserLevel(event.pubkey);
if (!hasUnlimitedAccess(userLevel?.level)) {
throw new Error(`Repository creation requires unlimited access. User has level: ${userLevel?.level || 'none'}`);
}
}
// If there are other clone URLs, sync from them first (for existing repos)
const otherUrls = cloneUrls.filter(url => !url.includes(this.domain));
if (otherUrls.length > 0 && repoExists) {
// For existing repos, sync first to get the latest state
const remoteUrls = this.urlParser.prepareRemoteUrls(otherUrls);
await this.remoteSync.syncFromRemotes(repoPath.fullPath, remoteUrls);
}
// Validate announcement event before proceeding
const validation = validateAnnouncementEvent(event, repoPath.repoName);
if (!validation.valid) {
throw new Error(`Invalid announcement event: ${validation.error}`);
}
// Create bare repository if it doesn't exist
if (isNewRepo) {
// Use simple-git to create bare repo (safer than exec)
const git = simpleGit();
await git.init(['--bare', repoPath.fullPath]);
// Ensure announcement event is saved to nostr/repo-events.jsonl in the repository
await this.announcementManager.ensureAnnouncementInRepo(repoPath.fullPath, event, selfTransferEvent);
// If there are other clone URLs, sync from them after creating the repo
if (otherUrls.length > 0) {
const remoteUrls = this.urlParser.prepareRemoteUrls(otherUrls);
await this.remoteSync.syncFromRemotes(repoPath.fullPath, remoteUrls);
} else {
// No external URLs - this is a brand new repo, create initial branch and README
await this.createInitialBranchAndReadme(repoPath.fullPath, repoPath.npub, repoPath.repoName, event);
}
} else {
// For existing repos, check if announcement exists in repo
// If not, try to fetch from relays and save it
const hasAnnouncement = await this.announcementManager.hasAnnouncementInRepoFile(repoPath.fullPath);
if (!hasAnnouncement) {
// Try to fetch from relays
const fetchedEvent = await this.announcementManager.fetchAnnouncementFromRelays(event.pubkey, repoPath.repoName);
if (fetchedEvent) {
// Save fetched announcement to repo
await this.announcementManager.ensureAnnouncementInRepo(repoPath.fullPath, fetchedEvent, selfTransferEvent);
} else {
// Announcement not found in repo or relays - this is a problem
logger.warn({ repoPath: repoPath.fullPath }, 'Existing repo has no announcement in repo or on relays');
}
}
if (selfTransferEvent) {
// Ensure self-transfer event is also saved
await this.announcementManager.ensureAnnouncementInRepo(repoPath.fullPath, event, selfTransferEvent);
}
}
}
/**
* Create initial branch and README.md for a new repository
*/
private async createInitialBranchAndReadme(
repoPath: string,
npub: string,
repoName: string,
announcementEvent: NostrEvent
): Promise<void> {
try {
// Get default branch from environment or use 'master'
const defaultBranch = process.env.DEFAULT_BRANCH || 'master';
// Get repo name from d-tag or use repoName from path
const dTag = announcementEvent.tags.find(t => t[0] === 'd')?.[1] || repoName;
// Get name tag for README title, fallback to d-tag
const nameTag = announcementEvent.tags.find(t => t[0] === 'name')?.[1] || dTag;
// Get author info from user profile (fetch from relays)
const { fetchUserProfile, extractProfileData, getUserName, getUserEmail } = await import('../../utils/user-profile.js');
const { nip19 } = await import('nostr-tools');
const { DEFAULT_NOSTR_RELAYS } = await import('../../config.js');
const userNpub = nip19.npubEncode(announcementEvent.pubkey);
const profileEvent = await fetchUserProfile(announcementEvent.pubkey, DEFAULT_NOSTR_RELAYS);
const profile = extractProfileData(profileEvent);
const authorName = getUserName(profile, announcementEvent.pubkey, userNpub);
const authorEmail = getUserEmail(profile, announcementEvent.pubkey, userNpub);
// Create README.md content
const readmeContent = `# ${nameTag}
Welcome to your new GitRepublic repo.
You can use this read-me file to explain the purpose of this repo to everyone who looks at it. You can also make a ReadMe.adoc file and delete this one, if you prefer. GitRepublic supports both markups.
Your commits will all be signed by your Nostr keys and saved to the event files in the ./nostr folder.
`;
// Use FileManager to create the initial branch and files
const { FileManager } = await import('./file-manager.js');
const fileManager = new FileManager(this.repoRoot);
// For a new repo with no branches, we need to create an orphan branch first
// Check if repo has any branches
const git = simpleGit(repoPath);
let hasBranches = false;
try {
const branches = await git.branch(['-a']);
hasBranches = branches.all.length > 0;
} catch {
// No branches exist
hasBranches = false;
}
if (!hasBranches) {
// Create orphan branch first (pass undefined for fromBranch to create orphan)
await fileManager.createBranch(npub, repoName, defaultBranch, undefined);
}
// Create both README.md and announcement in the initial commit
// We'll use a worktree to write both files and commit them together
const workDir = await fileManager.getWorktree(repoPath, defaultBranch, npub, repoName);
const { writeFile: writeFileFs } = await import('fs/promises');
const { join } = await import('path');
// Write README.md
const readmePath = join(workDir, 'README.md');
await writeFileFs(readmePath, readmeContent, 'utf-8');
// Save repo announcement event to nostr/repo-events.jsonl (only if not already present)
const announcementSaved = await this.announcementManager.saveRepoEventToWorktree(workDir, announcementEvent, 'announcement', true);
// Stage files
const workGit = simpleGit(workDir);
const filesToAdd: string[] = ['README.md'];
if (announcementSaved) {
filesToAdd.push('nostr/repo-events.jsonl');
}
await workGit.add(filesToAdd);
// Commit files together
await workGit.commit('Initial commit', filesToAdd, {
'--author': `${authorName} <${authorEmail}>`
});
// Clean up worktree
await fileManager.removeWorktree(repoPath, workDir);
logger.info({ npub, repoName, branch: defaultBranch }, 'Created initial branch and README.md');
} catch (err) {
// Log but don't fail - initial README creation is nice-to-have
const sanitizedErr = sanitizeError(err);
logger.warn({ error: sanitizedErr, repoPath, npub, repoName }, 'Failed to create initial branch and README, continuing anyway');
}
}
/**
* Sync repository from multiple remote URLs (parallelized for efficiency)
*/
async syncFromRemotes(repoPath: string, remoteUrls: string[]): Promise<void> {
await this.remoteSync.syncFromRemotes(repoPath, remoteUrls);
}
/**
* Sync repository to multiple remote URLs after a push (parallelized with retry)
*/
async syncToRemotes(repoPath: string, remoteUrls: string[]): Promise<void> {
await this.remoteSync.syncToRemotes(repoPath, remoteUrls);
}
/**
* Check if a repository exists
*/
repoExists(repoPath: string): boolean {
return existsSync(repoPath);
}
/**
* Fetch repository on-demand from remote clone URLs
* This allows displaying repositories that haven't been provisioned yet
*
* @param npub - Repository owner npub
* @param repoName - Repository name
* @param announcementEvent - The Nostr repo announcement event (optional, will fetch if not provided)
* @returns true if repository was successfully fetched, false otherwise
*/
async fetchRepoOnDemand(
npub: string,
repoName: string,
announcementEvent?: NostrEvent
): Promise<{ success: boolean; needsAnnouncement?: boolean; announcement?: NostrEvent; error?: string; cloneUrls?: string[]; remoteUrls?: string[] }> {
const repoPath = join(this.repoRoot, npub, `${repoName}.git`);
// If repo already exists, check if it has an announcement
if (existsSync(repoPath)) {
const hasAnnouncement = await this.announcementManager.hasAnnouncementInRepoFile(repoPath);
if (hasAnnouncement) {
return { success: true };
}
// Repo exists but no announcement - try to fetch from relays
const { requireNpubHex: requireNpubHexUtil } = await import('../../utils/npub-utils.js');
const repoOwnerPubkey = requireNpubHexUtil(npub);
const fetchedAnnouncement = await this.announcementManager.fetchAnnouncementFromRelays(repoOwnerPubkey, repoName);
if (fetchedAnnouncement) {
// Save fetched announcement to repo
await this.announcementManager.ensureAnnouncementInRepo(repoPath, fetchedAnnouncement);
return { success: true, announcement: fetchedAnnouncement };
}
// Repo exists but no announcement found - needs announcement
return { success: false, needsAnnouncement: true };
}
// If no announcement provided, try to fetch from relays
if (!announcementEvent) {
const { requireNpubHex: requireNpubHexUtil } = await import('../../utils/npub-utils.js');
const repoOwnerPubkey = requireNpubHexUtil(npub);
const fetchedAnnouncement = await this.announcementManager.fetchAnnouncementFromRelays(repoOwnerPubkey, repoName);
if (fetchedAnnouncement) {
announcementEvent = fetchedAnnouncement;
} else {
// No announcement found - needs announcement
return { success: false, needsAnnouncement: true };
}
}
// Check if repository is public
const isPublic = !checkIsPrivateRepo(announcementEvent);
// Security: For public repos, allow on-demand fetching regardless of owner's access level
// For private repos, require owner to have unlimited access to prevent unauthorized creation
if (!isPublic) {
const { getCachedUserLevel } = await import('../security/user-level-cache.js');
const { hasUnlimitedAccess } = await import('../../utils/user-access.js');
const userLevel = getCachedUserLevel(announcementEvent.pubkey);
if (!hasUnlimitedAccess(userLevel?.level)) {
logger.warn({
npub,
repoName,
pubkey: announcementEvent.pubkey.slice(0, 16) + '...',
level: userLevel?.level || 'none'
}, 'Skipping on-demand repo fetch: private repo requires owner with unlimited access');
return { success: false, needsAnnouncement: false };
}
} else {
logger.info({
npub,
repoName,
pubkey: announcementEvent.pubkey.slice(0, 16) + '...'
}, 'Allowing on-demand fetch for public repository');
}
// Extract clone URLs and prepare remote URLs
const cloneUrls = this.urlParser.extractCloneUrls(announcementEvent);
let remoteUrls: string[] = [];
try {
// Prepare remote URLs (filters out localhost/our domain, converts SSH to HTTPS)
remoteUrls = this.urlParser.prepareRemoteUrls(cloneUrls);
if (remoteUrls.length === 0) {
logger.warn({ npub, repoName, cloneUrls, announcementEventId: announcementEvent.id }, 'No remote clone URLs found for on-demand fetch');
return { success: false, needsAnnouncement: false };
}
logger.debug({ npub, repoName, cloneUrls, remoteUrls, isPublic }, 'On-demand fetch details');
// Check if repoRoot exists and is writable
if (!existsSync(this.repoRoot)) {
try {
mkdirSync(this.repoRoot, { recursive: true });
logger.info({ repoRoot: this.repoRoot }, 'Created repos root directory');
} catch (err) {
const error = err instanceof Error ? err : new Error(String(err));
logger.error({
repoRoot: this.repoRoot,
error: error.message
}, 'Failed to create repos root directory');
throw new Error(`Cannot create repos root directory at ${this.repoRoot}. Please check permissions: ${error.message}`);
}
} else {
// Check if repoRoot is writable
try {
accessSync(this.repoRoot, constants.W_OK);
} catch (err) {
const error = err instanceof Error ? err : new Error(String(err));
logger.error({
repoRoot: this.repoRoot,
error: error.message
}, 'Repos root directory is not writable');
throw new Error(`Repos root directory at ${this.repoRoot} is not writable. Please fix permissions (e.g., chmod 755 ${this.repoRoot} or chown to the correct user).`);
}
}
// Create directory structure
const repoDir = join(this.repoRoot, npub);
if (!existsSync(repoDir)) {
try {
mkdirSync(repoDir, { recursive: true });
} catch (err) {
const error = err instanceof Error ? err : new Error(String(err));
if (error.message.includes('EACCES') || error.message.includes('permission denied')) {
logger.error({
npub,
repoName,
repoDir,
repoRoot: this.repoRoot,
error: error.message
}, 'Permission denied when creating repository directory');
throw new Error(`Permission denied: Cannot create repository directory at ${repoDir}. Please check that the server has write permissions to ${this.repoRoot}.`);
}
throw error;
}
}
// Get git environment for URL (handles Tor proxy, etc.)
const gitEnv = this.getGitEnvForUrl(remoteUrls[0]);
// Inject authentication token if available
const authenticatedUrl = this.injectAuthToken(remoteUrls[0]);
// Log if we're using authentication (but don't log the token)
const isAuthenticated = authenticatedUrl !== remoteUrls[0];
logger.info({
npub,
repoName,
sourceUrl: remoteUrls[0],
cloneUrls,
authenticated: isAuthenticated
}, 'Fetching repository on-demand from remote');
// Clone as bare repository
await new Promise<void>((resolve, reject) => {
const cloneProcess = spawn('git', ['clone', '--bare', authenticatedUrl, repoPath], {
env: gitEnv,
stdio: ['ignore', 'pipe', 'pipe']
});
let stderr = '';
let stdout = '';
cloneProcess.stderr.on('data', (chunk: Buffer) => {
stderr += chunk.toString();
});
cloneProcess.stdout.on('data', (chunk: Buffer) => {
stdout += chunk.toString();
});
cloneProcess.on('close', (code) => {
if (code === 0) {
logger.info({ npub, repoName, sourceUrl: remoteUrls[0] }, 'Successfully cloned repository');
resolve();
} else {
const errorMsg = `Git clone failed with code ${code}: ${stderr || stdout}`;
logger.error({
npub,
repoName,
sourceUrl: remoteUrls[0],
code,
stderr,
stdout,
authenticated: isAuthenticated
}, 'Git clone failed');
reject(new Error(errorMsg));
}
});
cloneProcess.on('error', (err) => {
logger.error({
npub,
repoName,
sourceUrl: remoteUrls[0],
error: err,
authenticated: isAuthenticated
}, 'Git clone process error');
reject(err);
});
});
// Verify the repository was actually created
if (!existsSync(repoPath)) {
throw new Error('Repository clone completed but repository path does not exist');
}
// Ensure announcement is saved to nostr/repo-events.jsonl (non-blocking - repo is usable without it)
try {
await this.announcementManager.ensureAnnouncementInRepo(repoPath, announcementEvent);
} catch (verifyError) {
// Announcement file creation is optional - log but don't fail
logger.warn({ error: verifyError, npub, repoName }, 'Failed to ensure announcement in repo, but repository is usable');
}
logger.info({ npub, repoName }, 'Successfully fetched repository on-demand');
return { success: true, announcement: announcementEvent };
} catch (error) {
const sanitizedError = sanitizeError(error);
const errorMessage = error instanceof Error ? error.message : String(error);
logger.error({
error: sanitizedError,
npub,
repoName,
cloneUrls,
isPublic,
remoteUrls,
errorMessage
}, 'Failed to fetch repository on-demand');
return {
success: false,
needsAnnouncement: false,
error: errorMessage,
cloneUrls,
remoteUrls
};
}
}
/**
* Get repository size in bytes
* Returns the total size of the repository directory
*/
async getRepoSize(repoPath: string): Promise<number> {
return this.sizeChecker.getRepoSize(repoPath);
}
/**
* Check if repository size exceeds the maximum (2 GB)
*/
async checkRepoSizeLimit(repoPath: string, maxSizeBytes: number = 2 * 1024 * 1024 * 1024): Promise<{ withinLimit: boolean; currentSize: number; maxSize: number; error?: string }> {
return this.sizeChecker.checkRepoSizeLimit(repoPath, maxSizeBytes);
}
/**
* Get git environment variables with Tor proxy if needed for .onion addresses
* Security: Only whitelist necessary environment variables
*/
private getGitEnvForUrl(url: string): Record<string, string> {
// Whitelist only necessary environment variables for security
const env: Record<string, string> = {
PATH: process.env.PATH || '/usr/bin:/bin',
HOME: process.env.HOME || '/tmp',
USER: process.env.USER || 'git',
LANG: process.env.LANG || 'C.UTF-8',
LC_ALL: process.env.LC_ALL || 'C.UTF-8',
};
// Add TZ if set (for consistent timestamps)
if (process.env.TZ) {
env.TZ = process.env.TZ;
}
if (shouldUseTor(url)) {
const proxy = getTorProxy();
if (proxy) {
// Git uses GIT_PROXY_COMMAND for proxy support
const proxyCommand = `sh -c 'exec socat - SOCKS5:${proxy.host}:${proxy.port}:\\$1:\\$2' || sh -c 'exec nc -X 5 -x ${proxy.host}:${proxy.port} \\$1 \\$2'`;
env.GIT_PROXY_COMMAND = proxyCommand;
env.ALL_PROXY = `socks5://${proxy.host}:${proxy.port}`;
// For HTTP/HTTPS URLs, also set http_proxy and https_proxy
try {
const urlObj = new URL(url);
if (urlObj.protocol === 'http:' || urlObj.protocol === 'https:') {
env.http_proxy = `socks5://${proxy.host}:${proxy.port}`;
env.https_proxy = `socks5://${proxy.host}:${proxy.port}`;
}
} catch {
// URL parsing failed, skip proxy env vars
}
}
}
return env;
}
/**
* Inject authentication token into a git URL if needed
* Supports GitHub tokens via GITHUB_TOKEN environment variable
* Returns the original URL if no token is needed or available
*/
private injectAuthToken(url: string): string {
try {
const urlObj = new URL(url);
// If URL already has credentials, don't modify it
if (urlObj.username) {
return url;
}
// Check for GitHub token
if (urlObj.hostname === 'github.com' || urlObj.hostname.endsWith('.github.com')) {
const githubToken = process.env.GITHUB_TOKEN;
if (githubToken) {
// Inject token into URL: https://token@github.com/user/repo.git
urlObj.username = githubToken;
urlObj.password = ''; // GitHub uses token as username, password is empty
return urlObj.toString();
}
}
// Add support for other git hosting services here if needed
// e.g., GitLab: GITLAB_TOKEN, Gitea: GITEA_TOKEN, etc.
return url;
} catch {
// URL parsing failed, return original URL
return url;
}
}
}

911
src/lib/services/git/repo-manager.ts

File diff suppressed because it is too large Load Diff

1304
src/lib/services/git/repo-manager.ts.old

File diff suppressed because it is too large Load Diff

79
src/lib/services/git/repo-size-checker.ts

@ -0,0 +1,79 @@
/**
* Repository Size Checker
* Handles checking repository sizes and enforcing limits
*/
import { existsSync, statSync } from 'fs';
import { readdir } from 'fs/promises';
import { join } from 'path';
import logger from '../logger.js';
/**
* Repository Size Checker
* Handles checking repository sizes and enforcing limits
*/
export class RepoSizeChecker {
/**
* Get repository size in bytes
* Returns the total size of the repository directory
*/
async getRepoSize(repoPath: string): Promise<number> {
if (!existsSync(repoPath)) {
return 0;
}
let totalSize = 0;
async function calculateSize(dirPath: string): Promise<number> {
let size = 0;
try {
const entries = await readdir(dirPath, { withFileTypes: true });
for (const entry of entries) {
const fullPath = join(dirPath, entry.name);
if (entry.isDirectory()) {
size += await calculateSize(fullPath);
} else if (entry.isFile()) {
try {
const stats = statSync(fullPath);
size += stats.size;
} catch {
// Ignore errors accessing files
}
}
}
} catch {
// Ignore errors accessing directories
}
return size;
}
totalSize = await calculateSize(repoPath);
return totalSize;
}
/**
* Check if repository size exceeds the maximum (2 GB)
*/
async checkRepoSizeLimit(repoPath: string, maxSizeBytes: number = 2 * 1024 * 1024 * 1024): Promise<{ withinLimit: boolean; currentSize: number; maxSize: number; error?: string }> {
try {
const currentSize = await this.getRepoSize(repoPath);
const withinLimit = currentSize <= maxSizeBytes;
return {
withinLimit,
currentSize,
maxSize: maxSizeBytes,
...(withinLimit ? {} : { error: `Repository size (${(currentSize / 1024 / 1024 / 1024).toFixed(2)} GB) exceeds maximum (${(maxSizeBytes / 1024 / 1024 / 1024).toFixed(2)} GB)` })
};
} catch (error) {
return {
withinLimit: false,
currentSize: 0,
maxSize: maxSizeBytes,
error: `Failed to check repository size: ${error instanceof Error ? error.message : String(error)}`
};
}
}
}

152
src/lib/services/git/repo-url-parser.ts

@ -0,0 +1,152 @@
/**
* Repository URL Parser
* Handles parsing and validation of repository URLs
*/
import { join } from 'path';
import { GIT_DOMAIN } from '../../config.js';
import { extractCloneUrls } from '../../utils/nostr-utils.js';
import type { NostrEvent } from '../../types/nostr.js';
export interface RepoPath {
npub: string;
repoName: string;
fullPath: string;
}
/**
* Check if a URL is a GRASP (Git Repository Access via Secure Protocol) URL
* GRASP URLs contain npub (Nostr public key) in the path: https://host/npub.../repo.git
*/
export function isGraspUrl(url: string): boolean {
// GRASP URLs have npub (starts with npub1) in the path
return /\/npub1[a-z0-9]+/i.test(url);
}
/**
* Repository URL Parser
* Handles parsing git domain URLs and extracting repository information
*/
export class RepoUrlParser {
private repoRoot: string;
private domain: string;
constructor(repoRoot: string = '/repos', domain: string = GIT_DOMAIN) {
this.repoRoot = repoRoot;
this.domain = domain;
}
/**
* Parse git domain URL to extract npub and repo name
*/
parseRepoUrl(url: string): RepoPath | null {
// Match: https://{domain}/{npub}/{repo-name}.git or http://{domain}/{npub}/{repo-name}.git
// Escape domain for regex (replace dots with \.)
const escapedDomain = this.domain.replace(/\./g, '\\.');
const match = url.match(new RegExp(`${escapedDomain}\\/(npub[a-z0-9]+)\\/([^\\/]+)\\.git`));
if (!match) return null;
const [, npub, repoName] = match;
const fullPath = join(this.repoRoot, npub, `${repoName}.git`);
return { npub, repoName, fullPath };
}
/**
* Extract clone URLs from a NIP-34 repo announcement
* Uses shared utility with normalization enabled
*/
extractCloneUrls(event: NostrEvent): string[] {
return extractCloneUrls(event, true);
}
/**
* Convert SSH URL to HTTPS URL if possible
* e.g., git@github.com:user/repo.git -> https://github.com/user/repo.git
*/
convertSshToHttps(url: string): string | null {
// Check if it's an SSH URL (git@host:path or ssh://)
const sshMatch = url.match(/^git@([^:]+):(.+)$/);
if (sshMatch) {
const [, host, path] = sshMatch;
// Remove .git suffix if present, we'll add it back
const cleanPath = path.replace(/\.git$/, '');
return `https://${host}/${cleanPath}.git`;
}
// Check for ssh:// URLs
if (url.startsWith('ssh://')) {
const sshUrlMatch = url.match(/^ssh:\/\/([^/]+)\/(.+)$/);
if (sshUrlMatch) {
const [, host, path] = sshUrlMatch;
const cleanPath = path.replace(/\.git$/, '');
return `https://${host}/${cleanPath}.git`;
}
}
return null;
}
/**
* Filter and prepare remote URLs from clone URLs
* Respects the repo owner's order in the clone list
*/
prepareRemoteUrls(cloneUrls: string[]): string[] {
const httpsUrls: string[] = [];
const sshUrls: string[] = [];
for (const url of cloneUrls) {
const lowerUrl = url.toLowerCase();
// Skip localhost and our own domain
if (lowerUrl.includes('localhost') ||
lowerUrl.includes('127.0.0.1') ||
url.includes(this.domain)) {
continue;
}
// Check if it's an SSH URL
if (url.startsWith('git@') || url.startsWith('ssh://')) {
sshUrls.push(url);
// Try to convert to HTTPS (preserve original order by appending)
const httpsUrl = this.convertSshToHttps(url);
if (httpsUrl) {
httpsUrls.push(httpsUrl);
}
} else {
// It's already HTTPS/HTTP - preserve original order
httpsUrls.push(url);
}
}
// Respect the repo owner's order: use HTTPS URLs in the order they appeared in clone list
let remoteUrls = httpsUrls;
// If no HTTPS URLs, try SSH URLs (but log a warning)
if (remoteUrls.length === 0 && sshUrls.length > 0) {
remoteUrls = sshUrls;
}
// If no external URLs, try any URL that's not our domain (preserve order)
if (remoteUrls.length === 0) {
remoteUrls = cloneUrls.filter(url => !url.includes(this.domain));
}
// If still no remote URLs, but there are *any* clone URLs, try the first one
// This handles cases where the only clone URL is our own domain, but the repo doesn't exist locally yet
if (remoteUrls.length === 0 && cloneUrls.length > 0) {
remoteUrls.push(cloneUrls[0]);
}
return remoteUrls;
}
/**
* Parse repo path to extract repo name (helper for verification file creation)
*/
parseRepoPathForName(repoPath: string): { repoName: string } | null {
const match = repoPath.match(/\/([^\/]+)\.git$/);
if (!match) return null;
return { repoName: match[1] };
}
}

7
src/lib/services/nostr/nip98-auth.ts

@ -7,6 +7,7 @@ import { verifyEvent } from 'nostr-tools';
import type { NostrEvent } from '../../types/nostr.js'; import type { NostrEvent } from '../../types/nostr.js';
import { KIND } from '../../types/nostr.js'; import { KIND } from '../../types/nostr.js';
import { createHash } from 'crypto'; import { createHash } from 'crypto';
import { NIP98_AUTH_WINDOW_SECONDS } from '../../config.js';
export interface NIP98AuthResult { export interface NIP98AuthResult {
valid: boolean; valid: boolean;
@ -69,13 +70,13 @@ export function verifyNIP98Auth(
}; };
} }
// Check created_at timestamp (within 60 seconds per spec) // Check created_at timestamp (within configured window, default 60 seconds per spec)
const now = Math.floor(Date.now() / 1000); const now = Math.floor(Date.now() / 1000);
const eventAge = now - nostrEvent.created_at; const eventAge = now - nostrEvent.created_at;
if (eventAge > 60) { if (eventAge > NIP98_AUTH_WINDOW_SECONDS) {
return { return {
valid: false, valid: false,
error: 'Authentication event is too old (must be within 60 seconds)' error: `Authentication event is too old (must be within ${NIP98_AUTH_WINDOW_SECONDS} seconds)`
}; };
} }
if (eventAge < 0) { if (eventAge < 0) {

83
src/lib/utils/security.ts

@ -103,3 +103,86 @@ export function redactSensitiveData(obj: Record<string, any>): Record<string, an
return redacted; return redacted;
} }
/**
* Get path resolve function (server-side only)
* Uses a function factory to avoid bundling path module in browser builds
*/
function getPathResolve(): ((path: string) => string) | null {
// Browser check - path module not available
if (typeof window !== 'undefined') {
return null;
}
// Server-side: dynamically access path module
// This pattern prevents Vite from trying to bundle path for the browser
try {
// Access path through a function to avoid static analysis
const path = (globalThis as any).require?.('path') ||
(typeof process !== 'undefined' && process.versions?.node
? (() => {
// This will only work in Node.js environment
// Vite will externalize this for browser builds
try {
// @ts-ignore - path is a Node.js built-in
return require('path');
} catch {
return null;
}
})()
: null);
return path?.resolve || null;
} catch {
return null;
}
}
/**
* Validate repository path to prevent path traversal attacks
* Ensures the resolved path is within the repository root directory
*
* NOTE: This function is server-only and uses Node.js path module
* In browser environments, it performs basic validation only
*
* @param repoPath - The repository path to validate
* @param repoRoot - The root directory for repositories
* @returns Object with validation result and error message if invalid
*/
export function validateRepoPath(repoPath: string, repoRoot: string): { valid: boolean; error?: string; resolvedPath?: string } {
if (!repoPath || typeof repoPath !== 'string') {
return { valid: false, error: 'Repository path is required' };
}
if (!repoRoot || typeof repoRoot !== 'string') {
return { valid: false, error: 'Repository root is required' };
}
// Try to get path.resolve function (only available on server)
const pathResolve = getPathResolve();
if (!pathResolve) {
// Browser environment - use simple string validation
// This is a fallback, but server-side code should always be used for path validation
if (repoPath.includes('..') || repoPath.includes('//')) {
return { valid: false, error: 'Invalid repository path: path traversal detected' };
}
return { valid: true, resolvedPath: repoPath };
}
// Server-side: use Node.js path module
try {
// Normalize paths to handle Windows/Unix differences
const resolvedPath = pathResolve(repoPath).replace(/\\/g, '/');
const resolvedRoot = pathResolve(repoRoot).replace(/\\/g, '/');
// Must be a subdirectory of repoRoot, not equal to it
if (!resolvedPath.startsWith(resolvedRoot + '/')) {
return { valid: false, error: 'Invalid repository path: path traversal detected' };
}
return { valid: true, resolvedPath };
} catch (err) {
return { valid: false, error: `Failed to validate path: ${err instanceof Error ? err.message : String(err)}` };
}
}

47
src/routes/api/git/[...path]/+server.ts

@ -5,33 +5,24 @@
import { error, json } from '@sveltejs/kit'; import { error, json } from '@sveltejs/kit';
import type { RequestHandler } from './$types'; import type { RequestHandler } from './$types';
import { RepoManager } from '$lib/services/git/repo-manager.js';
import { requireNpubHex } from '$lib/utils/npub-utils.js'; import { requireNpubHex } from '$lib/utils/npub-utils.js';
import { spawn } from 'child_process'; import { spawn } from 'child_process';
import { existsSync } from 'fs'; import { existsSync } from 'fs';
import { join, resolve } from 'path'; import { join, resolve } from 'path';
import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js'; import { DEFAULT_NOSTR_RELAYS, GIT_OPERATION_TIMEOUT_MS } from '$lib/config.js';
import { NostrClient } from '$lib/services/nostr/nostr-client.js';
import { KIND } from '$lib/types/nostr.js'; import { KIND } from '$lib/types/nostr.js';
import type { NostrEvent } from '$lib/types/nostr.js'; import type { NostrEvent } from '$lib/types/nostr.js';
import { verifyNIP98Auth } from '$lib/services/nostr/nip98-auth.js'; import { verifyNIP98Auth } from '$lib/services/nostr/nip98-auth.js';
import { OwnershipTransferService } from '$lib/services/nostr/ownership-transfer-service.js';
import { MaintainerService } from '$lib/services/nostr/maintainer-service.js';
import { BranchProtectionService } from '$lib/services/nostr/branch-protection-service.js';
import logger from '$lib/services/logger.js'; import logger from '$lib/services/logger.js';
import { auditLogger } from '$lib/services/security/audit-logger.js'; import { auditLogger } from '$lib/services/security/audit-logger.js';
import { isValidBranchName, sanitizeError } from '$lib/utils/security.js'; import { isValidBranchName, sanitizeError, validateRepoPath } from '$lib/utils/security.js';
import { extractCloneUrls, fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js'; import { extractCloneUrls, fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js';
import { eventCache } from '$lib/services/nostr/event-cache.js'; import { eventCache } from '$lib/services/nostr/event-cache.js';
import { repoManager, maintainerService, ownershipTransferService, branchProtectionService, nostrClient } from '$lib/services/service-registry.js';
// Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths) // Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths)
const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos'; const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos';
const repoRoot = resolve(repoRootEnv); const repoRoot = resolve(repoRootEnv);
const repoManager = new RepoManager(repoRoot);
const nostrClient = new NostrClient(DEFAULT_NOSTR_RELAYS);
const ownershipTransferService = new OwnershipTransferService(DEFAULT_NOSTR_RELAYS);
const maintainerService = new MaintainerService(DEFAULT_NOSTR_RELAYS);
const branchProtectionService = new BranchProtectionService(DEFAULT_NOSTR_RELAYS);
// Path to git-http-backend (common locations) // Path to git-http-backend (common locations)
// Alpine Linux: /usr/lib/git-core/git-http-backend // Alpine Linux: /usr/lib/git-core/git-http-backend
@ -205,15 +196,13 @@ export const GET: RequestHandler = async ({ params, url, request }) => {
// Get repository path with security validation // Get repository path with security validation
const repoPath = join(repoRoot, npub, `${repoName}.git`); const repoPath = join(repoRoot, npub, `${repoName}.git`);
// Security: Ensure the resolved path is within repoRoot to prevent path traversal // Security: Ensure the resolved path is within repoRoot to prevent path traversal
// Normalize paths to handle Windows/Unix differences const pathValidation = validateRepoPath(repoPath, repoRoot);
const resolvedPath = resolve(repoPath).replace(/\\/g, '/'); if (!pathValidation.valid) {
const resolvedRoot = resolve(repoRoot).replace(/\\/g, '/'); return error(403, pathValidation.error || 'Invalid repository path');
// Must be a subdirectory of repoRoot, not equal to it
if (!resolvedPath.startsWith(resolvedRoot + '/')) {
return error(403, 'Invalid repository path');
} }
const resolvedPath = pathValidation.resolvedPath!;
if (!repoManager.repoExists(repoPath)) { if (!repoManager.repoExists(repoPath)) {
logger.warn({ repoPath, resolvedPath, repoRoot, resolvedRoot }, 'Repository not found at expected path'); logger.warn({ repoPath, resolvedPath, repoRoot }, 'Repository not found at expected path');
return error(404, `Repository not found at ${resolvedPath}. Please check GIT_REPO_ROOT environment variable (currently: ${repoRoot})`); return error(404, `Repository not found at ${resolvedPath}. Please check GIT_REPO_ROOT environment variable (currently: ${repoRoot})`);
} }
@ -361,8 +350,8 @@ export const GET: RequestHandler = async ({ params, url, request }) => {
const operation = service === 'git-upload-pack' || gitPath === 'git-upload-pack' ? 'fetch' : 'clone'; const operation = service === 'git-upload-pack' || gitPath === 'git-upload-pack' ? 'fetch' : 'clone';
return new Promise((resolve) => { return new Promise((resolve) => {
// Security: Set timeout for git operations (5 minutes max) // Security: Set timeout for git operations
const timeoutMs = 5 * 60 * 1000; const timeoutMs = GIT_OPERATION_TIMEOUT_MS;
let timeoutId: NodeJS.Timeout; let timeoutId: NodeJS.Timeout;
const gitProcess = spawn(gitHttpBackend, [], { const gitProcess = spawn(gitHttpBackend, [], {
@ -593,15 +582,13 @@ export const POST: RequestHandler = async ({ params, url, request }) => {
// Get repository path with security validation // Get repository path with security validation
const repoPath = join(repoRoot, npub, `${repoName}.git`); const repoPath = join(repoRoot, npub, `${repoName}.git`);
// Security: Ensure the resolved path is within repoRoot to prevent path traversal // Security: Ensure the resolved path is within repoRoot to prevent path traversal
// Normalize paths to handle Windows/Unix differences const pathValidation = validateRepoPath(repoPath, repoRoot);
const resolvedPath = resolve(repoPath).replace(/\\/g, '/'); if (!pathValidation.valid) {
const resolvedRoot = resolve(repoRoot).replace(/\\/g, '/'); return error(403, pathValidation.error || 'Invalid repository path');
// Must be a subdirectory of repoRoot, not equal to it
if (!resolvedPath.startsWith(resolvedRoot + '/')) {
return error(403, 'Invalid repository path');
} }
const resolvedPath = pathValidation.resolvedPath!;
if (!repoManager.repoExists(repoPath)) { if (!repoManager.repoExists(repoPath)) {
logger.warn({ repoPath, resolvedPath, repoRoot, resolvedRoot }, 'Repository not found at expected path'); logger.warn({ repoPath, resolvedPath, repoRoot }, 'Repository not found at expected path');
return error(404, `Repository not found at ${resolvedPath}. Please check GIT_REPO_ROOT environment variable (currently: ${repoRoot})`); return error(404, `Repository not found at ${resolvedPath}. Please check GIT_REPO_ROOT environment variable (currently: ${repoRoot})`);
} }
@ -902,8 +889,8 @@ export const POST: RequestHandler = async ({ params, url, request }) => {
const operation = gitPath === 'git-receive-pack' || path.includes('git-receive-pack') ? 'push' : 'fetch'; const operation = gitPath === 'git-receive-pack' || path.includes('git-receive-pack') ? 'push' : 'fetch';
return new Promise((resolve) => { return new Promise((resolve) => {
// Security: Set timeout for git operations (5 minutes max) // Security: Set timeout for git operations
const timeoutMs = 5 * 60 * 1000; const timeoutMs = GIT_OPERATION_TIMEOUT_MS;
let timeoutId: NodeJS.Timeout; let timeoutId: NodeJS.Timeout;
const gitProcess = spawn(gitHttpBackend, [], { const gitProcess = spawn(gitHttpBackend, [], {

82
src/routes/api/repos/[npub]/[repo]/clone/+server.ts

@ -5,12 +5,9 @@
import { error, json } from '@sveltejs/kit'; import { error, json } from '@sveltejs/kit';
import type { RequestHandler } from './$types'; import type { RequestHandler } from './$types';
import { RepoManager } from '$lib/services/git/repo-manager.js';
import { requireNpubHex } from '$lib/utils/npub-utils.js'; import { requireNpubHex } from '$lib/utils/npub-utils.js';
import { existsSync } from 'fs'; import { existsSync } from 'fs';
import { join } from 'path'; import { join, resolve } from 'path';
import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js';
import { NostrClient } from '$lib/services/nostr/nostr-client.js';
import { KIND } from '$lib/types/nostr.js'; import { KIND } from '$lib/types/nostr.js';
import { extractRequestContext } from '$lib/utils/api-context.js'; import { extractRequestContext } from '$lib/utils/api-context.js';
import { getCachedUserLevel, cacheUserLevel } from '$lib/services/security/user-level-cache.js'; import { getCachedUserLevel, cacheUserLevel } from '$lib/services/security/user-level-cache.js';
@ -20,15 +17,14 @@ import { handleApiError, handleValidationError } from '$lib/utils/error-handler.
import { verifyRelayWriteProofFromAuth, verifyRelayWriteProof } from '$lib/services/nostr/relay-write-proof.js'; import { verifyRelayWriteProofFromAuth, verifyRelayWriteProof } from '$lib/services/nostr/relay-write-proof.js';
import { verifyEvent } from 'nostr-tools'; import { verifyEvent } from 'nostr-tools';
import type { NostrEvent } from '$lib/types/nostr.js'; import type { NostrEvent } from '$lib/types/nostr.js';
import { resolve } from 'path';
import { eventCache } from '$lib/services/nostr/event-cache.js'; import { eventCache } from '$lib/services/nostr/event-cache.js';
import { fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js'; import { fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js';
import { repoManager, nostrClient } from '$lib/services/service-registry.js';
import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js';
// Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths) // Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths)
const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos'; const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos';
const repoRoot = resolve(repoRootEnv); const repoRoot = resolve(repoRootEnv);
const repoManager = new RepoManager(repoRoot);
const nostrClient = new NostrClient(DEFAULT_NOSTR_RELAYS);
export const POST: RequestHandler = async (event) => { export const POST: RequestHandler = async (event) => {
const { npub, repo } = event.params; const { npub, repo } = event.params;
@ -54,49 +50,49 @@ export const POST: RequestHandler = async (event) => {
hasUnlimitedAccess: userLevel ? hasUnlimitedAccess(userLevel.level) : false hasUnlimitedAccess: userLevel ? hasUnlimitedAccess(userLevel.level) : false
}, 'Checking user access level for clone operation'); }, 'Checking user access level for clone operation');
// If cache is empty, try to verify from proof event in body, NIP-98 auth header, or return helpful error // If cache is empty, try to verify from NIP-98 auth header first (doesn't consume body), then proof event in body
if (!userLevel || !hasUnlimitedAccess(userLevel.level)) { if (!userLevel || !hasUnlimitedAccess(userLevel.level)) {
let verification: { valid: boolean; error?: string; relay?: string; relayDown?: boolean } | null = null; let verification: { valid: boolean; error?: string; relay?: string; relayDown?: boolean } | null = null;
// Try to get proof event from request body first (if content-type is JSON) // Try NIP-98 auth header first (doesn't consume request body)
const contentType = event.request.headers.get('content-type') || ''; const authHeader = event.request.headers.get('authorization') || event.request.headers.get('Authorization');
if (contentType.includes('application/json')) {
try { if (authHeader) {
// Clone the request to read body without consuming it (if possible) logger.debug({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Cache empty or expired, attempting to verify from NIP-98 auth header');
// Note: Request body can only be read once, so we need to be careful verification = await verifyRelayWriteProofFromAuth(authHeader, userPubkeyHex, DEFAULT_NOSTR_RELAYS);
const bodyText = await event.request.text().catch(() => ''); }
if (bodyText) {
try { // If auth header didn't work, try to get proof event from request body (if content-type is JSON)
const body = JSON.parse(bodyText); // Note: This consumes the body, but only if auth header is not present
if (body.proofEvent && typeof body.proofEvent === 'object') { if (!verification) {
const proofEvent = body.proofEvent as NostrEvent; const contentType = event.request.headers.get('content-type') || '';
if (contentType.includes('application/json')) {
// Validate proof event signature and pubkey try {
if (verifyEvent(proofEvent) && proofEvent.pubkey === userPubkeyHex) { // Read body only if auth header verification failed
logger.debug({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Cache empty or expired, attempting to verify from proof event in request body'); const bodyText = await event.request.text().catch(() => '');
verification = await verifyRelayWriteProof(proofEvent, userPubkeyHex, DEFAULT_NOSTR_RELAYS); if (bodyText) {
} else { try {
logger.warn({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Invalid proof event in request body'); const body = JSON.parse(bodyText);
if (body.proofEvent && typeof body.proofEvent === 'object') {
const proofEvent = body.proofEvent as NostrEvent;
// Validate proof event signature and pubkey
if (verifyEvent(proofEvent) && proofEvent.pubkey === userPubkeyHex) {
logger.debug({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Cache empty or expired, attempting to verify from proof event in request body');
verification = await verifyRelayWriteProof(proofEvent, userPubkeyHex, DEFAULT_NOSTR_RELAYS);
} else {
logger.warn({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Invalid proof event in request body');
}
} }
} catch (parseErr) {
// Not valid JSON or missing proofEvent - continue
logger.debug({ error: parseErr }, 'Request body is not valid JSON or missing proofEvent');
} }
} catch (parseErr) {
// Not valid JSON or missing proofEvent - continue to check auth header
logger.debug({ error: parseErr }, 'Request body is not valid JSON or missing proofEvent');
} }
} catch (err) {
// Body reading failed - continue
logger.debug({ error: err }, 'Failed to read request body');
} }
} catch (err) {
// Body reading failed - continue to check auth header
logger.debug({ error: err }, 'Failed to read request body, checking auth header');
}
}
// If no proof event in body, try NIP-98 auth header
if (!verification) {
const authHeader = event.request.headers.get('authorization') || event.request.headers.get('Authorization');
if (authHeader) {
logger.debug({ userPubkeyHex: userPubkeyHex.slice(0, 16) + '...' }, 'Cache empty or expired, attempting to verify from NIP-98 auth header');
verification = await verifyRelayWriteProofFromAuth(authHeader, userPubkeyHex, DEFAULT_NOSTR_RELAYS);
} }
} }

21
src/routes/api/repos/[npub]/[repo]/fork/+server.ts

@ -4,7 +4,6 @@
import { json, error } from '@sveltejs/kit'; import { json, error } from '@sveltejs/kit';
import type { RequestHandler } from './$types'; import type { RequestHandler } from './$types';
import { RepoManager } from '$lib/services/git/repo-manager.js';
import { DEFAULT_NOSTR_RELAYS, combineRelays, getGitUrl } from '$lib/config.js'; import { DEFAULT_NOSTR_RELAYS, combineRelays, getGitUrl } from '$lib/config.js';
import { getUserRelays } from '$lib/services/nostr/user-relays.js'; import { getUserRelays } from '$lib/services/nostr/user-relays.js';
import { NostrClient } from '$lib/services/nostr/nostr-client.js'; import { NostrClient } from '$lib/services/nostr/nostr-client.js';
@ -17,7 +16,7 @@ import { existsSync } from 'fs';
import { rm } from 'fs/promises'; import { rm } from 'fs/promises';
import { join, resolve } from 'path'; import { join, resolve } from 'path';
import simpleGit from 'simple-git'; import simpleGit from 'simple-git';
import { isValidBranchName } from '$lib/utils/security.js'; import { isValidBranchName, validateRepoPath } from '$lib/utils/security.js';
import { ResourceLimits } from '$lib/services/security/resource-limits.js'; import { ResourceLimits } from '$lib/services/security/resource-limits.js';
import { auditLogger } from '$lib/services/security/audit-logger.js'; import { auditLogger } from '$lib/services/security/audit-logger.js';
import { ForkCountService } from '$lib/services/nostr/fork-count-service.js'; import { ForkCountService } from '$lib/services/nostr/fork-count-service.js';
@ -28,13 +27,12 @@ import { handleApiError, handleValidationError, handleNotFoundError, handleAutho
import { eventCache } from '$lib/services/nostr/event-cache.js'; import { eventCache } from '$lib/services/nostr/event-cache.js';
import { fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js'; import { fetchRepoAnnouncementsWithCache, findRepoAnnouncement } from '$lib/utils/nostr-utils.js';
import { repoManager, nostrClient, forkCountService } from '$lib/services/service-registry.js';
// Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths) // Resolve GIT_REPO_ROOT to absolute path (handles both relative and absolute paths)
const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos'; const repoRootEnv = process.env.GIT_REPO_ROOT || '/repos';
const repoRoot = resolve(repoRootEnv); const repoRoot = resolve(repoRootEnv);
const repoManager = new RepoManager(repoRoot);
const nostrClient = new NostrClient(DEFAULT_NOSTR_RELAYS);
const resourceLimits = new ResourceLimits(repoRoot); const resourceLimits = new ResourceLimits(repoRoot);
const forkCountService = new ForkCountService(DEFAULT_NOSTR_RELAYS);
/** /**
* Retry publishing an event with exponential backoff * Retry publishing an event with exponential backoff
@ -145,10 +143,9 @@ export const POST: RequestHandler = async ({ params, request }) => {
// Check if original repo exists // Check if original repo exists
const originalRepoPath = join(repoRoot, npub, `${repo}.git`); const originalRepoPath = join(repoRoot, npub, `${repo}.git`);
// Security: Ensure resolved path is within repoRoot // Security: Ensure resolved path is within repoRoot
const resolvedOriginalPath = resolve(originalRepoPath).replace(/\\/g, '/'); const originalPathValidation = validateRepoPath(originalRepoPath, repoRoot);
const resolvedRoot = resolve(repoRoot).replace(/\\/g, '/'); if (!originalPathValidation.valid) {
if (!resolvedOriginalPath.startsWith(resolvedRoot + '/')) { return error(403, originalPathValidation.error || 'Invalid repository path');
return error(403, 'Invalid repository path');
} }
if (!existsSync(originalRepoPath)) { if (!existsSync(originalRepoPath)) {
return error(404, 'Original repository not found'); return error(404, 'Original repository not found');
@ -165,9 +162,9 @@ export const POST: RequestHandler = async ({ params, request }) => {
// Check if fork already exists // Check if fork already exists
const forkRepoPath = join(repoRoot, userNpub, `${forkRepoName}.git`); const forkRepoPath = join(repoRoot, userNpub, `${forkRepoName}.git`);
// Security: Ensure resolved path is within repoRoot // Security: Ensure resolved path is within repoRoot
const resolvedForkPath = resolve(forkRepoPath).replace(/\\/g, '/'); const forkPathValidation = validateRepoPath(forkRepoPath, repoRoot);
if (!resolvedForkPath.startsWith(resolvedRoot + '/')) { if (!forkPathValidation.valid) {
return error(403, 'Invalid fork repository path'); return error(403, forkPathValidation.error || 'Invalid fork repository path');
} }
if (existsSync(forkRepoPath)) { if (existsSync(forkRepoPath)) {
return error(409, 'Fork already exists'); return error(409, 'Fork already exists');

121
src/routes/api/repos/[npub]/[repo]/issues/+server.ts

@ -9,11 +9,12 @@ import { createRepoGetHandler, withRepoValidation } from '$lib/utils/api-handler
import type { RepoRequestContext, RequestEvent } from '$lib/utils/api-context.js'; import type { RepoRequestContext, RequestEvent } from '$lib/utils/api-context.js';
import { handleValidationError, handleApiError } from '$lib/utils/error-handler.js'; import { handleValidationError, handleApiError } from '$lib/utils/error-handler.js';
import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js'; import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js';
import { MaintainerService } from '$lib/services/nostr/maintainer-service.js';
import { forwardEventIfEnabled } from '$lib/services/messaging/event-forwarder.js'; import { forwardEventIfEnabled } from '$lib/services/messaging/event-forwarder.js';
import logger from '$lib/services/logger.js'; import logger from '$lib/services/logger.js';
import { maintainerService } from '$lib/services/service-registry.js';
const maintainerService = new MaintainerService(DEFAULT_NOSTR_RELAYS); import { KIND, type NostrEvent } from '$lib/types/nostr.js';
import { verifyEvent } from 'nostr-tools';
import { validatePubkey } from '$lib/utils/input-validation.js';
export const GET: RequestHandler = createRepoGetHandler( export const GET: RequestHandler = createRepoGetHandler(
async (context: RepoRequestContext) => { async (context: RepoRequestContext) => {
@ -23,6 +24,59 @@ export const GET: RequestHandler = createRepoGetHandler(
{ operation: 'getIssues', requireRepoExists: false, requireRepoAccess: false } // Issues are stored in Nostr, don't require local repo { operation: 'getIssues', requireRepoExists: false, requireRepoAccess: false } // Issues are stored in Nostr, don't require local repo
); );
/**
* Validate issue event structure
*/
function validateIssueEvent(event: any, repoOwnerPubkey: string, repoName: string): event is NostrEvent {
if (!event || typeof event !== 'object') {
return false;
}
// Check required fields
if (!event.kind || event.kind !== KIND.ISSUE) {
return false;
}
if (!event.pubkey || typeof event.pubkey !== 'string') {
return false;
}
// Validate pubkey format
const pubkeyValidation = validatePubkey(event.pubkey);
if (!pubkeyValidation.valid) {
return false;
}
if (!event.id || typeof event.id !== 'string' || event.id.length !== 64) {
return false;
}
if (!event.sig || typeof event.sig !== 'string' || event.sig.length !== 128) {
return false;
}
if (typeof event.created_at !== 'number' || event.created_at <= 0) {
return false;
}
// Validate tags structure
if (!Array.isArray(event.tags)) {
return false;
}
// Validate content is a string
if (typeof event.content !== 'string') {
return false;
}
// Verify event signature
if (!verifyEvent(event as NostrEvent)) {
return false;
}
return true;
}
export const POST: RequestHandler = withRepoValidation( export const POST: RequestHandler = withRepoValidation(
async ({ repoContext, requestContext, event }) => { async ({ repoContext, requestContext, event }) => {
const body = await event.request.json(); const body = await event.request.json();
@ -32,9 +86,9 @@ export const POST: RequestHandler = withRepoValidation(
throw handleValidationError('Missing event in request body', { operation: 'createIssue', npub: repoContext.npub, repo: repoContext.repo }); throw handleValidationError('Missing event in request body', { operation: 'createIssue', npub: repoContext.npub, repo: repoContext.repo });
} }
// Verify the event is properly signed (basic check) // Validate event structure and signature
if (!issueEvent.sig || !issueEvent.id) { if (!validateIssueEvent(issueEvent, repoContext.repoOwnerPubkey, repoContext.repo)) {
throw handleValidationError('Invalid event: missing signature or ID', { operation: 'createIssue', npub: repoContext.npub, repo: repoContext.repo }); throw handleValidationError('Invalid event: missing required fields, invalid format, or invalid signature', { operation: 'createIssue', npub: repoContext.npub, repo: repoContext.repo });
} }
// Publish the event to relays // Publish the event to relays
@ -58,18 +112,61 @@ export const POST: RequestHandler = withRepoValidation(
{ operation: 'createIssue', requireRepoAccess: false } // Issues can be created by anyone with access { operation: 'createIssue', requireRepoAccess: false } // Issues can be created by anyone with access
); );
/**
* Validate issue status update request
*/
type StatusUpdateValidation =
| { valid: true; issueId: string; issueAuthor: string; status: 'open' | 'closed' | 'resolved' | 'draft' }
| { valid: false; error: string };
function validateStatusUpdate(body: any): StatusUpdateValidation {
if (!body || typeof body !== 'object') {
return { valid: false, error: 'Invalid request body' };
}
const { issueId, issueAuthor, status } = body;
if (!issueId || typeof issueId !== 'string' || issueId.length !== 64) {
return { valid: false, error: 'Invalid issueId: must be a 64-character hex string' };
}
if (!issueAuthor || typeof issueAuthor !== 'string') {
return { valid: false, error: 'Invalid issueAuthor: must be a string' };
}
// Validate pubkey format
const pubkeyValidation = validatePubkey(issueAuthor);
if (!pubkeyValidation.valid) {
return { valid: false, error: `Invalid issueAuthor: ${pubkeyValidation.error}` };
}
if (!status || typeof status !== 'string') {
return { valid: false, error: 'Invalid status: must be a string' };
}
// Validate status value - must match the service's expected types
const validStatuses: ('open' | 'closed' | 'resolved' | 'draft')[] = ['open', 'closed', 'resolved', 'draft'];
const normalizedStatus = status.toLowerCase() as 'open' | 'closed' | 'resolved' | 'draft';
if (!validStatuses.includes(normalizedStatus)) {
return { valid: false, error: `Invalid status: must be one of ${validStatuses.join(', ')}` };
}
return { valid: true, issueId, issueAuthor, status: normalizedStatus };
}
export const PATCH: RequestHandler = withRepoValidation( export const PATCH: RequestHandler = withRepoValidation(
async ({ repoContext, requestContext, event }) => { async ({ repoContext, requestContext, event }) => {
const body = await event.request.json(); const body = await event.request.json();
const { issueId, issueAuthor, status } = body;
// Validate request body
if (!issueId || !issueAuthor || !status) { const validation = validateStatusUpdate(body);
throw handleValidationError('Missing required fields: issueId, issueAuthor, status', { operation: 'updateIssueStatus', npub: repoContext.npub, repo: repoContext.repo }); if (!validation.valid) {
throw handleValidationError(validation.error || 'Invalid request', { operation: 'updateIssueStatus', npub: repoContext.npub, repo: repoContext.repo });
} }
const { issueId, issueAuthor, status } = validation;
// Check if user is maintainer or issue author // Check if user is maintainer or issue author
const { IssuesService } = await import('$lib/services/nostr/issues-service.js');
const issuesService = new IssuesService(DEFAULT_NOSTR_RELAYS);
const isMaintainer = await maintainerService.isMaintainer(requestContext.userPubkeyHex || '', repoContext.repoOwnerPubkey, repoContext.repo); const isMaintainer = await maintainerService.isMaintainer(requestContext.userPubkeyHex || '', repoContext.repoOwnerPubkey, repoContext.repo);
const isAuthor = requestContext.userPubkeyHex === issueAuthor; const isAuthor = requestContext.userPubkeyHex === issueAuthor;

46
src/routes/api/repos/[npub]/[repo]/prs/merge/+server.ts

@ -8,28 +8,50 @@ import type { RequestHandler } from './$types';
import { withRepoValidation } from '$lib/utils/api-handlers.js'; import { withRepoValidation } from '$lib/utils/api-handlers.js';
import type { RepoRequestContext } from '$lib/utils/api-context.js'; import type { RepoRequestContext } from '$lib/utils/api-context.js';
import { handleValidationError, handleApiError } from '$lib/utils/error-handler.js'; import { handleValidationError, handleApiError } from '$lib/utils/error-handler.js';
import { DEFAULT_NOSTR_RELAYS } from '$lib/config.js'; import { prsService, repoManager, fileManager, maintainerService } from '$lib/services/service-registry.js';
import { RepoManager } from '$lib/services/git/repo-manager.js';
import { FileManager } from '$lib/services/git/file-manager.js';
import { MaintainerService } from '$lib/services/nostr/maintainer-service.js';
import { prsService } from '$lib/services/service-registry.js';
import { simpleGit } from 'simple-git'; import { simpleGit } from 'simple-git';
import { join, resolve } from 'path'; import { join } from 'path';
import { existsSync } from 'fs'; import { existsSync } from 'fs';
import logger from '$lib/services/logger.js'; import logger from '$lib/services/logger.js';
import { isValidBranchName } from '$lib/utils/security.js';
import { validatePubkey } from '$lib/utils/input-validation.js';
const repoRoot = process.env.GIT_REPO_ROOT || '/repos'; const repoRoot = typeof process !== 'undefined' && process.env?.GIT_REPO_ROOT
const repoManager = new RepoManager(repoRoot); ? process.env.GIT_REPO_ROOT
const fileManager = new FileManager(repoRoot); : '/repos';
const maintainerService = new MaintainerService(DEFAULT_NOSTR_RELAYS);
export const POST: RequestHandler = withRepoValidation( export const POST: RequestHandler = withRepoValidation(
async ({ repoContext, requestContext, event }) => { async ({ repoContext, requestContext, event }) => {
const body = await event.request.json(); const body = await event.request.json();
const { prId, prAuthor, prCommitId, targetBranch = 'main', mergeMessage } = body; const { prId, prAuthor, prCommitId, targetBranch = 'main', mergeMessage } = body;
if (!prId || !prAuthor || !prCommitId) { // Validate required fields
throw handleValidationError('Missing required fields: prId, prAuthor, prCommitId', { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo }); if (!prId || typeof prId !== 'string' || prId.length !== 64) {
throw handleValidationError('Invalid prId: must be a 64-character hex string', { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
}
if (!prAuthor || typeof prAuthor !== 'string') {
throw handleValidationError('Invalid prAuthor: must be a string', { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
}
// Validate pubkey format
const pubkeyValidation = validatePubkey(prAuthor);
if (!pubkeyValidation.valid) {
throw handleValidationError(`Invalid prAuthor: ${pubkeyValidation.error}`, { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
}
if (!prCommitId || typeof prCommitId !== 'string' || prCommitId.length !== 40) {
throw handleValidationError('Invalid prCommitId: must be a 40-character commit hash', { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
}
// Validate branch name
if (!isValidBranchName(targetBranch)) {
throw handleValidationError(`Invalid branch name: ${targetBranch}`, { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
}
// Validate merge message if provided
if (mergeMessage && (typeof mergeMessage !== 'string' || mergeMessage.length > 10000)) {
throw handleValidationError('Invalid mergeMessage: must be a string with max 10000 characters', { operation: 'mergePR', npub: repoContext.npub, repo: repoContext.repo });
} }
// Check if user is maintainer // Check if user is maintainer

Loading…
Cancel
Save