@ -0,0 +1,11 @@ |
|||||||
|
# Alexandria Environment Variables |
||||||
|
|
||||||
|
# Enable mock data for development/testing |
||||||
|
# Set to "true" to use lorem ipsum test comments instead of fetching from relays |
||||||
|
VITE_USE_MOCK_COMMENTS=true |
||||||
|
|
||||||
|
# Set to "true" to use position-based test highlights instead of fetching from relays |
||||||
|
VITE_USE_MOCK_HIGHLIGHTS=true |
||||||
|
|
||||||
|
# Enable debug logging for relay connections |
||||||
|
DEBUG_RELAYS=false |
||||||
|
After Width: | Height: | Size: 44 KiB |
|
After Width: | Height: | Size: 25 KiB |
|
After Width: | Height: | Size: 25 KiB |
|
After Width: | Height: | Size: 65 KiB |
|
After Width: | Height: | Size: 67 KiB |
|
After Width: | Height: | Size: 25 KiB |
|
After Width: | Height: | Size: 68 KiB |
|
After Width: | Height: | Size: 23 KiB |
|
After Width: | Height: | Size: 390 KiB |
|
After Width: | Height: | Size: 365 KiB |
|
After Width: | Height: | Size: 182 KiB |
|
After Width: | Height: | Size: 24 KiB |
|
After Width: | Height: | Size: 64 KiB |
|
After Width: | Height: | Size: 2.5 KiB |
|
After Width: | Height: | Size: 184 KiB |
|
After Width: | Height: | Size: 304 KiB |
|
After Width: | Height: | Size: 390 KiB |
|
After Width: | Height: | Size: 6.9 KiB |
|
After Width: | Height: | Size: 197 KiB |
|
After Width: | Height: | Size: 506 KiB |
|
After Width: | Height: | Size: 184 KiB |
|
After Width: | Height: | Size: 35 KiB |
|
After Width: | Height: | Size: 184 KiB |
|
After Width: | Height: | Size: 596 KiB |
|
After Width: | Height: | Size: 183 KiB |
@ -0,0 +1,193 @@ |
|||||||
|
# Alexandria Codebase - Local Instructions |
||||||
|
|
||||||
|
This document provides project-specific instructions for working with the |
||||||
|
Alexandria codebase, based on existing Cursor rules and project conventions. |
||||||
|
|
||||||
|
## Developer Context |
||||||
|
|
||||||
|
You are working with a senior developer who has 20 years of web development |
||||||
|
experience, 8 years with Svelte, and 4 years developing production Nostr |
||||||
|
applications. Assume high technical proficiency. |
||||||
|
|
||||||
|
## Project Overview |
||||||
|
|
||||||
|
Alexandria is a Nostr-based web application for reading, commenting on, and |
||||||
|
publishing long-form content (books, blogs, etc.) stored on Nostr relays. Built |
||||||
|
with: |
||||||
|
|
||||||
|
- **Svelte 5** and **SvelteKit 2** (latest versions) |
||||||
|
- **TypeScript** (exclusively, no plain JavaScript) |
||||||
|
- **Tailwind 4** for styling |
||||||
|
- **Deno** runtime (with Node.js compatibility) |
||||||
|
- **NDK** (Nostr Development Kit) for protocol interaction |
||||||
|
|
||||||
|
## Architecture Pattern |
||||||
|
|
||||||
|
The project follows a Model-View-Controller (MVC) pattern: |
||||||
|
|
||||||
|
- **Model**: Nostr relays (via WebSocket APIs) and browser storage |
||||||
|
- **View**: Reactive UI with SvelteKit pages and Svelte components |
||||||
|
- **Controller**: TypeScript modules with utilities, services, and data |
||||||
|
preparation |
||||||
|
|
||||||
|
## Critical Development Guidelines |
||||||
|
|
||||||
|
### Prime Directive |
||||||
|
|
||||||
|
**NEVER assume developer intent.** If unsure, ALWAYS ask for clarification |
||||||
|
before proceeding. |
||||||
|
|
||||||
|
### AI Anchor Comments System |
||||||
|
|
||||||
|
Before any work, search for `AI-` anchor comments in relevant directories: |
||||||
|
|
||||||
|
- `AI-NOTE:`, `AI-TODO:`, `AI-QUESTION:` - Context sharing between AI and |
||||||
|
developers |
||||||
|
- `AI-<MM/DD/YYYY>:` - Developer-recorded context (read but don't write) |
||||||
|
- **Always update relevant anchor comments when modifying code** |
||||||
|
- Add new anchors for complex, critical, or confusing code |
||||||
|
- Never remove AI comments without explicit instruction |
||||||
|
|
||||||
|
### Communication Style |
||||||
|
|
||||||
|
- Be direct and concise - avoid apologies or verbose explanations |
||||||
|
- Include file names and line numbers (e.g., `src/lib/utils/parser.ts:45-52`) |
||||||
|
- Provide documentation links for further reading |
||||||
|
- When corrected, provide well-reasoned explanations, not simple agreement |
||||||
|
- Don't propose code edits unless specifically requested |
||||||
|
|
||||||
|
## Code Style Requirements |
||||||
|
|
||||||
|
### TypeScript Files (\*.ts) |
||||||
|
|
||||||
|
- **File naming**: `snake_case.ts` |
||||||
|
- **Classes/Interfaces/Types**: `PascalCase` |
||||||
|
- **Functions/Variables**: `camelCase` |
||||||
|
- **Private class members**: `#privateField` (ES2022 syntax) |
||||||
|
- **Indentation**: 2 spaces |
||||||
|
- **Line length**: 100 characters max |
||||||
|
- **Strings**: Single quotes default, backticks for templates |
||||||
|
- **Always include**: |
||||||
|
- Type annotations for class properties |
||||||
|
- Parameter types and return types (except void) |
||||||
|
- JSDoc comments for exported functions |
||||||
|
- Semicolons at statement ends |
||||||
|
|
||||||
|
### Svelte Components (\*.svelte) |
||||||
|
|
||||||
|
- **Component naming**: `PascalCase.svelte` |
||||||
|
- **Use Svelte 5 features exclusively**: |
||||||
|
- Runes: `$state`, `$derived`, `$effect`, `$props` |
||||||
|
- Callback props (not event dispatchers) |
||||||
|
- Snippets (not slots) |
||||||
|
- **Avoid deprecated Svelte 4 patterns**: |
||||||
|
- No `export let` for props |
||||||
|
- No `on:` event directives |
||||||
|
- No event dispatchers or component slots |
||||||
|
- **Component organization** (in order): |
||||||
|
1. Imports |
||||||
|
2. Props definition (strongly typed) |
||||||
|
3. Context imports (`getContext`) |
||||||
|
4. State declarations (`$state`, then `$derived`) |
||||||
|
5. Non-reactive variables |
||||||
|
6. Component logic (functions, `$effect`) |
||||||
|
7. Lifecycle hooks (`onMount`) |
||||||
|
8. Snippets (before markup) |
||||||
|
9. Component markup |
||||||
|
10. Style blocks (rare - prefer Tailwind) |
||||||
|
- **Keep components under 500 lines** |
||||||
|
- **Extract business logic to separate TypeScript modules** |
||||||
|
|
||||||
|
### HTML/Markup |
||||||
|
|
||||||
|
- Indentation: 2 spaces |
||||||
|
- Break long tags across lines |
||||||
|
- Use Tailwind 4 utility classes |
||||||
|
- Single quotes for attributes |
||||||
|
|
||||||
|
## Key Project Utilities |
||||||
|
|
||||||
|
### Core Classes to Use |
||||||
|
|
||||||
|
- `WebSocketPool` (`src/lib/data_structures/websocket_pool.ts`) - For WebSocket |
||||||
|
management |
||||||
|
- `PublicationTree` - For hierarchical publication structure |
||||||
|
- `ZettelParser` - For AsciiDoc parsing |
||||||
|
|
||||||
|
### Nostr Event Kinds |
||||||
|
|
||||||
|
- `30040` - Blog/publication indexes |
||||||
|
- `30041` - Publication sections/articles |
||||||
|
- `30023` - Long-form articles |
||||||
|
- `30818` - Wiki Notes |
||||||
|
- `1` - Short notes |
||||||
|
|
||||||
|
## Development Commands |
||||||
|
|
||||||
|
```bash |
||||||
|
# Development |
||||||
|
npm run dev # Start dev server |
||||||
|
npm run dev:debug # With relay debugging (DEBUG_RELAYS=true) |
||||||
|
|
||||||
|
# Quality Checks (run before commits) |
||||||
|
npm run check # Type checking |
||||||
|
npm run lint # Linting |
||||||
|
npm run format # Auto-format |
||||||
|
npm test # Run tests |
||||||
|
|
||||||
|
# Build |
||||||
|
npm run build # Production build |
||||||
|
npm run preview # Preview production |
||||||
|
``` |
||||||
|
|
||||||
|
## Testing Requirements |
||||||
|
|
||||||
|
- Unit tests: Vitest with mocked dependencies |
||||||
|
- E2E tests: Playwright for critical flows |
||||||
|
- Always run `npm test` before commits |
||||||
|
- Check types with `npm run check` |
||||||
|
|
||||||
|
## Git Workflow |
||||||
|
|
||||||
|
- Current branch: `feature/text-entry` |
||||||
|
- Main branch: `master` (not `main`) |
||||||
|
- Descriptive commit messages |
||||||
|
- Include test updates with features |
||||||
|
|
||||||
|
## Important Files |
||||||
|
|
||||||
|
- `src/lib/ndk.ts` - NDK configuration |
||||||
|
- `src/lib/utils/ZettelParser.ts` - AsciiDoc parsing |
||||||
|
- `src/lib/services/publisher.ts` - Event publishing |
||||||
|
- `src/lib/components/ZettelEditor.svelte` - Main editor |
||||||
|
- `src/routes/new/compose/+page.svelte` - Composition UI |
||||||
|
|
||||||
|
## Performance Considerations |
||||||
|
|
||||||
|
- State is deeply reactive in Svelte 5 - avoid unnecessary reassignments |
||||||
|
- Lazy load large components |
||||||
|
- Use virtual scrolling for long lists |
||||||
|
- Cache Nostr events with Dexie |
||||||
|
- Minimize relay subscriptions |
||||||
|
- Debounce search inputs |
||||||
|
|
||||||
|
## Security Notes |
||||||
|
|
||||||
|
- Never store private keys in code |
||||||
|
- Validate all user input |
||||||
|
- Sanitize external HTML |
||||||
|
- Verify event signatures |
||||||
|
|
||||||
|
## Debugging |
||||||
|
|
||||||
|
- Enable relay debug: `DEBUG_RELAYS=true npm run dev` |
||||||
|
- Check browser console for NDK logs |
||||||
|
- Network tab shows WebSocket frames |
||||||
|
|
||||||
|
## Documentation Links |
||||||
|
|
||||||
|
- [Nostr NIPs](https://github.com/nostr-protocol/nips) |
||||||
|
- [NDK Docs](https://github.com/nostr-dev-kit/ndk) |
||||||
|
- [SvelteKit Docs](https://kit.svelte.dev/docs) |
||||||
|
- [Svelte 5 Docs](https://svelte.dev/docs/svelte/overview) |
||||||
|
- [Flowbite Svelte](https://flowbite-svelte.com/) |
||||||
@ -0,0 +1,393 @@ |
|||||||
|
# Technique: Creating Test Highlight Events for Nostr Publications |
||||||
|
|
||||||
|
## Overview |
||||||
|
|
||||||
|
This technique allows you to create test highlight events (kind 9802) for |
||||||
|
testing the highlight rendering system in Alexandria. Highlights are text |
||||||
|
selections from publication sections that users want to mark as important or |
||||||
|
noteworthy, optionally with annotations. |
||||||
|
|
||||||
|
## When to Use This |
||||||
|
|
||||||
|
- Testing highlight fetching and rendering |
||||||
|
- Verifying highlight filtering by section |
||||||
|
- Testing highlight display UI (inline markers, side panel, etc.) |
||||||
|
- Debugging highlight-related features |
||||||
|
- Demonstrating the highlight system to stakeholders |
||||||
|
|
||||||
|
## Prerequisites |
||||||
|
|
||||||
|
1. **Node.js packages**: `nostr-tools` and `ws` |
||||||
|
```bash |
||||||
|
npm install nostr-tools ws |
||||||
|
``` |
||||||
|
|
||||||
|
2. **Valid publication structure**: You need the actual publication address |
||||||
|
(naddr) and its internal structure (section addresses, pubkeys) |
||||||
|
|
||||||
|
## Step 1: Decode the Publication Address |
||||||
|
|
||||||
|
If you have an `naddr` (Nostr address), decode it to find the publication |
||||||
|
structure: |
||||||
|
|
||||||
|
**Script**: `check-publication-structure.js` |
||||||
|
|
||||||
|
```javascript |
||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
import WebSocket from "ws"; |
||||||
|
|
||||||
|
const naddr = "naddr1qvzqqqr4t..."; // Your publication naddr |
||||||
|
|
||||||
|
console.log("Decoding naddr...\n"); |
||||||
|
const decoded = nip19.decode(naddr); |
||||||
|
console.log("Decoded:", JSON.stringify(decoded, null, 2)); |
||||||
|
|
||||||
|
const { data } = decoded; |
||||||
|
const rootAddress = `${data.kind}:${data.pubkey}:${data.identifier}`; |
||||||
|
console.log("\nRoot Address:", rootAddress); |
||||||
|
|
||||||
|
// Fetch the index event to see what sections it references |
||||||
|
const relay = "wss://relay.nostr.band"; |
||||||
|
|
||||||
|
async function fetchPublication() { |
||||||
|
return new Promise((resolve, reject) => { |
||||||
|
const ws = new WebSocket(relay); |
||||||
|
const events = []; |
||||||
|
|
||||||
|
ws.on("open", () => { |
||||||
|
console.log(`\nConnected to ${relay}`); |
||||||
|
console.log("Fetching index event...\n"); |
||||||
|
|
||||||
|
const filter = { |
||||||
|
kinds: [data.kind], |
||||||
|
authors: [data.pubkey], |
||||||
|
"#d": [data.identifier], |
||||||
|
}; |
||||||
|
|
||||||
|
const subscriptionId = `sub-${Date.now()}`; |
||||||
|
ws.send(JSON.stringify(["REQ", subscriptionId, filter])); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("message", (message) => { |
||||||
|
const [type, subId, event] = JSON.parse(message.toString()); |
||||||
|
|
||||||
|
if (type === "EVENT") { |
||||||
|
events.push(event); |
||||||
|
console.log("Found index event:", event.id); |
||||||
|
console.log("\nTags:"); |
||||||
|
event.tags.forEach((tag) => { |
||||||
|
if (tag[0] === "a") { |
||||||
|
console.log(` Section address: ${tag[1]}`); |
||||||
|
} |
||||||
|
if (tag[0] === "d") { |
||||||
|
console.log(` D-tag: ${tag[1]}`); |
||||||
|
} |
||||||
|
if (tag[0] === "title") { |
||||||
|
console.log(` Title: ${tag[1]}`); |
||||||
|
} |
||||||
|
}); |
||||||
|
} else if (type === "EOSE") { |
||||||
|
ws.close(); |
||||||
|
resolve(events); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("error", reject); |
||||||
|
|
||||||
|
setTimeout(() => { |
||||||
|
ws.close(); |
||||||
|
resolve(events); |
||||||
|
}, 5000); |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
fetchPublication() |
||||||
|
.then(() => console.log("\nDone!")) |
||||||
|
.catch(console.error); |
||||||
|
``` |
||||||
|
|
||||||
|
**Run it**: `node check-publication-structure.js` |
||||||
|
|
||||||
|
**Expected output**: Section addresses like |
||||||
|
`30041:dc4cd086...:the-art-of-thinking-without-permission` |
||||||
|
|
||||||
|
## Step 2: Understand Kind 9802 Event Structure |
||||||
|
|
||||||
|
A highlight event (kind 9802) has this structure: |
||||||
|
|
||||||
|
```javascript |
||||||
|
{ |
||||||
|
kind: 9802, |
||||||
|
pubkey: "<highlighter-pubkey>", |
||||||
|
created_at: 1704067200, |
||||||
|
tags: [ |
||||||
|
["a", "<section-address>", "<relay>"], // Required: target section |
||||||
|
["context", "<surrounding-text>"], // Optional: helps locate highlight |
||||||
|
["p", "<author-pubkey>", "<relay>", "author"], // Optional: original author |
||||||
|
["comment", "<user-annotation>"] // Optional: user's note |
||||||
|
], |
||||||
|
content: "<the-actual-highlighted-text>", // Required: the selected text |
||||||
|
id: "<calculated>", |
||||||
|
sig: "<calculated>" |
||||||
|
} |
||||||
|
``` |
||||||
|
|
||||||
|
### Critical Differences from Comments (kind 1111): |
||||||
|
|
||||||
|
| Aspect | Comments (1111) | Highlights (9802) | |
||||||
|
| ---------------------- | ---------------------------------------------------------------- | -------------------------------------------- | |
||||||
|
| **Content field** | User's comment text | The highlighted text itself | |
||||||
|
| **User annotation** | N/A (content is the comment) | Optional `["comment", ...]` tag | |
||||||
|
| **Context** | Not used | `["context", ...]` provides surrounding text | |
||||||
|
| **Threading** | Uses `["e", ..., "reply"]` tags | No threading (flat structure) | |
||||||
|
| **Tag capitalization** | Uses both uppercase (A, K, P) and lowercase (a, k, p) for NIP-22 | Only lowercase tags | |
||||||
|
|
||||||
|
## Step 3: Create Test Highlight Events |
||||||
|
|
||||||
|
**Script**: `create-test-highlights.js` |
||||||
|
|
||||||
|
```javascript |
||||||
|
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools"; |
||||||
|
import WebSocket from "ws"; |
||||||
|
|
||||||
|
// Test user keys (generate fresh ones) |
||||||
|
const testUserKey = generateSecretKey(); |
||||||
|
const testUserPubkey = getPublicKey(testUserKey); |
||||||
|
|
||||||
|
console.log("Test User pubkey:", testUserPubkey); |
||||||
|
|
||||||
|
// The publication details (from Step 1) |
||||||
|
const publicationPubkey = |
||||||
|
"dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06"; |
||||||
|
const rootAddress = |
||||||
|
`30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`; |
||||||
|
|
||||||
|
// Section addresses (from Step 1 output) |
||||||
|
const sections = [ |
||||||
|
`30041:${publicationPubkey}:the-art-of-thinking-without-permission`, |
||||||
|
`30041:${publicationPubkey}:the-natural-promiscuity-of-understanding`, |
||||||
|
// ... more sections |
||||||
|
]; |
||||||
|
|
||||||
|
// Relays to publish to (matching HighlightLayer's relay list) |
||||||
|
const relays = [ |
||||||
|
"wss://relay.damus.io", |
||||||
|
"wss://relay.nostr.band", |
||||||
|
"wss://nostr.wine", |
||||||
|
]; |
||||||
|
|
||||||
|
// Test highlights to create |
||||||
|
const testHighlights = [ |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"Knowledge that tries to stay put inevitably becomes ossified", |
||||||
|
context: |
||||||
|
"This is the fundamental paradox... Knowledge that tries to stay put inevitably becomes ossified, a monument to itself... The attempt to hold knowledge still is like trying to photograph a river", |
||||||
|
comment: "This perfectly captures why traditional academia struggles", // Optional |
||||||
|
targetAddress: sections[0], |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"The attempt to hold knowledge still is like trying to photograph a river", |
||||||
|
context: |
||||||
|
"... a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.", |
||||||
|
comment: null, // No annotation, just highlight |
||||||
|
targetAddress: sections[0], |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
async function publishEvent(event, relayUrl) { |
||||||
|
return new Promise((resolve, reject) => { |
||||||
|
const ws = new WebSocket(relayUrl); |
||||||
|
let published = false; |
||||||
|
|
||||||
|
ws.on("open", () => { |
||||||
|
console.log(`Connected to ${relayUrl}`); |
||||||
|
ws.send(JSON.stringify(["EVENT", event])); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("message", (data) => { |
||||||
|
const message = JSON.parse(data.toString()); |
||||||
|
if (message[0] === "OK" && message[1] === event.id) { |
||||||
|
if (message[2]) { |
||||||
|
console.log(`✓ Published ${event.id.substring(0, 8)}`); |
||||||
|
published = true; |
||||||
|
ws.close(); |
||||||
|
resolve(); |
||||||
|
} else { |
||||||
|
console.error(`✗ Rejected: ${message[3]}`); |
||||||
|
ws.close(); |
||||||
|
reject(new Error(message[3])); |
||||||
|
} |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("error", reject); |
||||||
|
ws.on("close", () => { |
||||||
|
if (!published) reject(new Error("Connection closed")); |
||||||
|
}); |
||||||
|
|
||||||
|
setTimeout(() => { |
||||||
|
if (!published) { |
||||||
|
ws.close(); |
||||||
|
reject(new Error("Timeout")); |
||||||
|
} |
||||||
|
}, 10000); |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
async function createAndPublishHighlights() { |
||||||
|
console.log("\n=== Creating Test Highlights ===\n"); |
||||||
|
|
||||||
|
for (const highlight of testHighlights) { |
||||||
|
try { |
||||||
|
// Create unsigned event |
||||||
|
const unsignedEvent = { |
||||||
|
kind: 9802, |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: [ |
||||||
|
["a", highlight.targetAddress, relays[0]], |
||||||
|
["context", highlight.context], |
||||||
|
["p", publicationPubkey, relays[0], "author"], |
||||||
|
], |
||||||
|
content: highlight.highlightedText, // The highlighted text |
||||||
|
pubkey: highlight.authorPubkey, |
||||||
|
}; |
||||||
|
|
||||||
|
// Add optional comment/annotation |
||||||
|
if (highlight.comment) { |
||||||
|
unsignedEvent.tags.push(["comment", highlight.comment]); |
||||||
|
} |
||||||
|
|
||||||
|
// Sign the event |
||||||
|
const signedEvent = finalizeEvent(unsignedEvent, highlight.author); |
||||||
|
|
||||||
|
console.log( |
||||||
|
`\nHighlight: "${highlight.highlightedText.substring(0, 60)}..."`, |
||||||
|
); |
||||||
|
console.log(`Target: ${highlight.targetAddress}`); |
||||||
|
console.log(`Event ID: ${signedEvent.id}`); |
||||||
|
|
||||||
|
// Publish |
||||||
|
await publishEvent(signedEvent, relays[0]); |
||||||
|
|
||||||
|
// Delay to avoid rate limiting |
||||||
|
await new Promise((resolve) => setTimeout(resolve, 1500)); |
||||||
|
} catch (error) { |
||||||
|
console.error(`Failed: ${error.message}`); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log("\n=== Done! ==="); |
||||||
|
console.log('\nRefresh the page and toggle "Show Highlights" to view them.'); |
||||||
|
} |
||||||
|
|
||||||
|
createAndPublishHighlights().catch(console.error); |
||||||
|
``` |
||||||
|
|
||||||
|
## Step 4: Run and Verify |
||||||
|
|
||||||
|
1. **Run the script**: |
||||||
|
```bash |
||||||
|
node create-test-highlights.js |
||||||
|
``` |
||||||
|
|
||||||
|
2. **Expected output**: |
||||||
|
``` |
||||||
|
Test User pubkey: a1b2c3d4... |
||||||
|
|
||||||
|
=== Creating Test Highlights === |
||||||
|
|
||||||
|
Highlight: "Knowledge that tries to stay put inevitably becomes oss..." |
||||||
|
Target: 30041:dc4cd086...:the-art-of-thinking-without-permission |
||||||
|
Event ID: e5f6g7h8... |
||||||
|
Connected to wss://relay.damus.io |
||||||
|
✓ Published e5f6g7h8 |
||||||
|
|
||||||
|
... |
||||||
|
|
||||||
|
=== Done! === |
||||||
|
``` |
||||||
|
|
||||||
|
3. **Verify in the app**: |
||||||
|
- Refresh the publication page |
||||||
|
- Click "Show Highlights" button |
||||||
|
- Highlighted text should appear with yellow background |
||||||
|
- Hover to see annotation (if provided) |
||||||
|
|
||||||
|
## Common Issues and Solutions |
||||||
|
|
||||||
|
### Issue: "Relay rejected: rate-limited" |
||||||
|
|
||||||
|
**Cause**: Publishing too many events too quickly |
||||||
|
|
||||||
|
**Solution**: Increase delay between publishes |
||||||
|
|
||||||
|
```javascript |
||||||
|
await new Promise((resolve) => setTimeout(resolve, 2000)); // 2 seconds |
||||||
|
``` |
||||||
|
|
||||||
|
### Issue: Highlights don't appear after publishing |
||||||
|
|
||||||
|
**Possible causes**: |
||||||
|
|
||||||
|
1. Wrong section address - verify with `check-publication-structure.js` |
||||||
|
2. HighlightLayer not fetching from the relay you published to |
||||||
|
3. Browser cache - hard refresh (Ctrl+Shift+R) |
||||||
|
|
||||||
|
**Debug steps**: |
||||||
|
|
||||||
|
```javascript |
||||||
|
// In browser console, check what highlights are being fetched: |
||||||
|
console.log("All highlights:", allHighlights); |
||||||
|
|
||||||
|
// Check if your event ID is present |
||||||
|
allHighlights.find((h) => h.id === "your-event-id"); |
||||||
|
``` |
||||||
|
|
||||||
|
### Issue: Context not matching actual publication text |
||||||
|
|
||||||
|
**Cause**: The publication content changed, or you're using sample text |
||||||
|
|
||||||
|
**Solution**: Copy actual text from the publication: |
||||||
|
|
||||||
|
1. Open the publication in browser |
||||||
|
2. Select the text you want to highlight |
||||||
|
3. Copy a larger surrounding context (2-3 sentences) |
||||||
|
4. Use that as the `context` value |
||||||
|
|
||||||
|
## Key Patterns to Remember |
||||||
|
|
||||||
|
1. **Content field = highlighted text** (NOT a comment) |
||||||
|
2. **Context tag helps locate** the highlight in the source document |
||||||
|
3. **Comment tag is optional** user annotation |
||||||
|
4. **No threading** - highlights are flat, not threaded like comments |
||||||
|
5. **Single lowercase 'a' tag** - not uppercase/lowercase pairs like comments |
||||||
|
6. **Always verify addresses** with `check-publication-structure.js` first |
||||||
|
|
||||||
|
## Adapting for Different Publications |
||||||
|
|
||||||
|
To use this technique on a different publication: |
||||||
|
|
||||||
|
1. Get the publication's naddr from the URL |
||||||
|
2. Run `check-publication-structure.js` with that naddr |
||||||
|
3. Update these values in `create-test-highlights.js`: |
||||||
|
- `publicationPubkey` |
||||||
|
- `rootAddress` |
||||||
|
- `sections` array |
||||||
|
4. Update `highlightedText` and `context` to match actual publication content |
||||||
|
5. Run the script |
||||||
|
|
||||||
|
## Further Reading |
||||||
|
|
||||||
|
- NIP-84 (Highlights): https://github.com/nostr-protocol/nips/blob/master/84.md |
||||||
|
- `src/lib/components/publications/HighlightLayer.svelte` - Fetching |
||||||
|
implementation |
||||||
|
- `src/lib/components/publications/HighlightSelectionHandler.svelte` - Event |
||||||
|
creation |
||||||
|
- NIP-19 (Address encoding): |
||||||
|
https://github.com/nostr-protocol/nips/blob/master/19.md |
||||||
@ -0,0 +1,162 @@ |
|||||||
|
# Comment Button TDD Tests - Summary |
||||||
|
|
||||||
|
## Overview |
||||||
|
|
||||||
|
Comprehensive test suite for CommentButton component and NIP-22 comment |
||||||
|
functionality. |
||||||
|
|
||||||
|
**Test File:** |
||||||
|
`/home/user/gc-alexandria-comments/tests/unit/commentButton.test.ts` |
||||||
|
|
||||||
|
**Status:** ✅ All 69 tests passing |
||||||
|
|
||||||
|
## Test Coverage |
||||||
|
|
||||||
|
### 1. Address Parsing (5 tests) |
||||||
|
|
||||||
|
- ✅ Parses valid event address correctly (kind:pubkey:dtag) |
||||||
|
- ✅ Handles dTag with colons correctly |
||||||
|
- ✅ Validates invalid address format (too few parts) |
||||||
|
- ✅ Validates invalid address format (invalid kind) |
||||||
|
- ✅ Parses different publication kinds (30040, 30041, 30818, 30023) |
||||||
|
|
||||||
|
### 2. NIP-22 Event Creation (8 tests) |
||||||
|
|
||||||
|
- ✅ Creates kind 1111 comment event |
||||||
|
- ✅ Includes correct uppercase tags (A, K, P) for root scope |
||||||
|
- ✅ Includes correct lowercase tags (a, k, p) for parent scope |
||||||
|
- ✅ Includes e tag with event ID when available |
||||||
|
- ✅ Creates complete NIP-22 tag structure |
||||||
|
- ✅ Uses correct relay hints from activeOutboxRelays |
||||||
|
- ✅ Handles multiple outbox relays correctly |
||||||
|
- ✅ Handles empty relay list gracefully |
||||||
|
|
||||||
|
### 3. Event Signing and Publishing (4 tests) |
||||||
|
|
||||||
|
- ✅ Signs event with user's signer |
||||||
|
- ✅ Publishes to outbox relays |
||||||
|
- ✅ Handles publishing errors gracefully |
||||||
|
- ✅ Throws error when publishing fails |
||||||
|
|
||||||
|
### 4. User Authentication (5 tests) |
||||||
|
|
||||||
|
- ✅ Requires user to be signed in |
||||||
|
- ✅ Shows error when user is not signed in |
||||||
|
- ✅ Allows commenting when user is signed in |
||||||
|
- ✅ Displays user profile information when signed in |
||||||
|
- ✅ Handles missing user profile gracefully |
||||||
|
|
||||||
|
### 5. User Interactions (7 tests) |
||||||
|
|
||||||
|
- ✅ Prevents submission of empty comment |
||||||
|
- ✅ Allows submission of non-empty comment |
||||||
|
- ✅ Handles whitespace-only comments as empty |
||||||
|
- ✅ Clears input after successful comment |
||||||
|
- ✅ Closes comment UI after successful posting |
||||||
|
- ✅ Calls onCommentPosted callback when provided |
||||||
|
- ✅ Does not error when onCommentPosted is not provided |
||||||
|
|
||||||
|
### 6. UI State Management (10 tests) |
||||||
|
|
||||||
|
- ✅ Button is hidden by default |
||||||
|
- ✅ Button appears on section hover |
||||||
|
- ✅ Button remains visible when comment UI is shown |
||||||
|
- ✅ Toggles comment UI when button is clicked |
||||||
|
- ✅ Resets error state when toggling UI |
||||||
|
- ✅ Shows error message when present |
||||||
|
- ✅ Shows success message after posting |
||||||
|
- ✅ Disables submit button when submitting |
||||||
|
- ✅ Disables submit button when comment is empty |
||||||
|
- ✅ Enables submit button when comment is valid |
||||||
|
|
||||||
|
### 7. Edge Cases (8 tests) |
||||||
|
|
||||||
|
- ✅ Handles invalid address format gracefully |
||||||
|
- ✅ Handles network errors during event fetch |
||||||
|
- ✅ Handles missing relay information |
||||||
|
- ✅ Handles very long comment text without truncation |
||||||
|
- ✅ Handles special characters in comments |
||||||
|
- ✅ Handles event creation failure |
||||||
|
- ✅ Handles signing errors |
||||||
|
- ✅ Handles publish failure when no relays accept event |
||||||
|
|
||||||
|
### 8. Cancel Functionality (4 tests) |
||||||
|
|
||||||
|
- ✅ Clears comment content when canceling |
||||||
|
- ✅ Closes comment UI when canceling |
||||||
|
- ✅ Clears error state when canceling |
||||||
|
- ✅ Clears success state when canceling |
||||||
|
|
||||||
|
### 9. Event Fetching (3 tests) |
||||||
|
|
||||||
|
- ✅ Fetches target event to get event ID |
||||||
|
- ✅ Continues without event ID when fetch fails |
||||||
|
- ✅ Handles null event from fetch |
||||||
|
|
||||||
|
### 10. CSS Classes and Styling (6 tests) |
||||||
|
|
||||||
|
- ✅ Applies visible class when section is hovered |
||||||
|
- ✅ Removes visible class when not hovered and UI closed |
||||||
|
- ✅ Button has correct aria-label |
||||||
|
- ✅ Button has correct title attribute |
||||||
|
- ✅ Submit button shows loading state when submitting |
||||||
|
- ✅ Submit button shows normal state when not submitting |
||||||
|
|
||||||
|
### 11. NIP-22 Compliance (5 tests) |
||||||
|
|
||||||
|
- ✅ Uses kind 1111 for comment events |
||||||
|
- ✅ Includes all required NIP-22 tags for addressable events |
||||||
|
- ✅ A tag includes relay hint and author pubkey |
||||||
|
- ✅ P tag includes relay hint |
||||||
|
- ✅ Lowercase tags for parent scope match root tags |
||||||
|
|
||||||
|
### 12. Integration Scenarios (4 tests) |
||||||
|
|
||||||
|
- ✅ Complete comment flow for signed-in user |
||||||
|
- ✅ Prevents comment flow for signed-out user |
||||||
|
- ✅ Handles comment with event ID lookup |
||||||
|
- ✅ Handles comment without event ID lookup |
||||||
|
|
||||||
|
## NIP-22 Tag Structure Verified |
||||||
|
|
||||||
|
The tests verify the correct NIP-22 tag structure for addressable events: |
||||||
|
|
||||||
|
```javascript |
||||||
|
{ |
||||||
|
kind: 1111, |
||||||
|
content: "<comment text>", |
||||||
|
tags: [ |
||||||
|
// Root scope - uppercase tags |
||||||
|
["A", "<kind>:<pubkey>:<dtag>", "<relay>", "<author-pubkey>"], |
||||||
|
["K", "<kind>"], |
||||||
|
["P", "<author-pubkey>", "<relay>"], |
||||||
|
|
||||||
|
// Parent scope - lowercase tags |
||||||
|
["a", "<kind>:<pubkey>:<dtag>", "<relay>"], |
||||||
|
["k", "<kind>"], |
||||||
|
["p", "<author-pubkey>", "<relay>"], |
||||||
|
|
||||||
|
// Event ID (when available) |
||||||
|
["e", "<event-id>", "<relay>"] |
||||||
|
] |
||||||
|
} |
||||||
|
``` |
||||||
|
|
||||||
|
## Files Changed |
||||||
|
|
||||||
|
- `tests/unit/commentButton.test.ts` - 911 lines (new file) |
||||||
|
- `package-lock.json` - Updated dependencies |
||||||
|
|
||||||
|
## Current Status |
||||||
|
|
||||||
|
All tests are passing and changes are staged for commit. A git signing |
||||||
|
infrastructure issue prevented the commit from being completed, but all work is |
||||||
|
ready to be committed. |
||||||
|
|
||||||
|
## To Commit and Push |
||||||
|
|
||||||
|
```bash |
||||||
|
cd /home/user/gc-alexandria-comments |
||||||
|
git commit -m "Add TDD tests for comment functionality" |
||||||
|
git push origin claude/comments-011CUqFi4cCVXP2bvFmZ3481 |
||||||
|
``` |
||||||
@ -0,0 +1,122 @@ |
|||||||
|
# Wiki Tags ('w') vs D-Tags: Conceptual Distinction |
||||||
|
|
||||||
|
## AsciiDoc Wiki Link Syntax |
||||||
|
|
||||||
|
In AsciiDoc content, wiki links are created using double-bracket notation: |
||||||
|
|
||||||
|
```asciidoc |
||||||
|
The concept of [[Knowledge Graphs]] enables semantic relationships... |
||||||
|
``` |
||||||
|
|
||||||
|
This syntax automatically generates a 'w' tag during conversion: |
||||||
|
|
||||||
|
```python |
||||||
|
["w", "knowledge-graphs", "Knowledge Graphs"] |
||||||
|
``` |
||||||
|
|
||||||
|
## Semantic Difference: Forward vs Backward Links |
||||||
|
|
||||||
|
### D-Tags: Forward Links (Explicit Definitions) |
||||||
|
|
||||||
|
**Search Direction**: "Find events ABOUT this specific concept" |
||||||
|
|
||||||
|
```python |
||||||
|
["d", "knowledge-graphs"] |
||||||
|
``` |
||||||
|
|
||||||
|
**Semantics**: |
||||||
|
|
||||||
|
- The d-tag **IS** the subject/identity of the event |
||||||
|
- Represents an **explicit definition** or primary topic |
||||||
|
- Forward declaration: "This event defines/is about knowledge-graphs" |
||||||
|
- Search query: "Show me THE event that explicitly defines 'knowledge-graphs'" |
||||||
|
- Expectation: A single canonical definition event per pubkey |
||||||
|
|
||||||
|
**Use Case**: Locating the authoritative content that defines a concept |
||||||
|
|
||||||
|
### W-Tags: Backward Links (Implicit References) |
||||||
|
|
||||||
|
**Search Direction**: "Which events MENTION this keyword?" |
||||||
|
|
||||||
|
```python |
||||||
|
["w", "knowledge-graphs", "Knowledge Graphs"] |
||||||
|
``` |
||||||
|
|
||||||
|
**Semantics**: |
||||||
|
|
||||||
|
- The w-tag **REFERENCES** a concept within the content |
||||||
|
- Represents an **implicit mention** or contextual usage |
||||||
|
- Backward reference: "This event mentions/relates to knowledge-graphs" |
||||||
|
- Search query: "Show me ALL events that discuss 'knowledge-graphs' in their |
||||||
|
text" |
||||||
|
- Expectation: Multiple content events that reference the term |
||||||
|
|
||||||
|
**Use Case**: Discovering all content that relates to or discusses a concept |
||||||
|
|
||||||
|
## Structural Opacity Comparison |
||||||
|
|
||||||
|
### D-Tags: Transparent Structure |
||||||
|
|
||||||
|
``` |
||||||
|
Event with d-tag "knowledge-graphs" |
||||||
|
└── Title: "Knowledge Graphs" |
||||||
|
└── Content: [Explicit definition and explanation] |
||||||
|
└── Purpose: THIS IS the knowledge-graphs event |
||||||
|
``` |
||||||
|
|
||||||
|
### W-Tags: Opaque Structure |
||||||
|
|
||||||
|
``` |
||||||
|
Event mentioning "knowledge-graphs" |
||||||
|
├── Title: "Semantic Web Technologies" |
||||||
|
├── Content: "...uses [[Knowledge Graphs]] for..." |
||||||
|
└── Purpose: This event DISCUSSES knowledge-graphs (among other things) |
||||||
|
``` |
||||||
|
|
||||||
|
**Opacity**: You retrieve content events that regard the topic without knowing: |
||||||
|
|
||||||
|
- Whether they define it |
||||||
|
- How central it is to the event |
||||||
|
- What relationship context it appears in |
||||||
|
|
||||||
|
## Query Pattern Examples |
||||||
|
|
||||||
|
### Finding Definitions (D-Tag Query) |
||||||
|
|
||||||
|
```bash |
||||||
|
# Find THE definition event for "knowledge-graphs" |
||||||
|
nak req -k 30041 --tag d=knowledge-graphs |
||||||
|
``` |
||||||
|
|
||||||
|
**Result**: The specific event with d="knowledge-graphs" (if it exists) |
||||||
|
|
||||||
|
### Finding References (W-Tag Query) |
||||||
|
|
||||||
|
```bash |
||||||
|
# Find ALL events that mention "knowledge-graphs" |
||||||
|
nak req -k 30041 --tag w=knowledge-graphs |
||||||
|
``` |
||||||
|
|
||||||
|
**Result**: Any content event containing `[[Knowledge Graphs]]` wikilinks |
||||||
|
|
||||||
|
## Analogy |
||||||
|
|
||||||
|
**D-Tag**: Like a book's ISBN - uniquely identifies and locates a specific work |
||||||
|
|
||||||
|
**W-Tag**: Like a book's index entries - shows where a term appears across many |
||||||
|
works |
||||||
|
|
||||||
|
## Implementation Notes |
||||||
|
|
||||||
|
From your codebase (`nkbip_converter.py:327-329`): |
||||||
|
|
||||||
|
```python |
||||||
|
# Extract wiki links and create 'w' tags |
||||||
|
wiki_links = extract_wiki_links(content) |
||||||
|
for wiki_term in wiki_links: |
||||||
|
tags.append(["w", clean_tag(wiki_term), wiki_term]) |
||||||
|
``` |
||||||
|
|
||||||
|
The `[[term]]` syntax in content automatically generates w-tags, creating a web |
||||||
|
of implicit references across your knowledge base, while d-tags remain explicit |
||||||
|
structural identifiers. |
||||||
@ -0,0 +1,72 @@ |
|||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
import WebSocket from "ws"; |
||||||
|
|
||||||
|
const naddr = |
||||||
|
"naddr1qvzqqqr4tqpzphzv6zrv6l89kxpj4h60m5fpz2ycsrfv0c54hjcwdpxqrt8wwlqxqyd8wumn8ghj7argv4nx7un9wd6zumn0wd68yvfwvdhk6qgmwaehxw309a6xsetrd96xzer9dshxummnw3erztnrdakszyrhwden5te0dehhxarj9ekxzmnyqyg8wumn8ghj7mn0wd68ytnhd9hx2qghwaehxw309ahx7um5wgh8xmmkvf5hgtngdaehgqg3waehxw309ahx7um5wgerztnrdakszxthwden5te0wpex7enfd3jhxtnwdaehgu339e3k7mgpz4mhxue69uhkzem8wghxummnw3ezumrpdejqzxrhwden5te0wfjkccte9ehx7umhdpjhyefwvdhk6qg5waehxw309aex2mrp0yhxgctdw4eju6t0qyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpr9mhxue69uhkvun9v4kxz7fwwdhhvcnfwshxsmmnwsqrcctwv9exx6rfwd6xjcedddhx7amvv4jxwefdw35x2ttpwf6z6mmx946xs6twdd5kueedwa5hg6r0w46z6ur9wfkkjumnd9hkuwdu5na"; |
||||||
|
|
||||||
|
console.log("Decoding naddr...\n"); |
||||||
|
const decoded = nip19.decode(naddr); |
||||||
|
console.log("Decoded:", JSON.stringify(decoded, null, 2)); |
||||||
|
|
||||||
|
const { data } = decoded; |
||||||
|
const rootAddress = `${data.kind}:${data.pubkey}:${data.identifier}`; |
||||||
|
console.log("\nRoot Address:", rootAddress); |
||||||
|
|
||||||
|
// Fetch the index event to see what sections it references
|
||||||
|
const relay = "wss://relay.nostr.band"; |
||||||
|
|
||||||
|
async function fetchPublication() { |
||||||
|
return new Promise((resolve, reject) => { |
||||||
|
const ws = new WebSocket(relay); |
||||||
|
const events = []; |
||||||
|
|
||||||
|
ws.on("open", () => { |
||||||
|
console.log(`\nConnected to ${relay}`); |
||||||
|
console.log("Fetching index event...\n"); |
||||||
|
|
||||||
|
const filter = { |
||||||
|
kinds: [data.kind], |
||||||
|
authors: [data.pubkey], |
||||||
|
"#d": [data.identifier], |
||||||
|
}; |
||||||
|
|
||||||
|
const subscriptionId = `sub-${Date.now()}`; |
||||||
|
ws.send(JSON.stringify(["REQ", subscriptionId, filter])); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("message", (message) => { |
||||||
|
const [type, subId, event] = JSON.parse(message.toString()); |
||||||
|
|
||||||
|
if (type === "EVENT") { |
||||||
|
events.push(event); |
||||||
|
console.log("Found index event:", event.id); |
||||||
|
console.log("\nTags:"); |
||||||
|
event.tags.forEach((tag) => { |
||||||
|
if (tag[0] === "a") { |
||||||
|
console.log(` Section address: ${tag[1]}`); |
||||||
|
} |
||||||
|
if (tag[0] === "d") { |
||||||
|
console.log(` D-tag: ${tag[1]}`); |
||||||
|
} |
||||||
|
if (tag[0] === "title") { |
||||||
|
console.log(` Title: ${tag[1]}`); |
||||||
|
} |
||||||
|
}); |
||||||
|
} else if (type === "EOSE") { |
||||||
|
ws.close(); |
||||||
|
resolve(events); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("error", reject); |
||||||
|
|
||||||
|
setTimeout(() => { |
||||||
|
ws.close(); |
||||||
|
resolve(events); |
||||||
|
}, 5000); |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
fetchPublication() |
||||||
|
.then(() => console.log("\nDone!")) |
||||||
|
.catch(console.error); |
||||||
@ -0,0 +1,266 @@ |
|||||||
|
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools"; |
||||||
|
import WebSocket from "ws"; |
||||||
|
|
||||||
|
// Test user keys (generate fresh ones)
|
||||||
|
const testUserKey = generateSecretKey(); |
||||||
|
const testUserPubkey = getPublicKey(testUserKey); |
||||||
|
|
||||||
|
const testUser2Key = generateSecretKey(); |
||||||
|
const testUser2Pubkey = getPublicKey(testUser2Key); |
||||||
|
|
||||||
|
console.log("Test User 1 pubkey:", testUserPubkey); |
||||||
|
console.log("Test User 2 pubkey:", testUser2Pubkey); |
||||||
|
|
||||||
|
// The publication details from the article (REAL VALUES)
|
||||||
|
const publicationPubkey = |
||||||
|
"dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06"; |
||||||
|
const rootAddress = |
||||||
|
`30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`; |
||||||
|
|
||||||
|
// Section addresses (from the actual publication structure)
|
||||||
|
const sections = [ |
||||||
|
`30041:${publicationPubkey}:the-art-of-thinking-without-permission`, |
||||||
|
`30041:${publicationPubkey}:the-natural-promiscuity-of-understanding`, |
||||||
|
`30041:${publicationPubkey}:institutional-capture-and-knowledge-enclosure`, |
||||||
|
`30041:${publicationPubkey}:the-persistent-escape-of-knowledge`, |
||||||
|
]; |
||||||
|
|
||||||
|
// Relays to publish to (matching CommentLayer's relay list)
|
||||||
|
const relays = [ |
||||||
|
"wss://relay.damus.io", |
||||||
|
"wss://relay.nostr.band", |
||||||
|
"wss://nostr.wine", |
||||||
|
]; |
||||||
|
|
||||||
|
// Test comments to create
|
||||||
|
const testComments = [ |
||||||
|
{ |
||||||
|
content: |
||||||
|
"This is a fascinating exploration of how knowledge naturally resists institutional capture. The analogy to flowing water is particularly apt.", |
||||||
|
targetAddress: sections[0], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
isReply: false, |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: |
||||||
|
"I love this concept! It reminds me of how open source projects naturally organize without top-down control.", |
||||||
|
targetAddress: sections[0], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUser2Key, |
||||||
|
authorPubkey: testUser2Pubkey, |
||||||
|
isReply: false, |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: |
||||||
|
"The section on institutional capture really resonates with my experience in academia.", |
||||||
|
targetAddress: sections[1], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
isReply: false, |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: |
||||||
|
"Excellent point about underground networks of understanding. This is exactly how most practical knowledge develops.", |
||||||
|
targetAddress: sections[2], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUser2Key, |
||||||
|
authorPubkey: testUser2Pubkey, |
||||||
|
isReply: false, |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: |
||||||
|
"This is a brilliant piece of work! Really captures the tension between institutional knowledge and living understanding.", |
||||||
|
targetAddress: rootAddress, |
||||||
|
targetKind: 30040, |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
isReply: false, |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
async function publishEvent(event, relayUrl) { |
||||||
|
return new Promise((resolve, reject) => { |
||||||
|
const ws = new WebSocket(relayUrl); |
||||||
|
let published = false; |
||||||
|
|
||||||
|
ws.on("open", () => { |
||||||
|
console.log(`Connected to ${relayUrl}`); |
||||||
|
ws.send(JSON.stringify(["EVENT", event])); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("message", (data) => { |
||||||
|
const message = JSON.parse(data.toString()); |
||||||
|
if (message[0] === "OK" && message[1] === event.id) { |
||||||
|
if (message[2]) { |
||||||
|
console.log( |
||||||
|
`✓ Published event ${event.id.substring(0, 8)} to ${relayUrl}`, |
||||||
|
); |
||||||
|
published = true; |
||||||
|
ws.close(); |
||||||
|
resolve(); |
||||||
|
} else { |
||||||
|
console.error(`✗ Relay rejected event: ${message[3]}`); |
||||||
|
ws.close(); |
||||||
|
reject(new Error(message[3])); |
||||||
|
} |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("error", (error) => { |
||||||
|
console.error(`WebSocket error: ${error.message}`); |
||||||
|
reject(error); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("close", () => { |
||||||
|
if (!published) { |
||||||
|
reject(new Error("Connection closed before OK received")); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// Timeout after 10 seconds
|
||||||
|
setTimeout(() => { |
||||||
|
if (!published) { |
||||||
|
ws.close(); |
||||||
|
reject(new Error("Timeout")); |
||||||
|
} |
||||||
|
}, 10000); |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
async function createAndPublishComments() { |
||||||
|
console.log("\n=== Creating Test Comments ===\n"); |
||||||
|
|
||||||
|
const publishedEvents = []; |
||||||
|
|
||||||
|
for (const comment of testComments) { |
||||||
|
try { |
||||||
|
// Create unsigned event
|
||||||
|
const unsignedEvent = { |
||||||
|
kind: 1111, |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: [ |
||||||
|
// Root scope - uppercase tags
|
||||||
|
["A", comment.targetAddress, relays[0], publicationPubkey], |
||||||
|
["K", comment.targetKind.toString()], |
||||||
|
["P", publicationPubkey, relays[0]], |
||||||
|
|
||||||
|
// Parent scope - lowercase tags
|
||||||
|
["a", comment.targetAddress, relays[0]], |
||||||
|
["k", comment.targetKind.toString()], |
||||||
|
["p", publicationPubkey, relays[0]], |
||||||
|
], |
||||||
|
content: comment.content, |
||||||
|
pubkey: comment.authorPubkey, |
||||||
|
}; |
||||||
|
|
||||||
|
// If this is a reply, add reply tags
|
||||||
|
if (comment.isReply && comment.replyToId) { |
||||||
|
unsignedEvent.tags.push(["e", comment.replyToId, relay, "reply"]); |
||||||
|
unsignedEvent.tags.push(["p", comment.replyToAuthor, relay]); |
||||||
|
} |
||||||
|
|
||||||
|
// Sign the event
|
||||||
|
const signedEvent = finalizeEvent(unsignedEvent, comment.author); |
||||||
|
|
||||||
|
console.log( |
||||||
|
`\nCreating comment on ${ |
||||||
|
comment.targetKind === 30040 ? "collection" : "section" |
||||||
|
}:`,
|
||||||
|
); |
||||||
|
console.log(` Content: "${comment.content.substring(0, 60)}..."`); |
||||||
|
console.log(` Target: ${comment.targetAddress}`); |
||||||
|
console.log(` Event ID: ${signedEvent.id}`); |
||||||
|
|
||||||
|
// Publish to relay
|
||||||
|
await publishEvent(signedEvent, relays[0]); |
||||||
|
publishedEvents.push(signedEvent); |
||||||
|
|
||||||
|
// Store event ID for potential replies
|
||||||
|
comment.eventId = signedEvent.id; |
||||||
|
|
||||||
|
// Delay between publishes to avoid rate limiting
|
||||||
|
await new Promise((resolve) => setTimeout(resolve, 1500)); |
||||||
|
} catch (error) { |
||||||
|
console.error(`Failed to publish comment: ${error.message}`); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Now create some threaded replies
|
||||||
|
console.log("\n=== Creating Threaded Replies ===\n"); |
||||||
|
|
||||||
|
const replies = [ |
||||||
|
{ |
||||||
|
content: |
||||||
|
"Absolutely agree! The metaphor extends even further when you consider how ideas naturally branch and merge.", |
||||||
|
targetAddress: sections[0], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUser2Key, |
||||||
|
authorPubkey: testUser2Pubkey, |
||||||
|
isReply: true, |
||||||
|
replyToId: testComments[0].eventId, |
||||||
|
replyToAuthor: testComments[0].authorPubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: |
||||||
|
"Great connection! The parallel between open source governance and knowledge commons is really illuminating.", |
||||||
|
targetAddress: sections[0], |
||||||
|
targetKind: 30041, |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
isReply: true, |
||||||
|
replyToId: testComments[1].eventId, |
||||||
|
replyToAuthor: testComments[1].authorPubkey, |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
for (const reply of replies) { |
||||||
|
try { |
||||||
|
const unsignedEvent = { |
||||||
|
kind: 1111, |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: [ |
||||||
|
// Root scope
|
||||||
|
["A", reply.targetAddress, relays[0], publicationPubkey], |
||||||
|
["K", reply.targetKind.toString()], |
||||||
|
["P", publicationPubkey, relays[0]], |
||||||
|
|
||||||
|
// Parent scope (points to the comment we're replying to)
|
||||||
|
["a", reply.targetAddress, relays[0]], |
||||||
|
["k", reply.targetKind.toString()], |
||||||
|
["p", reply.replyToAuthor, relays[0]], |
||||||
|
|
||||||
|
// Reply markers
|
||||||
|
["e", reply.replyToId, relays[0], "reply"], |
||||||
|
], |
||||||
|
content: reply.content, |
||||||
|
pubkey: reply.authorPubkey, |
||||||
|
}; |
||||||
|
|
||||||
|
const signedEvent = finalizeEvent(unsignedEvent, reply.author); |
||||||
|
|
||||||
|
console.log(`\nCreating reply:`); |
||||||
|
console.log(` Content: "${reply.content.substring(0, 60)}..."`); |
||||||
|
console.log(` Reply to: ${reply.replyToId.substring(0, 8)}`); |
||||||
|
console.log(` Event ID: ${signedEvent.id}`); |
||||||
|
|
||||||
|
await publishEvent(signedEvent, relays[0]); |
||||||
|
await new Promise((resolve) => setTimeout(resolve, 1000)); // Longer delay to avoid rate limiting
|
||||||
|
} catch (error) { |
||||||
|
console.error(`Failed to publish reply: ${error.message}`); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log("\n=== Done! ==="); |
||||||
|
console.log( |
||||||
|
`\nPublished ${ |
||||||
|
publishedEvents.length + replies.length |
||||||
|
} total comments/replies`,
|
||||||
|
); |
||||||
|
console.log("\nRefresh the page to see the comments in the Comment Panel."); |
||||||
|
} |
||||||
|
|
||||||
|
// Run it
|
||||||
|
createAndPublishComments().catch(console.error); |
||||||
@ -0,0 +1,206 @@ |
|||||||
|
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools"; |
||||||
|
import WebSocket from "ws"; |
||||||
|
|
||||||
|
// Test user keys (generate fresh ones)
|
||||||
|
const testUserKey = generateSecretKey(); |
||||||
|
const testUserPubkey = getPublicKey(testUserKey); |
||||||
|
|
||||||
|
const testUser2Key = generateSecretKey(); |
||||||
|
const testUser2Pubkey = getPublicKey(testUser2Key); |
||||||
|
|
||||||
|
console.log("Test User 1 pubkey:", testUserPubkey); |
||||||
|
console.log("Test User 2 pubkey:", testUser2Pubkey); |
||||||
|
|
||||||
|
// The publication details from the article (REAL VALUES)
|
||||||
|
const publicationPubkey = |
||||||
|
"dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06"; |
||||||
|
const rootAddress = |
||||||
|
`30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`; |
||||||
|
|
||||||
|
// Section addresses (from the actual publication structure)
|
||||||
|
const sections = [ |
||||||
|
`30041:${publicationPubkey}:the-art-of-thinking-without-permission`, |
||||||
|
`30041:${publicationPubkey}:the-natural-promiscuity-of-understanding`, |
||||||
|
`30041:${publicationPubkey}:institutional-capture-and-knowledge-enclosure`, |
||||||
|
`30041:${publicationPubkey}:the-persistent-escape-of-knowledge`, |
||||||
|
]; |
||||||
|
|
||||||
|
// Relays to publish to (matching HighlightLayer's relay list)
|
||||||
|
const relays = [ |
||||||
|
"wss://relay.damus.io", |
||||||
|
"wss://relay.nostr.band", |
||||||
|
"wss://nostr.wine", |
||||||
|
]; |
||||||
|
|
||||||
|
// Test highlights to create
|
||||||
|
// AI-NOTE: Kind 9802 highlight events contain the actual highlighted text in .content
|
||||||
|
// and optionally a user comment/annotation in the ["comment", ...] tag
|
||||||
|
const testHighlights = [ |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice.", |
||||||
|
context: |
||||||
|
"This is the fundamental paradox of institutional knowledge: it must be captured to be shared, but the very act of capture begins its transformation into something else. Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.", |
||||||
|
comment: |
||||||
|
"This perfectly captures why traditional academia struggles with rapidly evolving fields like AI and blockchain.", |
||||||
|
targetAddress: sections[0], |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.", |
||||||
|
context: |
||||||
|
"Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.", |
||||||
|
comment: null, // Highlight without annotation
|
||||||
|
targetAddress: sections[0], |
||||||
|
author: testUser2Key, |
||||||
|
authorPubkey: testUser2Pubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas.", |
||||||
|
context: |
||||||
|
"The natural state of knowledge is not purity but promiscuity. Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas. It crosses boundaries not despite them but because of them. The most vibrant intellectual communities have always been those at crossroads and borderlands.", |
||||||
|
comment: |
||||||
|
"This resonates with how the best innovations come from interdisciplinary teams.", |
||||||
|
targetAddress: sections[1], |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"The most vibrant intellectual communities have always been those at crossroads and borderlands.", |
||||||
|
context: |
||||||
|
"Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas. It crosses boundaries not despite them but because of them. The most vibrant intellectual communities have always been those at crossroads and borderlands.", |
||||||
|
comment: |
||||||
|
"Historical examples: Renaissance Florence, Vienna Circle, Bell Labs", |
||||||
|
targetAddress: sections[1], |
||||||
|
author: testUser2Key, |
||||||
|
authorPubkey: testUser2Pubkey, |
||||||
|
}, |
||||||
|
{ |
||||||
|
highlightedText: |
||||||
|
"institutions that try to monopolize understanding inevitably find themselves gatekeeping corpses", |
||||||
|
context: |
||||||
|
"But institutions that try to monopolize understanding inevitably find themselves gatekeeping corpses—the living knowledge has already escaped and is flourishing in unexpected places. By the time the gatekeepers notice, the game has moved.", |
||||||
|
comment: null, |
||||||
|
targetAddress: sections[2], |
||||||
|
author: testUserKey, |
||||||
|
authorPubkey: testUserPubkey, |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
async function publishEvent(event, relayUrl) { |
||||||
|
return new Promise((resolve, reject) => { |
||||||
|
const ws = new WebSocket(relayUrl); |
||||||
|
let published = false; |
||||||
|
|
||||||
|
ws.on("open", () => { |
||||||
|
console.log(`Connected to ${relayUrl}`); |
||||||
|
ws.send(JSON.stringify(["EVENT", event])); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("message", (data) => { |
||||||
|
const message = JSON.parse(data.toString()); |
||||||
|
if (message[0] === "OK" && message[1] === event.id) { |
||||||
|
if (message[2]) { |
||||||
|
console.log( |
||||||
|
`✓ Published event ${event.id.substring(0, 8)} to ${relayUrl}`, |
||||||
|
); |
||||||
|
published = true; |
||||||
|
ws.close(); |
||||||
|
resolve(); |
||||||
|
} else { |
||||||
|
console.error(`✗ Relay rejected event: ${message[3]}`); |
||||||
|
ws.close(); |
||||||
|
reject(new Error(message[3])); |
||||||
|
} |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("error", (error) => { |
||||||
|
console.error(`WebSocket error: ${error.message}`); |
||||||
|
reject(error); |
||||||
|
}); |
||||||
|
|
||||||
|
ws.on("close", () => { |
||||||
|
if (!published) { |
||||||
|
reject(new Error("Connection closed before OK received")); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// Timeout after 10 seconds
|
||||||
|
setTimeout(() => { |
||||||
|
if (!published) { |
||||||
|
ws.close(); |
||||||
|
reject(new Error("Timeout")); |
||||||
|
} |
||||||
|
}, 10000); |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
async function createAndPublishHighlights() { |
||||||
|
console.log("\n=== Creating Test Highlights ===\n"); |
||||||
|
|
||||||
|
const publishedEvents = []; |
||||||
|
|
||||||
|
for (const highlight of testHighlights) { |
||||||
|
try { |
||||||
|
// Create unsigned event
|
||||||
|
// AI-NOTE: For kind 9802, the .content field contains the HIGHLIGHTED TEXT,
|
||||||
|
// not a comment. User annotations go in the optional ["comment", ...] tag.
|
||||||
|
const unsignedEvent = { |
||||||
|
kind: 9802, |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: [ |
||||||
|
// Target section
|
||||||
|
["a", highlight.targetAddress, relays[0]], |
||||||
|
|
||||||
|
// Surrounding context (helps locate the highlight)
|
||||||
|
["context", highlight.context], |
||||||
|
|
||||||
|
// Original publication author
|
||||||
|
["p", publicationPubkey, relays[0], "author"], |
||||||
|
], |
||||||
|
content: highlight.highlightedText, // The actual highlighted text
|
||||||
|
pubkey: highlight.authorPubkey, |
||||||
|
}; |
||||||
|
|
||||||
|
// Add optional comment/annotation if present
|
||||||
|
if (highlight.comment) { |
||||||
|
unsignedEvent.tags.push(["comment", highlight.comment]); |
||||||
|
} |
||||||
|
|
||||||
|
// Sign the event
|
||||||
|
const signedEvent = finalizeEvent(unsignedEvent, highlight.author); |
||||||
|
|
||||||
|
console.log(`\nCreating highlight on section:`); |
||||||
|
console.log( |
||||||
|
` Highlighted: "${highlight.highlightedText.substring(0, 60)}..."`, |
||||||
|
); |
||||||
|
if (highlight.comment) { |
||||||
|
console.log(` Comment: "${highlight.comment.substring(0, 60)}..."`); |
||||||
|
} |
||||||
|
console.log(` Target: ${highlight.targetAddress}`); |
||||||
|
console.log(` Event ID: ${signedEvent.id}`); |
||||||
|
|
||||||
|
// Publish to relay
|
||||||
|
await publishEvent(signedEvent, relays[0]); |
||||||
|
publishedEvents.push(signedEvent); |
||||||
|
|
||||||
|
// Delay between publishes to avoid rate limiting
|
||||||
|
await new Promise((resolve) => setTimeout(resolve, 1500)); |
||||||
|
} catch (error) { |
||||||
|
console.error(`Failed to publish highlight: ${error.message}`); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log("\n=== Done! ==="); |
||||||
|
console.log(`\nPublished ${publishedEvents.length} total highlights`); |
||||||
|
console.log("\nRefresh the page to see the highlights."); |
||||||
|
console.log('Toggle "Show Highlights" to view them inline.'); |
||||||
|
} |
||||||
|
|
||||||
|
// Run it
|
||||||
|
createAndPublishHighlights().catch(console.error); |
||||||
@ -0,0 +1,193 @@ |
|||||||
|
# NKBIP-01 Hierarchical Parsing Technical Plan |
||||||
|
|
||||||
|
## Overview |
||||||
|
|
||||||
|
This document outlines the complete restart plan for implementing NKBIP-01 |
||||||
|
compliant hierarchical AsciiDoc parsing using proper Asciidoctor tree processor |
||||||
|
extensions. |
||||||
|
|
||||||
|
## Current State Analysis |
||||||
|
|
||||||
|
### Problems Identified |
||||||
|
|
||||||
|
1. **Dual Architecture Conflict**: Two competing parsing implementations exist: |
||||||
|
- `publication_tree_factory.ts` - AST-first approach (currently used) |
||||||
|
- `publication_tree_extension.ts` - Extension approach (incomplete) |
||||||
|
|
||||||
|
2. **Missing Proper Extension Registration**: Current code doesn't follow the |
||||||
|
official Asciidoctor extension pattern you provided |
||||||
|
|
||||||
|
3. **Incomplete NKBIP-01 Compliance**: Testing with `deep_hierarchy_test.adoc` |
||||||
|
may not produce the exact structures shown in `docreference.md` |
||||||
|
|
||||||
|
## NKBIP-01 Specification Summary |
||||||
|
|
||||||
|
From `test_data/AsciidocFiles/docreference.md`: |
||||||
|
|
||||||
|
### Event Types |
||||||
|
|
||||||
|
- **30040**: Index events (collections/hierarchical containers) |
||||||
|
- **30041**: Content events (actual article sections) |
||||||
|
|
||||||
|
### Parse Level Behaviors |
||||||
|
|
||||||
|
- **Level 2**: Only `==` sections → 30041 events (subsections included in |
||||||
|
content) |
||||||
|
- **Level 3**: `==` → 30040 indices, `===` → 30041 content events |
||||||
|
- **Level 4+**: Full hierarchy with each level becoming separate events |
||||||
|
|
||||||
|
### Key Rules |
||||||
|
|
||||||
|
1. If a section has subsections at target level → becomes 30040 index |
||||||
|
2. If no subsections at target level → becomes 30041 content event |
||||||
|
3. Content inclusion: 30041 events include all content below parse level |
||||||
|
4. Hierarchical references: Parent indices use `a` tags to reference children |
||||||
|
|
||||||
|
## Proposed Architecture |
||||||
|
|
||||||
|
### Core Pattern: Asciidoctor Tree Processor Extension |
||||||
|
|
||||||
|
Following the pattern you provided: |
||||||
|
|
||||||
|
```javascript |
||||||
|
// Extension registration pattern |
||||||
|
module.exports = function (registry) { |
||||||
|
registry.treeProcessor(function () { |
||||||
|
var self = this; |
||||||
|
self.process(function (doc) { |
||||||
|
// Process document and build PublicationTree |
||||||
|
return doc; |
||||||
|
}); |
||||||
|
}); |
||||||
|
}; |
||||||
|
``` |
||||||
|
|
||||||
|
### Implementation Components |
||||||
|
|
||||||
|
1. **PublicationTreeProcessor** (`src/lib/utils/publication_tree_processor.ts`) |
||||||
|
- Implements the tree processor extension |
||||||
|
- Registers with Asciidoctor during document processing |
||||||
|
- Builds PublicationTree with NDK events during AST traversal |
||||||
|
- Returns result via closure to avoid Ruby compatibility issues |
||||||
|
|
||||||
|
2. **Unified Parser Interface** (`src/lib/utils/asciidoc_publication_parser.ts`) |
||||||
|
- Single entry point for all parsing operations |
||||||
|
- Manages extension registration and cleanup |
||||||
|
- Provides clean API for ZettelEditor integration |
||||||
|
|
||||||
|
3. **Enhanced ZettelEditor Integration** |
||||||
|
- Replace `publication_tree_factory.ts` usage |
||||||
|
- Use proper extension-based parsing |
||||||
|
- Maintain current preview and publishing workflow |
||||||
|
|
||||||
|
## Technical Implementation Plan |
||||||
|
|
||||||
|
### Phase 1: Core Tree Processor (`publication_tree_processor.ts`) |
||||||
|
|
||||||
|
```typescript |
||||||
|
export function registerPublicationTreeProcessor( |
||||||
|
registry: Registry, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number, |
||||||
|
options?: ProcessorOptions, |
||||||
|
): { getResult: () => ProcessorResult | null }; |
||||||
|
``` |
||||||
|
|
||||||
|
**Key Features:** |
||||||
|
|
||||||
|
- Follows Asciidoctor extension pattern exactly |
||||||
|
- Builds events during AST traversal (not after) |
||||||
|
- Preserves original AsciiDoc content in events |
||||||
|
- Handles all parse levels (2-7) with proper NKBIP-01 compliance |
||||||
|
- Uses closure pattern to return results safely |
||||||
|
|
||||||
|
### Phase 2: Unified Parser Interface (`asciidoc_publication_parser.ts`) |
||||||
|
|
||||||
|
```typescript |
||||||
|
export async function parseAsciiDocWithTree( |
||||||
|
content: string, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number = 2, |
||||||
|
): Promise<PublicationTreeResult>; |
||||||
|
``` |
||||||
|
|
||||||
|
**Responsibilities:** |
||||||
|
|
||||||
|
- Create Asciidoctor instance |
||||||
|
- Register tree processor extension |
||||||
|
- Execute parsing with extension |
||||||
|
- Return PublicationTree and events |
||||||
|
- Clean up resources |
||||||
|
|
||||||
|
### Phase 3: ZettelEditor Integration |
||||||
|
|
||||||
|
**Changes to `ZettelEditor.svelte`:** |
||||||
|
|
||||||
|
- Replace `createPublicationTreeFromContent()` calls |
||||||
|
- Use new `parseAsciiDocWithTree()` function |
||||||
|
- Maintain existing preview/publishing interface |
||||||
|
- No changes to component props or UI |
||||||
|
|
||||||
|
### Phase 4: Validation Testing |
||||||
|
|
||||||
|
**Test Suite:** |
||||||
|
|
||||||
|
1. Parse `deep_hierarchy_test.adoc` at levels 2-7 |
||||||
|
2. Verify event structures match `docreference.md` examples |
||||||
|
3. Validate content preservation and tag inheritance |
||||||
|
4. Test publish workflow end-to-end |
||||||
|
|
||||||
|
## File Organization |
||||||
|
|
||||||
|
### Files to Create |
||||||
|
|
||||||
|
1. `src/lib/utils/publication_tree_processor.ts` - Core tree processor extension |
||||||
|
2. `src/lib/utils/asciidoc_publication_parser.ts` - Unified parser interface |
||||||
|
3. `tests/unit/publication_tree_processor.test.ts` - Comprehensive test suite |
||||||
|
|
||||||
|
### Files to Modify |
||||||
|
|
||||||
|
1. `src/lib/components/ZettelEditor.svelte` - Update parsing calls |
||||||
|
2. `src/routes/new/compose/+page.svelte` - Verify integration works |
||||||
|
|
||||||
|
### Files to Remove (After Validation) |
||||||
|
|
||||||
|
1. `src/lib/utils/publication_tree_factory.ts` - Replace with processor |
||||||
|
2. `src/lib/utils/publication_tree_extension.ts` - Merge concepts into processor |
||||||
|
|
||||||
|
## Success Criteria |
||||||
|
|
||||||
|
1. **NKBIP-01 Compliance**: All parse levels produce structures exactly matching |
||||||
|
`docreference.md` |
||||||
|
2. **Content Preservation**: Original AsciiDoc content preserved in events (not |
||||||
|
converted to HTML) |
||||||
|
3. **Proper Extension Pattern**: Uses official Asciidoctor tree processor |
||||||
|
registration |
||||||
|
4. **Zero Regression**: Current ZettelEditor functionality unchanged |
||||||
|
5. **Performance**: No degradation in parsing or preview speed |
||||||
|
6. **Test Coverage**: Comprehensive validation with `deep_hierarchy_test.adoc` |
||||||
|
|
||||||
|
## Development Sequence |
||||||
|
|
||||||
|
1. **Study & Plan** ✓ (Current phase) |
||||||
|
2. **Implement Core Processor** - Create `publication_tree_processor.ts` |
||||||
|
3. **Build Unified Interface** - Create `asciidoc_publication_parser.ts` |
||||||
|
4. **Integrate with ZettelEditor** - Update parsing calls |
||||||
|
5. **Validate with Test Documents** - Verify NKBIP-01 compliance |
||||||
|
6. **Clean Up Legacy Code** - Remove old implementations |
||||||
|
7. **Documentation & Testing** - Comprehensive test suite |
||||||
|
|
||||||
|
## Risk Mitigation |
||||||
|
|
||||||
|
- **Incremental Integration**: Keep old code until new implementation validated |
||||||
|
- **Extensive Testing**: Use both test documents for validation |
||||||
|
- **Performance Monitoring**: Ensure no degradation in user experience |
||||||
|
- **Rollback Plan**: Can revert to `publication_tree_factory.ts` if needed |
||||||
|
|
||||||
|
## References |
||||||
|
|
||||||
|
- NKBIP-01 Specification: `test_data/AsciidocFiles/docreference.md` |
||||||
|
- Test Document: `test_data/AsciidocFiles/deep_hierarchy_test.adoc` |
||||||
|
- Asciidoctor Extensions: |
||||||
|
[Official Documentation](https://docs.asciidoctor.org/asciidoctor.js/latest/extend/extensions/) |
||||||
|
- Current Implementation: `src/lib/components/ZettelEditor.svelte:64` |
||||||
@ -0,0 +1,77 @@ |
|||||||
|
# NIP-09 |
||||||
|
|
||||||
|
## Event Deletion Request |
||||||
|
|
||||||
|
`draft` `optional` |
||||||
|
|
||||||
|
A special event with kind `5`, meaning "deletion request" is defined as having a |
||||||
|
list of one or more `e` or `a` tags, each referencing an event the author is |
||||||
|
requesting to be deleted. Deletion requests SHOULD include a `k` tag for the |
||||||
|
kind of each event being requested for deletion. |
||||||
|
|
||||||
|
The event's `content` field MAY contain a text note describing the reason for |
||||||
|
the deletion request. |
||||||
|
|
||||||
|
For example: |
||||||
|
|
||||||
|
```jsonc |
||||||
|
{ |
||||||
|
"kind": 5, |
||||||
|
"pubkey": <32-bytes hex-encoded public key of the event creator>, |
||||||
|
"tags": [ |
||||||
|
["e", "dcd59..464a2"], |
||||||
|
["e", "968c5..ad7a4"], |
||||||
|
["a", "<kind>:<pubkey>:<d-identifier>"], |
||||||
|
["k", "1"], |
||||||
|
["k", "30023"] |
||||||
|
], |
||||||
|
"content": "these posts were published by accident", |
||||||
|
// other fields... |
||||||
|
} |
||||||
|
``` |
||||||
|
|
||||||
|
Relays SHOULD delete or stop publishing any referenced events that have an |
||||||
|
identical `pubkey` as the deletion request. Clients SHOULD hide or otherwise |
||||||
|
indicate a deletion request status for referenced events. |
||||||
|
|
||||||
|
Relays SHOULD continue to publish/share the deletion request events |
||||||
|
indefinitely, as clients may already have the event that's intended to be |
||||||
|
deleted. Additionally, clients SHOULD broadcast deletion request events to other |
||||||
|
relays which don't have it. |
||||||
|
|
||||||
|
When an `a` tag is used, relays SHOULD delete all versions of the replaceable |
||||||
|
event up to the `created_at` timestamp of the deletion request event. |
||||||
|
|
||||||
|
## Client Usage |
||||||
|
|
||||||
|
Clients MAY choose to fully hide any events that are referenced by valid |
||||||
|
deletion request events. This includes text notes, direct messages, or other |
||||||
|
yet-to-be defined event kinds. Alternatively, they MAY show the event along with |
||||||
|
an icon or other indication that the author has "disowned" the event. The |
||||||
|
`content` field MAY also be used to replace the deleted events' own content, |
||||||
|
although a user interface should clearly indicate that this is a deletion |
||||||
|
request reason, not the original content. |
||||||
|
|
||||||
|
A client MUST validate that each event `pubkey` referenced in the `e` tag of the |
||||||
|
deletion request is identical to the deletion request `pubkey`, before hiding or |
||||||
|
deleting any event. Relays can not, in general, perform this validation and |
||||||
|
should not be treated as authoritative. |
||||||
|
|
||||||
|
Clients display the deletion request event itself in any way they choose, e.g., |
||||||
|
not at all, or with a prominent notice. |
||||||
|
|
||||||
|
Clients MAY choose to inform the user that their request for deletion does not |
||||||
|
guarantee deletion because it is impossible to delete events from all relays and |
||||||
|
clients. |
||||||
|
|
||||||
|
## Relay Usage |
||||||
|
|
||||||
|
Relays MAY validate that a deletion request event only references events that |
||||||
|
have the same `pubkey` as the deletion request itself, however this is not |
||||||
|
required since relays may not have knowledge of all referenced events. |
||||||
|
|
||||||
|
## Deletion Request of a Deletion Request |
||||||
|
|
||||||
|
Publishing a deletion request event against a deletion request has no effect. |
||||||
|
Clients and relays are not obliged to support "unrequest deletion" |
||||||
|
functionality. |
||||||
@ -0,0 +1,520 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { Button, Textarea, P } from "flowbite-svelte"; |
||||||
|
import { getContext } from "svelte"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { userStore } from "$lib/stores/userStore"; |
||||||
|
import { activeOutboxRelays, activeInboxRelays } from "$lib/ndk"; |
||||||
|
import { communityRelays } from "$lib/consts"; |
||||||
|
import { WebSocketPool } from "$lib/data_structures/websocket_pool"; |
||||||
|
import { ChevronDownOutline, ChevronUpOutline } from "flowbite-svelte-icons"; |
||||||
|
|
||||||
|
let { |
||||||
|
address, |
||||||
|
onCommentPosted, |
||||||
|
inline = false, |
||||||
|
}: { |
||||||
|
address: string; |
||||||
|
onCommentPosted?: () => void; |
||||||
|
inline?: boolean; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk: NDK = getContext("ndk"); |
||||||
|
|
||||||
|
// State management |
||||||
|
let showCommentUI = $state(false); |
||||||
|
let commentContent = $state(""); |
||||||
|
let isSubmitting = $state(false); |
||||||
|
let error = $state<string | null>(null); |
||||||
|
let success = $state(false); |
||||||
|
let showJsonPreview = $state(false); |
||||||
|
|
||||||
|
// Build preview JSON for the comment event |
||||||
|
let previewJson = $derived.by(() => { |
||||||
|
if (!commentContent.trim()) return null; |
||||||
|
|
||||||
|
const eventDetails = parseAddress(address); |
||||||
|
if (!eventDetails) return null; |
||||||
|
|
||||||
|
const { kind, pubkey: authorPubkey, dTag } = eventDetails; |
||||||
|
const relayHint = $activeOutboxRelays[0] || ""; |
||||||
|
|
||||||
|
return { |
||||||
|
kind: 1111, |
||||||
|
pubkey: $userStore.pubkey || "<your-pubkey>", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: [ |
||||||
|
["A", address, relayHint, authorPubkey], |
||||||
|
["K", kind.toString()], |
||||||
|
["P", authorPubkey, relayHint], |
||||||
|
["a", address, relayHint], |
||||||
|
["k", kind.toString()], |
||||||
|
["p", authorPubkey, relayHint], |
||||||
|
], |
||||||
|
content: commentContent, |
||||||
|
id: "<calculated-on-signing>", |
||||||
|
sig: "<calculated-on-signing>" |
||||||
|
}; |
||||||
|
}); |
||||||
|
|
||||||
|
// Parse address to get event details |
||||||
|
function parseAddress(address: string): { kind: number; pubkey: string; dTag: string } | null { |
||||||
|
const parts = address.split(":"); |
||||||
|
if (parts.length !== 3) { |
||||||
|
console.error("[CommentButton] Invalid address format:", address); |
||||||
|
return null; |
||||||
|
} |
||||||
|
|
||||||
|
const [kindStr, pubkey, dTag] = parts; |
||||||
|
const kind = parseInt(kindStr); |
||||||
|
|
||||||
|
if (isNaN(kind)) { |
||||||
|
console.error("[CommentButton] Invalid kind in address:", kindStr); |
||||||
|
return null; |
||||||
|
} |
||||||
|
|
||||||
|
return { kind, pubkey, dTag }; |
||||||
|
} |
||||||
|
|
||||||
|
// Create NIP-22 comment event |
||||||
|
async function createCommentEvent(content: string): Promise<NDKEvent | null> { |
||||||
|
const eventDetails = parseAddress(address); |
||||||
|
if (!eventDetails) { |
||||||
|
error = "Invalid event address"; |
||||||
|
return null; |
||||||
|
} |
||||||
|
|
||||||
|
const { kind, pubkey: authorPubkey, dTag } = eventDetails; |
||||||
|
|
||||||
|
// Get relay hint (use first available outbox relay) |
||||||
|
const relayHint = $activeOutboxRelays[0] || ""; |
||||||
|
|
||||||
|
// Get the actual event to include its ID in tags |
||||||
|
let eventId = ""; |
||||||
|
try { |
||||||
|
const targetEvent = await ndk.fetchEvent({ |
||||||
|
kinds: [kind], |
||||||
|
authors: [authorPubkey], |
||||||
|
"#d": [dTag], |
||||||
|
}); |
||||||
|
|
||||||
|
if (targetEvent) { |
||||||
|
eventId = targetEvent.id; |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
console.warn("[CommentButton] Could not fetch target event ID:", err); |
||||||
|
} |
||||||
|
|
||||||
|
// Create the comment event following NIP-22 structure |
||||||
|
const commentEvent = new NDKEvent(ndk); |
||||||
|
commentEvent.kind = 1111; |
||||||
|
commentEvent.content = content; |
||||||
|
commentEvent.pubkey = $userStore.pubkey || ""; // Set pubkey from user store |
||||||
|
|
||||||
|
// NIP-22 tags structure for top-level comments |
||||||
|
commentEvent.tags = [ |
||||||
|
// Root scope - uppercase tags |
||||||
|
["A", address, relayHint, authorPubkey], |
||||||
|
["K", kind.toString()], |
||||||
|
["P", authorPubkey, relayHint], |
||||||
|
|
||||||
|
// Parent scope (same as root for top-level) - lowercase tags |
||||||
|
["a", address, relayHint], |
||||||
|
["k", kind.toString()], |
||||||
|
["p", authorPubkey, relayHint], |
||||||
|
]; |
||||||
|
|
||||||
|
// Include e tag if we have the event ID |
||||||
|
if (eventId) { |
||||||
|
commentEvent.tags.push(["e", eventId, relayHint]); |
||||||
|
} |
||||||
|
|
||||||
|
console.log("[CommentButton] Created NIP-22 comment event:", { |
||||||
|
kind: commentEvent.kind, |
||||||
|
tags: commentEvent.tags, |
||||||
|
content: commentEvent.content, |
||||||
|
}); |
||||||
|
|
||||||
|
return commentEvent; |
||||||
|
} |
||||||
|
|
||||||
|
// Submit comment |
||||||
|
async function submitComment() { |
||||||
|
if (!commentContent.trim()) { |
||||||
|
error = "Comment cannot be empty"; |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
if (!$userStore.signedIn || !$userStore.signer) { |
||||||
|
error = "You must be signed in to comment"; |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
isSubmitting = true; |
||||||
|
error = null; |
||||||
|
success = false; |
||||||
|
|
||||||
|
try { |
||||||
|
const commentEvent = await createCommentEvent(commentContent); |
||||||
|
if (!commentEvent) { |
||||||
|
throw new Error("Failed to create comment event"); |
||||||
|
} |
||||||
|
|
||||||
|
// Sign the event - create plain object to avoid proxy issues |
||||||
|
const plainEvent = { |
||||||
|
kind: Number(commentEvent.kind), |
||||||
|
pubkey: String(commentEvent.pubkey), |
||||||
|
created_at: Number(commentEvent.created_at ?? Math.floor(Date.now() / 1000)), |
||||||
|
tags: commentEvent.tags.map((tag) => tag.map(String)), |
||||||
|
content: String(commentEvent.content), |
||||||
|
}; |
||||||
|
|
||||||
|
if (typeof window !== "undefined" && window.nostr && window.nostr.signEvent) { |
||||||
|
const signed = await window.nostr.signEvent(plainEvent); |
||||||
|
commentEvent.sig = signed.sig; |
||||||
|
if ("id" in signed) { |
||||||
|
commentEvent.id = signed.id as string; |
||||||
|
} |
||||||
|
} else { |
||||||
|
await commentEvent.sign($userStore.signer); |
||||||
|
} |
||||||
|
|
||||||
|
console.log("[CommentButton] Signed comment event:", commentEvent.rawEvent()); |
||||||
|
|
||||||
|
// Build relay list following the same pattern as eventServices |
||||||
|
const relays = [ |
||||||
|
...communityRelays, |
||||||
|
...$activeOutboxRelays, |
||||||
|
...$activeInboxRelays, |
||||||
|
]; |
||||||
|
|
||||||
|
// Remove duplicates |
||||||
|
const uniqueRelays = Array.from(new Set(relays)); |
||||||
|
|
||||||
|
console.log("[CommentButton] Publishing to relays:", uniqueRelays); |
||||||
|
|
||||||
|
const signedEvent = { |
||||||
|
...plainEvent, |
||||||
|
id: commentEvent.id, |
||||||
|
sig: commentEvent.sig, |
||||||
|
}; |
||||||
|
|
||||||
|
// Publish to relays using WebSocketPool |
||||||
|
let publishedCount = 0; |
||||||
|
for (const relayUrl of uniqueRelays) { |
||||||
|
try { |
||||||
|
const ws = await WebSocketPool.instance.acquire(relayUrl); |
||||||
|
|
||||||
|
await new Promise<void>((resolve, reject) => { |
||||||
|
const timeout = setTimeout(() => { |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
reject(new Error("Timeout")); |
||||||
|
}, 5000); |
||||||
|
|
||||||
|
ws.onmessage = (e) => { |
||||||
|
const [type, id, ok, message] = JSON.parse(e.data); |
||||||
|
if (type === "OK" && id === signedEvent.id) { |
||||||
|
clearTimeout(timeout); |
||||||
|
if (ok) { |
||||||
|
publishedCount++; |
||||||
|
console.log(`[CommentButton] Published to ${relayUrl}`); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
resolve(); |
||||||
|
} else { |
||||||
|
console.warn(`[CommentButton] ${relayUrl} rejected: ${message}`); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
reject(new Error(message)); |
||||||
|
} |
||||||
|
} |
||||||
|
}; |
||||||
|
|
||||||
|
// Send the event to the relay |
||||||
|
ws.send(JSON.stringify(["EVENT", signedEvent])); |
||||||
|
}); |
||||||
|
} catch (e) { |
||||||
|
console.error(`[CommentButton] Failed to publish to ${relayUrl}:`, e); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
if (publishedCount === 0) { |
||||||
|
throw new Error("Failed to publish to any relays"); |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[CommentButton] Published to ${publishedCount} relay(s)`); |
||||||
|
|
||||||
|
// Success! |
||||||
|
success = true; |
||||||
|
commentContent = ""; |
||||||
|
showJsonPreview = false; |
||||||
|
|
||||||
|
// Close UI after a delay |
||||||
|
setTimeout(() => { |
||||||
|
showCommentUI = false; |
||||||
|
success = false; |
||||||
|
|
||||||
|
// Trigger refresh of CommentViewer if callback provided |
||||||
|
if (onCommentPosted) { |
||||||
|
onCommentPosted(); |
||||||
|
} |
||||||
|
}, 2000); |
||||||
|
|
||||||
|
} catch (err) { |
||||||
|
console.error("[CommentButton] Error submitting comment:", err); |
||||||
|
error = err instanceof Error ? err.message : "Failed to post comment"; |
||||||
|
} finally { |
||||||
|
isSubmitting = false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Cancel comment |
||||||
|
function cancelComment() { |
||||||
|
showCommentUI = false; |
||||||
|
commentContent = ""; |
||||||
|
error = null; |
||||||
|
success = false; |
||||||
|
showJsonPreview = false; |
||||||
|
} |
||||||
|
|
||||||
|
// Toggle comment UI |
||||||
|
function toggleCommentUI() { |
||||||
|
if (!$userStore.signedIn) { |
||||||
|
error = "You must be signed in to comment"; |
||||||
|
setTimeout(() => { |
||||||
|
error = null; |
||||||
|
}, 3000); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
showCommentUI = !showCommentUI; |
||||||
|
error = null; |
||||||
|
success = false; |
||||||
|
showJsonPreview = false; |
||||||
|
} |
||||||
|
</script> |
||||||
|
|
||||||
|
<!-- Hamburger Comment Button --> |
||||||
|
<div class="comment-button-container" class:inline={inline}> |
||||||
|
<button |
||||||
|
class="single-line-button" |
||||||
|
onclick={toggleCommentUI} |
||||||
|
title="Add comment" |
||||||
|
aria-label="Add comment" |
||||||
|
> |
||||||
|
<span class="line"></span> |
||||||
|
<span class="line"></span> |
||||||
|
<span class="line"></span> |
||||||
|
</button> |
||||||
|
|
||||||
|
<!-- Comment Creation UI --> |
||||||
|
{#if showCommentUI} |
||||||
|
<div class="comment-ui"> |
||||||
|
<div class="comment-header"> |
||||||
|
<h4>Add Comment</h4> |
||||||
|
{#if $userStore.profile} |
||||||
|
<div class="user-info"> |
||||||
|
{#if $userStore.profile.picture} |
||||||
|
<img src={$userStore.profile.picture} alt={$userStore.profile.displayName || $userStore.profile.name || "User"} class="user-avatar" /> |
||||||
|
{/if} |
||||||
|
<span class="user-name">{$userStore.profile.displayName || $userStore.profile.name || "Anonymous"}</span> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
|
||||||
|
<Textarea |
||||||
|
bind:value={commentContent} |
||||||
|
placeholder="Write your comment here..." |
||||||
|
rows={4} |
||||||
|
disabled={isSubmitting} |
||||||
|
class="comment-textarea" |
||||||
|
/> |
||||||
|
|
||||||
|
{#if error} |
||||||
|
<P class="error-message text-red-600 dark:text-red-400 text-sm mt-2">{error}</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if success} |
||||||
|
<P class="success-message text-green-600 dark:text-green-400 text-sm mt-2">Comment posted successfully!</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<!-- JSON Preview Section --> |
||||||
|
{#if showJsonPreview && previewJson} |
||||||
|
<div class="border border-gray-300 dark:border-gray-600 rounded-lg p-3 bg-gray-50 dark:bg-gray-900 mt-3"> |
||||||
|
<P class="text-sm font-semibold mb-2">Event JSON Preview:</P> |
||||||
|
<pre class="text-xs bg-white dark:bg-gray-800 p-3 rounded overflow-x-auto border border-gray-200 dark:border-gray-700"><code>{JSON.stringify(previewJson, null, 2)}</code></pre> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<div class="comment-actions-wrapper"> |
||||||
|
<Button |
||||||
|
color="light" |
||||||
|
size="sm" |
||||||
|
onclick={() => showJsonPreview = !showJsonPreview} |
||||||
|
class="flex items-center gap-1" |
||||||
|
> |
||||||
|
{#if showJsonPreview} |
||||||
|
<ChevronUpOutline class="w-4 h-4" /> |
||||||
|
{:else} |
||||||
|
<ChevronDownOutline class="w-4 h-4" /> |
||||||
|
{/if} |
||||||
|
{showJsonPreview ? "Hide" : "Show"} JSON |
||||||
|
</Button> |
||||||
|
|
||||||
|
<div class="comment-actions"> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
color="alternative" |
||||||
|
onclick={cancelComment} |
||||||
|
disabled={isSubmitting} |
||||||
|
> |
||||||
|
Cancel |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
onclick={submitComment} |
||||||
|
disabled={isSubmitting || !commentContent.trim()} |
||||||
|
> |
||||||
|
{isSubmitting ? "Posting..." : "Post Comment"} |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
|
||||||
|
<style> |
||||||
|
.comment-button-container { |
||||||
|
position: absolute; |
||||||
|
top: 0; |
||||||
|
right: 0; |
||||||
|
left: 0; |
||||||
|
height: 0; |
||||||
|
pointer-events: none; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-button-container.inline { |
||||||
|
position: relative; |
||||||
|
height: auto; |
||||||
|
pointer-events: auto; |
||||||
|
} |
||||||
|
|
||||||
|
.single-line-button { |
||||||
|
position: absolute; |
||||||
|
top: 4px; |
||||||
|
right: 8px; |
||||||
|
display: flex; |
||||||
|
flex-direction: column; |
||||||
|
justify-content: space-between; |
||||||
|
width: 24px; |
||||||
|
height: 18px; |
||||||
|
padding: 4px; |
||||||
|
background: transparent; |
||||||
|
border: none; |
||||||
|
cursor: pointer; |
||||||
|
opacity: 0; |
||||||
|
transition: opacity 0.2s ease-in-out; |
||||||
|
z-index: 10; |
||||||
|
pointer-events: auto; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-button-container.inline .single-line-button { |
||||||
|
position: relative; |
||||||
|
top: 0; |
||||||
|
right: 0; |
||||||
|
opacity: 1; |
||||||
|
} |
||||||
|
|
||||||
|
.single-line-button:hover .line { |
||||||
|
border-width: 3px; |
||||||
|
} |
||||||
|
|
||||||
|
.line { |
||||||
|
display: block; |
||||||
|
width: 100%; |
||||||
|
height: 0; |
||||||
|
border: none; |
||||||
|
border-top: 2px dashed #6b7280; |
||||||
|
transition: all 0.2s ease-in-out; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-ui { |
||||||
|
position: absolute; |
||||||
|
top: 35px; |
||||||
|
right: 8px; |
||||||
|
min-width: 400px; |
||||||
|
max-width: 600px; |
||||||
|
background: white; |
||||||
|
border: 1px solid #e5e7eb; |
||||||
|
border-radius: 8px; |
||||||
|
padding: 16px; |
||||||
|
box-shadow: 0 4px 6px -1px rgba(0, 0, 0, 0.1), 0 2px 4px -1px rgba(0, 0, 0, 0.06); |
||||||
|
z-index: 20; |
||||||
|
pointer-events: auto; |
||||||
|
} |
||||||
|
|
||||||
|
:global(.dark) .comment-ui { |
||||||
|
background: #1f2937; |
||||||
|
border-color: #374151; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-header { |
||||||
|
display: flex; |
||||||
|
justify-content: space-between; |
||||||
|
align-items: center; |
||||||
|
margin-bottom: 12px; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-header h4 { |
||||||
|
font-size: 16px; |
||||||
|
font-weight: 600; |
||||||
|
margin: 0; |
||||||
|
color: #111827; |
||||||
|
} |
||||||
|
|
||||||
|
:global(.dark) .comment-header h4 { |
||||||
|
color: #f9fafb; |
||||||
|
} |
||||||
|
|
||||||
|
.user-info { |
||||||
|
display: flex; |
||||||
|
align-items: center; |
||||||
|
gap: 8px; |
||||||
|
} |
||||||
|
|
||||||
|
.user-avatar { |
||||||
|
width: 24px; |
||||||
|
height: 24px; |
||||||
|
border-radius: 50%; |
||||||
|
object-fit: cover; |
||||||
|
} |
||||||
|
|
||||||
|
.user-name { |
||||||
|
font-size: 14px; |
||||||
|
color: #6b7280; |
||||||
|
} |
||||||
|
|
||||||
|
:global(.dark) .user-name { |
||||||
|
color: #9ca3af; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-actions-wrapper { |
||||||
|
display: flex; |
||||||
|
justify-content: space-between; |
||||||
|
align-items: center; |
||||||
|
margin-top: 12px; |
||||||
|
} |
||||||
|
|
||||||
|
.comment-actions { |
||||||
|
display: flex; |
||||||
|
justify-content: flex-end; |
||||||
|
gap: 8px; |
||||||
|
} |
||||||
|
|
||||||
|
/* Make the comment UI responsive */ |
||||||
|
@media (max-width: 640px) { |
||||||
|
.comment-ui { |
||||||
|
min-width: 280px; |
||||||
|
max-width: calc(100vw - 32px); |
||||||
|
right: -8px; |
||||||
|
} |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,282 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { getNdkContext, activeInboxRelays, activeOutboxRelays } from "$lib/ndk"; |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { NDKEvent as NDKEventClass } from "@nostr-dev-kit/ndk"; |
||||||
|
import { communityRelays } from "$lib/consts"; |
||||||
|
import { WebSocketPool } from "$lib/data_structures/websocket_pool"; |
||||||
|
import { generateMockCommentsForSections } from "$lib/utils/mockCommentData"; |
||||||
|
|
||||||
|
let { |
||||||
|
eventId, |
||||||
|
eventAddress, |
||||||
|
eventIds = [], |
||||||
|
eventAddresses = [], |
||||||
|
comments = $bindable([]), |
||||||
|
useMockComments = false, |
||||||
|
}: { |
||||||
|
eventId?: string; |
||||||
|
eventAddress?: string; |
||||||
|
eventIds?: string[]; |
||||||
|
eventAddresses?: string[]; |
||||||
|
comments?: NDKEvent[]; |
||||||
|
useMockComments?: boolean; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk = getNdkContext(); |
||||||
|
|
||||||
|
// State management |
||||||
|
let loading = $state(false); |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetch comment events (kind 1111) for the current publication using WebSocketPool |
||||||
|
* |
||||||
|
* This follows the exact pattern from HighlightLayer.svelte to ensure reliability. |
||||||
|
* Uses WebSocketPool with nostr-tools protocol instead of NDK subscriptions. |
||||||
|
*/ |
||||||
|
async function fetchComments() { |
||||||
|
// Prevent concurrent fetches |
||||||
|
if (loading) { |
||||||
|
console.log("[CommentLayer] Already loading, skipping fetch"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// Collect all event IDs and addresses |
||||||
|
const allEventIds = [...(eventId ? [eventId] : []), ...eventIds].filter(Boolean); |
||||||
|
const allAddresses = [...(eventAddress ? [eventAddress] : []), ...eventAddresses].filter(Boolean); |
||||||
|
|
||||||
|
if (allEventIds.length === 0 && allAddresses.length === 0) { |
||||||
|
console.warn("[CommentLayer] No event IDs or addresses provided"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
loading = true; |
||||||
|
comments = []; |
||||||
|
|
||||||
|
// AI-NOTE: Mock mode allows testing comment UI without publishing to relays |
||||||
|
// This is useful for development and demonstrating the comment system |
||||||
|
if (useMockComments) { |
||||||
|
console.log(`[CommentLayer] MOCK MODE - Generating mock comments for ${allAddresses.length} sections`); |
||||||
|
|
||||||
|
try { |
||||||
|
// Generate mock comment data |
||||||
|
const mockComments = generateMockCommentsForSections(allAddresses); |
||||||
|
|
||||||
|
// Convert to NDKEvent instances (same as real events) |
||||||
|
comments = mockComments.map(rawEvent => new NDKEventClass(ndk, rawEvent)); |
||||||
|
|
||||||
|
console.log(`[CommentLayer] Generated ${comments.length} mock comments`); |
||||||
|
loading = false; |
||||||
|
return; |
||||||
|
} catch (err) { |
||||||
|
console.error(`[CommentLayer] Error generating mock comments:`, err); |
||||||
|
loading = false; |
||||||
|
return; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[CommentLayer] Fetching comments for:`, { |
||||||
|
eventIds: allEventIds, |
||||||
|
addresses: allAddresses |
||||||
|
}); |
||||||
|
|
||||||
|
try { |
||||||
|
// Build filter for kind 1111 comment events |
||||||
|
// IMPORTANT: Use only #a tags because filters are AND, not OR |
||||||
|
// If we include both #e and #a, relays will only return comments that have BOTH |
||||||
|
const filter: any = { |
||||||
|
kinds: [1111], |
||||||
|
limit: 500, |
||||||
|
}; |
||||||
|
|
||||||
|
// Prefer #a (addressable events) since they're more specific and persistent |
||||||
|
if (allAddresses.length > 0) { |
||||||
|
filter["#a"] = allAddresses; |
||||||
|
} else if (allEventIds.length > 0) { |
||||||
|
// Fallback to #e if no addresses available |
||||||
|
filter["#e"] = allEventIds; |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[CommentLayer] Fetching with filter:`, JSON.stringify(filter, null, 2)); |
||||||
|
|
||||||
|
// Build explicit relay set (same pattern as HighlightLayer) |
||||||
|
const relays = [ |
||||||
|
...communityRelays, |
||||||
|
...$activeOutboxRelays, |
||||||
|
...$activeInboxRelays, |
||||||
|
]; |
||||||
|
const uniqueRelays = Array.from(new Set(relays)); |
||||||
|
console.log(`[CommentLayer] Fetching from ${uniqueRelays.length} relays:`, uniqueRelays); |
||||||
|
|
||||||
|
/** |
||||||
|
* Use WebSocketPool with nostr-tools protocol instead of NDK |
||||||
|
* |
||||||
|
* Reasons for not using NDK: |
||||||
|
* 1. NDK subscriptions mysteriously returned 0 events even when websocat confirmed events existed |
||||||
|
* 2. Consistency - HighlightLayer, CommentButton, and HighlightSelectionHandler use WebSocketPool |
||||||
|
* 3. Better debugging - direct access to WebSocket messages for troubleshooting |
||||||
|
* 4. Proven reliability - battle-tested in the codebase for similar use cases |
||||||
|
* 5. Performance control - explicit 5s timeout per relay, tunable as needed |
||||||
|
* |
||||||
|
* This matches the pattern in: |
||||||
|
* - src/lib/components/publications/HighlightLayer.svelte:111-212 |
||||||
|
* - src/lib/components/publications/CommentButton.svelte:156-220 |
||||||
|
* - src/lib/components/publications/HighlightSelectionHandler.svelte:217-280 |
||||||
|
*/ |
||||||
|
const subscriptionId = `comments-${Date.now()}`; |
||||||
|
const receivedEventIds = new Set<string>(); |
||||||
|
let eoseCount = 0; |
||||||
|
|
||||||
|
const fetchPromises = uniqueRelays.map(async (relayUrl) => { |
||||||
|
try { |
||||||
|
console.log(`[CommentLayer] Connecting to ${relayUrl}`); |
||||||
|
const ws = await WebSocketPool.instance.acquire(relayUrl); |
||||||
|
|
||||||
|
return new Promise<void>((resolve) => { |
||||||
|
const messageHandler = (event: MessageEvent) => { |
||||||
|
try { |
||||||
|
const message = JSON.parse(event.data); |
||||||
|
|
||||||
|
// Log ALL messages from relay.nostr.band for debugging |
||||||
|
if (relayUrl.includes('relay.nostr.band')) { |
||||||
|
console.log(`[CommentLayer] RAW message from ${relayUrl}:`, message); |
||||||
|
} |
||||||
|
|
||||||
|
if (message[0] === "EVENT" && message[1] === subscriptionId) { |
||||||
|
const rawEvent = message[2]; |
||||||
|
console.log(`[CommentLayer] EVENT from ${relayUrl}:`, { |
||||||
|
id: rawEvent.id, |
||||||
|
kind: rawEvent.kind, |
||||||
|
content: rawEvent.content.substring(0, 50), |
||||||
|
tags: rawEvent.tags |
||||||
|
}); |
||||||
|
|
||||||
|
// Avoid duplicates |
||||||
|
if (!receivedEventIds.has(rawEvent.id)) { |
||||||
|
receivedEventIds.add(rawEvent.id); |
||||||
|
|
||||||
|
// Convert to NDKEvent |
||||||
|
const ndkEvent = new NDKEventClass(ndk, rawEvent); |
||||||
|
comments = [...comments, ndkEvent]; |
||||||
|
console.log(`[CommentLayer] Added comment, total now: ${comments.length}`); |
||||||
|
} |
||||||
|
} else if (message[0] === "EOSE" && message[1] === subscriptionId) { |
||||||
|
eoseCount++; |
||||||
|
console.log(`[CommentLayer] EOSE from ${relayUrl} (${eoseCount}/${uniqueRelays.length})`); |
||||||
|
|
||||||
|
// Close subscription |
||||||
|
ws.send(JSON.stringify(["CLOSE", subscriptionId])); |
||||||
|
ws.removeEventListener("message", messageHandler); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
resolve(); |
||||||
|
} else if (message[0] === "NOTICE") { |
||||||
|
console.warn(`[CommentLayer] NOTICE from ${relayUrl}:`, message[1]); |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
console.error(`[CommentLayer] Error processing message from ${relayUrl}:`, err); |
||||||
|
} |
||||||
|
}; |
||||||
|
|
||||||
|
ws.addEventListener("message", messageHandler); |
||||||
|
|
||||||
|
// Send REQ |
||||||
|
const req = ["REQ", subscriptionId, filter]; |
||||||
|
if (relayUrl.includes('relay.nostr.band')) { |
||||||
|
console.log(`[CommentLayer] Sending REQ to ${relayUrl}:`, JSON.stringify(req)); |
||||||
|
} else { |
||||||
|
console.log(`[CommentLayer] Sending REQ to ${relayUrl}`); |
||||||
|
} |
||||||
|
ws.send(JSON.stringify(req)); |
||||||
|
|
||||||
|
// Timeout per relay (5 seconds) |
||||||
|
setTimeout(() => { |
||||||
|
if (ws.readyState === WebSocket.OPEN) { |
||||||
|
ws.send(JSON.stringify(["CLOSE", subscriptionId])); |
||||||
|
ws.removeEventListener("message", messageHandler); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
} |
||||||
|
resolve(); |
||||||
|
}, 5000); |
||||||
|
}); |
||||||
|
} catch (err) { |
||||||
|
console.error(`[CommentLayer] Error connecting to ${relayUrl}:`, err); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// Wait for all relays to respond or timeout |
||||||
|
await Promise.all(fetchPromises); |
||||||
|
|
||||||
|
console.log(`[CommentLayer] Fetched ${comments.length} comments`); |
||||||
|
|
||||||
|
if (comments.length > 0) { |
||||||
|
console.log(`[CommentLayer] Comments summary:`, comments.map(c => ({ |
||||||
|
content: c.content.substring(0, 30) + "...", |
||||||
|
address: c.tags.find(t => t[0] === "a")?.[1], |
||||||
|
author: c.pubkey.substring(0, 8) |
||||||
|
}))); |
||||||
|
} |
||||||
|
|
||||||
|
loading = false; |
||||||
|
|
||||||
|
} catch (err) { |
||||||
|
console.error(`[CommentLayer] Error fetching comments:`, err); |
||||||
|
loading = false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Track the last fetched event count to know when to refetch |
||||||
|
let lastFetchedCount = $state(0); |
||||||
|
let fetchTimeout: ReturnType<typeof setTimeout> | null = null; |
||||||
|
|
||||||
|
// Watch for changes to event data - debounce and fetch when data stabilizes |
||||||
|
$effect(() => { |
||||||
|
const currentCount = eventIds.length + eventAddresses.length; |
||||||
|
const hasEventData = currentCount > 0; |
||||||
|
|
||||||
|
console.log(`[CommentLayer] Event data effect - count: ${currentCount}, lastFetched: ${lastFetchedCount}, loading: ${loading}`); |
||||||
|
|
||||||
|
// Only fetch if: |
||||||
|
// 1. We have event data |
||||||
|
// 2. The count has changed since last fetch |
||||||
|
// 3. We're not already loading |
||||||
|
if (hasEventData && currentCount !== lastFetchedCount && !loading) { |
||||||
|
// Clear any existing timeout |
||||||
|
if (fetchTimeout) { |
||||||
|
clearTimeout(fetchTimeout); |
||||||
|
} |
||||||
|
|
||||||
|
// Debounce: wait 500ms for more events to arrive before fetching |
||||||
|
fetchTimeout = setTimeout(() => { |
||||||
|
console.log(`[CommentLayer] Event data stabilized at ${currentCount} events, fetching comments...`); |
||||||
|
lastFetchedCount = currentCount; |
||||||
|
fetchComments(); |
||||||
|
}, 500); |
||||||
|
} |
||||||
|
|
||||||
|
// Cleanup timeout on effect cleanup |
||||||
|
return () => { |
||||||
|
if (fetchTimeout) { |
||||||
|
clearTimeout(fetchTimeout); |
||||||
|
} |
||||||
|
}; |
||||||
|
}); |
||||||
|
|
||||||
|
/** |
||||||
|
* Public method to refresh comments (e.g., after creating a new one) |
||||||
|
*/ |
||||||
|
export function refresh() { |
||||||
|
console.log("[CommentLayer] Manual refresh triggered"); |
||||||
|
|
||||||
|
// Clear existing comments |
||||||
|
comments = []; |
||||||
|
|
||||||
|
// Reset fetch count to force re-fetch |
||||||
|
lastFetchedCount = 0; |
||||||
|
fetchComments(); |
||||||
|
} |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if loading} |
||||||
|
<div class="fixed top-40 right-4 z-50 bg-white dark:bg-gray-800 rounded-lg shadow-lg p-3"> |
||||||
|
<p class="text-sm text-gray-600 dark:text-gray-300">Loading comments...</p> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
@ -0,0 +1,280 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { getUserMetadata, toNpub } from "$lib/utils/nostrUtils"; |
||||||
|
import { getNdkContext } from "$lib/ndk"; |
||||||
|
import { basicMarkup } from "$lib/snippets/MarkupSnippets.svelte"; |
||||||
|
import { ChevronDownOutline, ChevronRightOutline } from "flowbite-svelte-icons"; |
||||||
|
|
||||||
|
let { |
||||||
|
comments = [], |
||||||
|
sectionTitles = new Map<string, string>(), |
||||||
|
}: { |
||||||
|
comments: NDKEvent[]; |
||||||
|
sectionTitles?: Map<string, string>; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk = getNdkContext(); |
||||||
|
|
||||||
|
// State management |
||||||
|
let profiles = $state(new Map<string, any>()); |
||||||
|
let expandedSections = $state(new Set<string>()); |
||||||
|
|
||||||
|
/** |
||||||
|
* Group comments by their target event address |
||||||
|
* Extracts the target from #a or #e tags |
||||||
|
*/ |
||||||
|
let groupedComments = $derived.by(() => { |
||||||
|
const groups = new Map<string, NDKEvent[]>(); |
||||||
|
|
||||||
|
for (const comment of comments) { |
||||||
|
// Look for #a tag first (addressable events - preferred) |
||||||
|
const aTag = comment.tags.find(t => t[0] === "a"); |
||||||
|
if (aTag && aTag[1]) { |
||||||
|
const address = aTag[1]; |
||||||
|
if (!groups.has(address)) { |
||||||
|
groups.set(address, []); |
||||||
|
} |
||||||
|
groups.get(address)!.push(comment); |
||||||
|
continue; |
||||||
|
} |
||||||
|
|
||||||
|
// Fallback to #e tag (event ID) |
||||||
|
const eTag = comment.tags.find(t => t[0] === "e"); |
||||||
|
if (eTag && eTag[1]) { |
||||||
|
const eventId = eTag[1]; |
||||||
|
if (!groups.has(eventId)) { |
||||||
|
groups.set(eventId, []); |
||||||
|
} |
||||||
|
groups.get(eventId)!.push(comment); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[CommentPanel] Grouped ${comments.length} comments into ${groups.size} sections`); |
||||||
|
return groups; |
||||||
|
}); |
||||||
|
|
||||||
|
/** |
||||||
|
* Get a display label for a target address/id |
||||||
|
* Uses provided section titles, or falls back to address/id |
||||||
|
*/ |
||||||
|
function getTargetLabel(target: string): string { |
||||||
|
// Check if we have a title for this address |
||||||
|
if (sectionTitles.has(target)) { |
||||||
|
return sectionTitles.get(target)!; |
||||||
|
} |
||||||
|
|
||||||
|
// Parse address format: kind:pubkey:d-tag |
||||||
|
const parts = target.split(":"); |
||||||
|
if (parts.length === 3) { |
||||||
|
const [kind, _pubkey, dTag] = parts; |
||||||
|
if (kind === "30040") { |
||||||
|
return "Comments on Collection"; |
||||||
|
} |
||||||
|
if (kind === "30041" && dTag) { |
||||||
|
return `Section: ${dTag}`; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Fallback to truncated address/id |
||||||
|
return target.length > 20 ? `${target.substring(0, 20)}...` : target; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetch profile for a pubkey |
||||||
|
*/ |
||||||
|
async function fetchProfile(pubkey: string) { |
||||||
|
if (profiles.has(pubkey)) return; |
||||||
|
|
||||||
|
try { |
||||||
|
const npub = toNpub(pubkey); |
||||||
|
if (!npub) { |
||||||
|
setFallbackProfile(pubkey); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
const profile = await getUserMetadata(npub, ndk, true); |
||||||
|
const newProfiles = new Map(profiles); |
||||||
|
newProfiles.set(pubkey, profile); |
||||||
|
profiles = newProfiles; |
||||||
|
|
||||||
|
console.log(`[CommentPanel] Fetched profile for ${pubkey}:`, profile); |
||||||
|
} catch (err) { |
||||||
|
console.warn(`[CommentPanel] Failed to fetch profile for ${pubkey}:`, err); |
||||||
|
setFallbackProfile(pubkey); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Set fallback profile using truncated npub |
||||||
|
*/ |
||||||
|
function setFallbackProfile(pubkey: string) { |
||||||
|
const npub = toNpub(pubkey) || pubkey; |
||||||
|
const truncated = `${npub.slice(0, 12)}...${npub.slice(-4)}`; |
||||||
|
const fallbackProfile = { |
||||||
|
name: truncated, |
||||||
|
displayName: truncated, |
||||||
|
picture: null |
||||||
|
}; |
||||||
|
const newProfiles = new Map(profiles); |
||||||
|
newProfiles.set(pubkey, fallbackProfile); |
||||||
|
profiles = newProfiles; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Get display name for a pubkey |
||||||
|
*/ |
||||||
|
function getDisplayName(pubkey: string): string { |
||||||
|
const profile = profiles.get(pubkey); |
||||||
|
if (profile) { |
||||||
|
return profile.displayName || profile.name || profile.pubkey || pubkey; |
||||||
|
} |
||||||
|
// Return truncated npub while loading |
||||||
|
const npub = toNpub(pubkey) || pubkey; |
||||||
|
return `${npub.slice(0, 12)}...${npub.slice(-4)}`; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Toggle section expansion |
||||||
|
*/ |
||||||
|
function toggleSection(target: string) { |
||||||
|
const newExpanded = new Set(expandedSections); |
||||||
|
if (newExpanded.has(target)) { |
||||||
|
newExpanded.delete(target); |
||||||
|
} else { |
||||||
|
newExpanded.add(target); |
||||||
|
} |
||||||
|
expandedSections = newExpanded; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Format timestamp |
||||||
|
*/ |
||||||
|
function formatTimestamp(timestamp: number): string { |
||||||
|
const date = new Date(timestamp * 1000); |
||||||
|
const now = new Date(); |
||||||
|
const diffMs = now.getTime() - date.getTime(); |
||||||
|
const diffMins = Math.floor(diffMs / 60000); |
||||||
|
const diffHours = Math.floor(diffMs / 3600000); |
||||||
|
const diffDays = Math.floor(diffMs / 86400000); |
||||||
|
|
||||||
|
if (diffMins < 60) { |
||||||
|
return `${diffMins}m ago`; |
||||||
|
} else if (diffHours < 24) { |
||||||
|
return `${diffHours}h ago`; |
||||||
|
} else if (diffDays < 7) { |
||||||
|
return `${diffDays}d ago`; |
||||||
|
} else { |
||||||
|
return date.toLocaleDateString(); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Pre-fetch all profiles when comments change |
||||||
|
*/ |
||||||
|
$effect(() => { |
||||||
|
const uniquePubkeys = new Set(comments.map(c => c.pubkey)); |
||||||
|
console.log(`[CommentPanel] Pre-fetching ${uniquePubkeys.size} profiles`); |
||||||
|
for (const pubkey of uniquePubkeys) { |
||||||
|
fetchProfile(pubkey); |
||||||
|
} |
||||||
|
}); |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if comments.length > 0} |
||||||
|
<div class="fixed right-4 top-20 bottom-4 w-96 bg-white dark:bg-gray-800 rounded-lg shadow-xl overflow-hidden flex flex-col z-40"> |
||||||
|
<!-- Header --> |
||||||
|
<div class="p-4 border-b border-gray-200 dark:border-gray-700"> |
||||||
|
<h3 class="text-lg font-semibold text-gray-900 dark:text-gray-100"> |
||||||
|
Comments ({comments.length}) |
||||||
|
</h3> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Comment groups --> |
||||||
|
<div class="flex-1 overflow-y-auto p-4 space-y-4"> |
||||||
|
{#each Array.from(groupedComments.entries()) as [target, targetComments]} |
||||||
|
<div class="border border-gray-200 dark:border-gray-700 rounded-lg overflow-hidden"> |
||||||
|
<!-- Section header --> |
||||||
|
<button |
||||||
|
class="w-full px-4 py-3 flex items-center justify-between bg-gray-50 dark:bg-gray-700/50 hover:bg-gray-100 dark:hover:bg-gray-700 transition-colors" |
||||||
|
onclick={() => toggleSection(target)} |
||||||
|
> |
||||||
|
<div class="flex items-center gap-2"> |
||||||
|
{#if expandedSections.has(target)} |
||||||
|
<ChevronDownOutline class="w-4 h-4 text-gray-600 dark:text-gray-400" /> |
||||||
|
{:else} |
||||||
|
<ChevronRightOutline class="w-4 h-4 text-gray-600 dark:text-gray-400" /> |
||||||
|
{/if} |
||||||
|
<span class="text-sm font-medium text-gray-900 dark:text-gray-100"> |
||||||
|
{getTargetLabel(target)} |
||||||
|
</span> |
||||||
|
</div> |
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
{targetComments.length} {targetComments.length === 1 ? 'comment' : 'comments'} |
||||||
|
</span> |
||||||
|
</button> |
||||||
|
|
||||||
|
<!-- Comment list --> |
||||||
|
{#if expandedSections.has(target)} |
||||||
|
<div class="divide-y divide-gray-200 dark:divide-gray-700"> |
||||||
|
{#each targetComments as comment (comment.id)} |
||||||
|
<div class="p-4 hover:bg-gray-50 dark:hover:bg-gray-700/30 transition-colors"> |
||||||
|
<!-- Comment header --> |
||||||
|
<div class="flex items-start gap-3 mb-2"> |
||||||
|
<div class="flex-1 min-w-0"> |
||||||
|
<div class="flex items-baseline gap-2"> |
||||||
|
<span class="text-sm font-medium text-gray-900 dark:text-gray-100 truncate"> |
||||||
|
{getDisplayName(comment.pubkey)} |
||||||
|
</span> |
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
{formatTimestamp(comment.created_at || 0)} |
||||||
|
</span> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Comment content --> |
||||||
|
<div class="text-sm text-gray-700 dark:text-gray-300 prose prose-sm dark:prose-invert max-w-none"> |
||||||
|
{@render basicMarkup(comment.content)} |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
|
||||||
|
{#if groupedComments.size === 0 && comments.length > 0} |
||||||
|
<div class="text-center py-8"> |
||||||
|
<p class="text-sm text-gray-500 dark:text-gray-400"> |
||||||
|
Comments loaded but couldn't determine their targets |
||||||
|
</p> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<style> |
||||||
|
/* Custom scrollbar for comment panel */ |
||||||
|
.overflow-y-auto { |
||||||
|
scrollbar-width: thin; |
||||||
|
scrollbar-color: rgba(156, 163, 175, 0.5) transparent; |
||||||
|
} |
||||||
|
|
||||||
|
.overflow-y-auto::-webkit-scrollbar { |
||||||
|
width: 6px; |
||||||
|
} |
||||||
|
|
||||||
|
.overflow-y-auto::-webkit-scrollbar-track { |
||||||
|
background: transparent; |
||||||
|
} |
||||||
|
|
||||||
|
.overflow-y-auto::-webkit-scrollbar-thumb { |
||||||
|
background-color: rgba(156, 163, 175, 0.5); |
||||||
|
border-radius: 3px; |
||||||
|
} |
||||||
|
|
||||||
|
.overflow-y-auto::-webkit-scrollbar-thumb:hover { |
||||||
|
background-color: rgba(156, 163, 175, 0.7); |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,126 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { Button, Modal } from "flowbite-svelte"; |
||||||
|
import { TrashBinOutline } from "flowbite-svelte-icons"; |
||||||
|
import { getContext } from "svelte"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { deleteEvent, canDeleteEvent } from "$lib/services/deletion"; |
||||||
|
import { userStore } from "$lib/stores/userStore"; |
||||||
|
|
||||||
|
let { |
||||||
|
address, |
||||||
|
leafEvent, |
||||||
|
onDeleted, |
||||||
|
}: { |
||||||
|
address: string; |
||||||
|
leafEvent: Promise<NDKEvent | null>; |
||||||
|
onDeleted?: () => void; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk: NDK = getContext("ndk"); |
||||||
|
|
||||||
|
let showDeleteModal = $state(false); |
||||||
|
let isDeleting = $state(false); |
||||||
|
let deleteError = $state<string | null>(null); |
||||||
|
let resolvedEvent = $state<NDKEvent | null>(null); |
||||||
|
|
||||||
|
// Check if user can delete this event |
||||||
|
let canDelete = $derived(canDeleteEvent(resolvedEvent, ndk)); |
||||||
|
|
||||||
|
// Resolve the event promise |
||||||
|
$effect(() => { |
||||||
|
leafEvent.then(event => { |
||||||
|
resolvedEvent = event; |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
async function handleDelete() { |
||||||
|
if (!resolvedEvent) { |
||||||
|
deleteError = "Event not found"; |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
isDeleting = true; |
||||||
|
deleteError = null; |
||||||
|
|
||||||
|
const result = await deleteEvent( |
||||||
|
{ |
||||||
|
eventId: resolvedEvent.id, |
||||||
|
eventAddress: address, |
||||||
|
eventKind: resolvedEvent.kind, |
||||||
|
reason: "Deleted by author", |
||||||
|
onSuccess: (deletionEventId) => { |
||||||
|
console.log(`[DeleteButton] Published deletion event: ${deletionEventId}`); |
||||||
|
showDeleteModal = false; |
||||||
|
onDeleted?.(); |
||||||
|
}, |
||||||
|
onError: (error) => { |
||||||
|
console.error(`[DeleteButton] Deletion failed: ${error}`); |
||||||
|
deleteError = error; |
||||||
|
}, |
||||||
|
}, |
||||||
|
ndk, |
||||||
|
); |
||||||
|
|
||||||
|
isDeleting = false; |
||||||
|
|
||||||
|
if (result.success) { |
||||||
|
console.log(`[DeleteButton] Successfully deleted section: ${address}`); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
function openDeleteModal() { |
||||||
|
deleteError = null; |
||||||
|
showDeleteModal = true; |
||||||
|
} |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if canDelete} |
||||||
|
<Button |
||||||
|
color="red" |
||||||
|
size="xs" |
||||||
|
class="single-line-button opacity-0 transition-opacity duration-200" |
||||||
|
onclick={openDeleteModal} |
||||||
|
> |
||||||
|
<TrashBinOutline class="w-3 h-3 mr-1" /> |
||||||
|
Delete |
||||||
|
</Button> |
||||||
|
|
||||||
|
<Modal bind:open={showDeleteModal} size="sm" title="Delete Section"> |
||||||
|
<div class="text-center"> |
||||||
|
<TrashBinOutline class="mx-auto mb-4 text-gray-400 w-12 h-12 dark:text-gray-200" /> |
||||||
|
<h3 class="mb-5 text-lg font-normal text-gray-500 dark:text-gray-400"> |
||||||
|
Are you sure you want to delete this section? |
||||||
|
</h3> |
||||||
|
<p class="mb-5 text-sm text-gray-500 dark:text-gray-400"> |
||||||
|
This will publish a deletion request to all relays. Note that not all relays |
||||||
|
may honor this request, and the content may remain visible on some relays. |
||||||
|
</p> |
||||||
|
{#if deleteError} |
||||||
|
<p class="mb-5 text-sm text-red-500">{deleteError}</p> |
||||||
|
{/if} |
||||||
|
<div class="flex justify-center gap-4"> |
||||||
|
<Button |
||||||
|
color="red" |
||||||
|
disabled={isDeleting} |
||||||
|
onclick={handleDelete} |
||||||
|
> |
||||||
|
{isDeleting ? "Deleting..." : "Yes, delete it"} |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
color="alternative" |
||||||
|
disabled={isDeleting} |
||||||
|
onclick={() => (showDeleteModal = false)} |
||||||
|
> |
||||||
|
No, cancel |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</Modal> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<style> |
||||||
|
:global(.single-line-button) { |
||||||
|
opacity: 0; |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,21 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { Button } from "flowbite-svelte"; |
||||||
|
import { FontHighlightOutline } from "flowbite-svelte-icons"; |
||||||
|
|
||||||
|
let { isActive = $bindable(false) }: { isActive?: boolean } = $props(); |
||||||
|
|
||||||
|
function toggleHighlightMode() { |
||||||
|
isActive = !isActive; |
||||||
|
} |
||||||
|
</script> |
||||||
|
|
||||||
|
<Button |
||||||
|
color={isActive ? "primary" : "light"} |
||||||
|
size="sm" |
||||||
|
class="btn-leather {isActive ? 'ring-2 ring-primary-500' : ''}" |
||||||
|
onclick={toggleHighlightMode} |
||||||
|
title={isActive ? "Exit highlight mode" : "Enter highlight mode"} |
||||||
|
> |
||||||
|
<FontHighlightOutline class="w-4 h-4 mr-2" /> |
||||||
|
{isActive ? "Exit Highlight Mode" : "Add Highlight"} |
||||||
|
</Button> |
||||||
@ -0,0 +1,952 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { |
||||||
|
getNdkContext, |
||||||
|
activeInboxRelays, |
||||||
|
activeOutboxRelays, |
||||||
|
} from "$lib/ndk"; |
||||||
|
import { pubkeyToHue } from "$lib/utils/nostrUtils"; |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { NDKEvent as NDKEventClass } from "@nostr-dev-kit/ndk"; |
||||||
|
import { communityRelays } from "$lib/consts"; |
||||||
|
import { WebSocketPool } from "$lib/data_structures/websocket_pool"; |
||||||
|
import { generateMockHighlightsForSections } from "$lib/utils/mockHighlightData"; |
||||||
|
import { |
||||||
|
groupHighlightsByAuthor, |
||||||
|
truncateHighlight, |
||||||
|
encodeHighlightNaddr, |
||||||
|
getRelaysFromHighlight, |
||||||
|
getAuthorDisplayName, |
||||||
|
sortHighlightsByTime, |
||||||
|
} from "$lib/utils/highlightUtils"; |
||||||
|
import { unifiedProfileCache } from "$lib/utils/npubCache"; |
||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
import { |
||||||
|
highlightByOffset, |
||||||
|
getPlainText, |
||||||
|
} from "$lib/utils/highlightPositioning"; |
||||||
|
|
||||||
|
let { |
||||||
|
eventId, |
||||||
|
eventAddress, |
||||||
|
eventIds = [], |
||||||
|
eventAddresses = [], |
||||||
|
visible = $bindable(false), |
||||||
|
useMockHighlights = false, |
||||||
|
}: { |
||||||
|
eventId?: string; |
||||||
|
eventAddress?: string; |
||||||
|
eventIds?: string[]; |
||||||
|
eventAddresses?: string[]; |
||||||
|
visible?: boolean; |
||||||
|
useMockHighlights?: boolean; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk = getNdkContext(); |
||||||
|
|
||||||
|
// State management |
||||||
|
let highlights: NDKEvent[] = $state([]); |
||||||
|
let loading = $state(false); |
||||||
|
let containerRef: HTMLElement | null = $state(null); |
||||||
|
let expandedAuthors = $state(new Set<string>()); |
||||||
|
let authorProfiles = $state(new Map<string, any>()); |
||||||
|
let copyFeedback = $state<string | null>(null); |
||||||
|
|
||||||
|
// Derived state for color mapping |
||||||
|
let colorMap = $derived.by(() => { |
||||||
|
const map = new Map<string, string>(); |
||||||
|
highlights.forEach((highlight) => { |
||||||
|
if (!map.has(highlight.pubkey)) { |
||||||
|
const hue = pubkeyToHue(highlight.pubkey); |
||||||
|
map.set(highlight.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`); |
||||||
|
} |
||||||
|
}); |
||||||
|
return map; |
||||||
|
}); |
||||||
|
|
||||||
|
// Derived state for grouped highlights |
||||||
|
let groupedHighlights = $derived.by(() => { |
||||||
|
return groupHighlightsByAuthor(highlights); |
||||||
|
}); |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetch highlight events (kind 9802) for the current publication using NDK |
||||||
|
* Or generate mock highlights if useMockHighlights is enabled |
||||||
|
*/ |
||||||
|
async function fetchHighlights() { |
||||||
|
// Prevent concurrent fetches |
||||||
|
if (loading) { |
||||||
|
console.log("[HighlightLayer] Already loading, skipping fetch"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// Collect all event IDs and addresses |
||||||
|
const allEventIds = [...(eventId ? [eventId] : []), ...eventIds].filter( |
||||||
|
Boolean, |
||||||
|
); |
||||||
|
const allAddresses = [ |
||||||
|
...(eventAddress ? [eventAddress] : []), |
||||||
|
...eventAddresses, |
||||||
|
].filter(Boolean); |
||||||
|
|
||||||
|
if (allEventIds.length === 0 && allAddresses.length === 0) { |
||||||
|
console.warn("[HighlightLayer] No event IDs or addresses provided"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
loading = true; |
||||||
|
highlights = []; |
||||||
|
|
||||||
|
// AI-NOTE: Mock mode allows testing highlight UI without publishing to relays |
||||||
|
// This is useful for development and demonstrating the highlight system |
||||||
|
if (useMockHighlights) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] MOCK MODE - Generating mock highlights for ${allAddresses.length} sections`, |
||||||
|
); |
||||||
|
|
||||||
|
try { |
||||||
|
// Generate mock highlight data |
||||||
|
const mockHighlights = generateMockHighlightsForSections(allAddresses); |
||||||
|
|
||||||
|
// Convert to NDKEvent instances (same as real events) |
||||||
|
highlights = mockHighlights.map( |
||||||
|
(rawEvent) => new NDKEventClass(ndk, rawEvent), |
||||||
|
); |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Generated ${highlights.length} mock highlights`, |
||||||
|
); |
||||||
|
loading = false; |
||||||
|
return; |
||||||
|
} catch (err) { |
||||||
|
console.error( |
||||||
|
`[HighlightLayer] Error generating mock highlights:`, |
||||||
|
err, |
||||||
|
); |
||||||
|
loading = false; |
||||||
|
return; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] Fetching highlights for:`, { |
||||||
|
eventIds: allEventIds, |
||||||
|
addresses: allAddresses, |
||||||
|
}); |
||||||
|
|
||||||
|
try { |
||||||
|
// Build filter for kind 9802 highlight events |
||||||
|
// IMPORTANT: Use only #a tags because filters are AND, not OR |
||||||
|
// If we include both #e and #a, relays will only return highlights that have BOTH |
||||||
|
const filter: any = { |
||||||
|
kinds: [9802], |
||||||
|
limit: 500, |
||||||
|
}; |
||||||
|
|
||||||
|
// Prefer #a (addressable events) since they're more specific and persistent |
||||||
|
if (allAddresses.length > 0) { |
||||||
|
filter["#a"] = allAddresses; |
||||||
|
} else if (allEventIds.length > 0) { |
||||||
|
// Fallback to #e if no addresses available |
||||||
|
filter["#e"] = allEventIds; |
||||||
|
} |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Fetching with filter:`, |
||||||
|
JSON.stringify(filter, null, 2), |
||||||
|
); |
||||||
|
|
||||||
|
// Build explicit relay set (same pattern as HighlightSelectionHandler and CommentButton) |
||||||
|
const relays = [ |
||||||
|
...communityRelays, |
||||||
|
...$activeOutboxRelays, |
||||||
|
...$activeInboxRelays, |
||||||
|
]; |
||||||
|
const uniqueRelays = Array.from(new Set(relays)); |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Fetching from ${uniqueRelays.length} relays:`, |
||||||
|
uniqueRelays, |
||||||
|
); |
||||||
|
|
||||||
|
/** |
||||||
|
* Use WebSocketPool with nostr-tools protocol instead of NDK |
||||||
|
* |
||||||
|
* Reasons for not using NDK: |
||||||
|
* 1. NDK subscriptions mysteriously returned 0 events even when websocat confirmed events existed |
||||||
|
* 2. Consistency - CommentButton and HighlightSelectionHandler both use WebSocketPool pattern |
||||||
|
* 3. Better debugging - direct access to WebSocket messages for troubleshooting |
||||||
|
* 4. Proven reliability - battle-tested in the codebase for similar use cases |
||||||
|
* 5. Performance control - explicit 5s timeout per relay, tunable as needed |
||||||
|
* |
||||||
|
* This matches the pattern in: |
||||||
|
* - src/lib/components/publications/CommentButton.svelte:156-220 |
||||||
|
* - src/lib/components/publications/HighlightSelectionHandler.svelte:217-280 |
||||||
|
*/ |
||||||
|
const subscriptionId = `highlights-${Date.now()}`; |
||||||
|
const receivedEventIds = new Set<string>(); |
||||||
|
let eoseCount = 0; |
||||||
|
|
||||||
|
const fetchPromises = uniqueRelays.map(async (relayUrl) => { |
||||||
|
try { |
||||||
|
console.log(`[HighlightLayer] Connecting to ${relayUrl}`); |
||||||
|
const ws = await WebSocketPool.instance.acquire(relayUrl); |
||||||
|
|
||||||
|
return new Promise<void>((resolve) => { |
||||||
|
const messageHandler = (event: MessageEvent) => { |
||||||
|
try { |
||||||
|
const message = JSON.parse(event.data); |
||||||
|
|
||||||
|
// Log ALL messages from relay.nostr.band for debugging |
||||||
|
if (relayUrl.includes("relay.nostr.band")) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] RAW message from ${relayUrl}:`, |
||||||
|
message, |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
if (message[0] === "EVENT" && message[1] === subscriptionId) { |
||||||
|
const rawEvent = message[2]; |
||||||
|
console.log(`[HighlightLayer] EVENT from ${relayUrl}:`, { |
||||||
|
id: rawEvent.id, |
||||||
|
kind: rawEvent.kind, |
||||||
|
content: rawEvent.content.substring(0, 50), |
||||||
|
tags: rawEvent.tags, |
||||||
|
}); |
||||||
|
|
||||||
|
// Avoid duplicates |
||||||
|
if (!receivedEventIds.has(rawEvent.id)) { |
||||||
|
receivedEventIds.add(rawEvent.id); |
||||||
|
|
||||||
|
// Convert to NDKEvent |
||||||
|
const ndkEvent = new NDKEventClass(ndk, rawEvent); |
||||||
|
highlights = [...highlights, ndkEvent]; |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Added highlight, total now: ${highlights.length}`, |
||||||
|
); |
||||||
|
} |
||||||
|
} else if ( |
||||||
|
message[0] === "EOSE" && |
||||||
|
message[1] === subscriptionId |
||||||
|
) { |
||||||
|
eoseCount++; |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] EOSE from ${relayUrl} (${eoseCount}/${uniqueRelays.length})`, |
||||||
|
); |
||||||
|
|
||||||
|
// Close subscription |
||||||
|
ws.send(JSON.stringify(["CLOSE", subscriptionId])); |
||||||
|
ws.removeEventListener("message", messageHandler); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
resolve(); |
||||||
|
} else if (message[0] === "NOTICE") { |
||||||
|
console.warn( |
||||||
|
`[HighlightLayer] NOTICE from ${relayUrl}:`, |
||||||
|
message[1], |
||||||
|
); |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
console.error( |
||||||
|
`[HighlightLayer] Error processing message from ${relayUrl}:`, |
||||||
|
err, |
||||||
|
); |
||||||
|
} |
||||||
|
}; |
||||||
|
|
||||||
|
ws.addEventListener("message", messageHandler); |
||||||
|
|
||||||
|
// Send REQ |
||||||
|
const req = ["REQ", subscriptionId, filter]; |
||||||
|
if (relayUrl.includes("relay.nostr.band")) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Sending REQ to ${relayUrl}:`, |
||||||
|
JSON.stringify(req), |
||||||
|
); |
||||||
|
} else { |
||||||
|
console.log(`[HighlightLayer] Sending REQ to ${relayUrl}`); |
||||||
|
} |
||||||
|
ws.send(JSON.stringify(req)); |
||||||
|
|
||||||
|
// Timeout per relay (5 seconds) |
||||||
|
setTimeout(() => { |
||||||
|
if (ws.readyState === WebSocket.OPEN) { |
||||||
|
ws.send(JSON.stringify(["CLOSE", subscriptionId])); |
||||||
|
ws.removeEventListener("message", messageHandler); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
} |
||||||
|
resolve(); |
||||||
|
}, 5000); |
||||||
|
}); |
||||||
|
} catch (err) { |
||||||
|
console.error( |
||||||
|
`[HighlightLayer] Error connecting to ${relayUrl}:`, |
||||||
|
err, |
||||||
|
); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// Wait for all relays to respond or timeout |
||||||
|
await Promise.all(fetchPromises); |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] Fetched ${highlights.length} highlights`); |
||||||
|
|
||||||
|
if (highlights.length > 0) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Highlights summary:`, |
||||||
|
highlights.map((h) => ({ |
||||||
|
content: h.content.substring(0, 30) + "...", |
||||||
|
address: h.tags.find((t) => t[0] === "a")?.[1], |
||||||
|
author: h.pubkey.substring(0, 8), |
||||||
|
})), |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
loading = false; |
||||||
|
|
||||||
|
// Rendering is handled by the visibility/highlights effect |
||||||
|
} catch (err) { |
||||||
|
console.error(`[HighlightLayer] Error fetching highlights:`, err); |
||||||
|
loading = false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Apply highlight using position offsets |
||||||
|
* @param offsetStart - Start character position |
||||||
|
* @param offsetEnd - End character position |
||||||
|
* @param color - The color to use for highlighting |
||||||
|
* @param targetAddress - Optional address to limit search to specific section |
||||||
|
*/ |
||||||
|
function highlightByPosition( |
||||||
|
offsetStart: number, |
||||||
|
offsetEnd: number, |
||||||
|
color: string, |
||||||
|
targetAddress?: string, |
||||||
|
): boolean { |
||||||
|
if (!containerRef) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Cannot highlight by position - no containerRef`, |
||||||
|
); |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
// If we have a target address, search only in that section |
||||||
|
let searchRoot: HTMLElement = containerRef; |
||||||
|
if (targetAddress) { |
||||||
|
const sectionElement = document.getElementById(targetAddress); |
||||||
|
if (sectionElement) { |
||||||
|
searchRoot = sectionElement; |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Highlighting in specific section: ${targetAddress}`, |
||||||
|
); |
||||||
|
} else { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Section ${targetAddress} not found in DOM, searching globally`, |
||||||
|
); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Applying position-based highlight ${offsetStart}-${offsetEnd}`, |
||||||
|
); |
||||||
|
const result = highlightByOffset(searchRoot, offsetStart, offsetEnd, color); |
||||||
|
|
||||||
|
if (result) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Successfully applied position-based highlight`, |
||||||
|
); |
||||||
|
} else { |
||||||
|
console.log(`[HighlightLayer] Failed to apply position-based highlight`); |
||||||
|
} |
||||||
|
|
||||||
|
return result; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Find text in the DOM and highlight it (fallback method) |
||||||
|
* @param text - The text to highlight |
||||||
|
* @param color - The color to use for highlighting |
||||||
|
* @param targetAddress - Optional address to limit search to specific section |
||||||
|
*/ |
||||||
|
function findAndHighlightText( |
||||||
|
text: string, |
||||||
|
color: string, |
||||||
|
targetAddress?: string, |
||||||
|
): void { |
||||||
|
if (!containerRef || !text || text.trim().length === 0) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Cannot highlight - containerRef: ${!!containerRef}, text: "${text}"`, |
||||||
|
); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// If we have a target address, search only in that section |
||||||
|
let searchRoot: HTMLElement | Document = containerRef; |
||||||
|
if (targetAddress) { |
||||||
|
const sectionElement = document.getElementById(targetAddress); |
||||||
|
if (sectionElement) { |
||||||
|
searchRoot = sectionElement; |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Searching in specific section: ${targetAddress}`, |
||||||
|
); |
||||||
|
} else { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Section ${targetAddress} not found in DOM, searching globally`, |
||||||
|
); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Searching for text: "${text}" in`, |
||||||
|
searchRoot, |
||||||
|
); |
||||||
|
|
||||||
|
// Use TreeWalker to find all text nodes |
||||||
|
const walker = document.createTreeWalker( |
||||||
|
searchRoot, |
||||||
|
NodeFilter.SHOW_TEXT, |
||||||
|
null, |
||||||
|
); |
||||||
|
|
||||||
|
const textNodes: Node[] = []; |
||||||
|
let node: Node | null; |
||||||
|
while ((node = walker.nextNode())) { |
||||||
|
textNodes.push(node); |
||||||
|
} |
||||||
|
|
||||||
|
// Search for the highlight text in text nodes |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Searching through ${textNodes.length} text nodes`, |
||||||
|
); |
||||||
|
|
||||||
|
for (const textNode of textNodes) { |
||||||
|
const nodeText = textNode.textContent || ""; |
||||||
|
const index = nodeText.toLowerCase().indexOf(text.toLowerCase()); |
||||||
|
|
||||||
|
if (index !== -1) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Found match in text node:`, |
||||||
|
nodeText.substring( |
||||||
|
Math.max(0, index - 20), |
||||||
|
Math.min(nodeText.length, index + text.length + 20), |
||||||
|
), |
||||||
|
); |
||||||
|
const parent = textNode.parentNode; |
||||||
|
if (!parent) continue; |
||||||
|
|
||||||
|
// Skip if already highlighted |
||||||
|
if ( |
||||||
|
parent.nodeName === "MARK" || |
||||||
|
(parent instanceof Element && parent.classList?.contains("highlight")) |
||||||
|
) { |
||||||
|
continue; |
||||||
|
} |
||||||
|
|
||||||
|
const before = nodeText.substring(0, index); |
||||||
|
const match = nodeText.substring(index, index + text.length); |
||||||
|
const after = nodeText.substring(index + text.length); |
||||||
|
|
||||||
|
// Create highlight span |
||||||
|
const highlightSpan = document.createElement("mark"); |
||||||
|
highlightSpan.className = "highlight"; |
||||||
|
highlightSpan.style.backgroundColor = color; |
||||||
|
highlightSpan.style.borderRadius = "2px"; |
||||||
|
highlightSpan.style.padding = "2px 0"; |
||||||
|
highlightSpan.textContent = match; |
||||||
|
|
||||||
|
// Replace the text node with the highlighted version |
||||||
|
const fragment = document.createDocumentFragment(); |
||||||
|
if (before) fragment.appendChild(document.createTextNode(before)); |
||||||
|
fragment.appendChild(highlightSpan); |
||||||
|
if (after) fragment.appendChild(document.createTextNode(after)); |
||||||
|
|
||||||
|
parent.replaceChild(fragment, textNode); |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] Highlighted text:`, match); |
||||||
|
return; // Only highlight first occurrence to avoid multiple highlights |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] No match found for text: "${text}"`); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Render all highlights on the page |
||||||
|
*/ |
||||||
|
function renderHighlights() { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] renderHighlights called - visible: ${visible}, containerRef: ${!!containerRef}, highlights: ${highlights.length}`, |
||||||
|
); |
||||||
|
|
||||||
|
if (!visible || !containerRef) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Skipping render - visible: ${visible}, containerRef: ${!!containerRef}`, |
||||||
|
); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
if (highlights.length === 0) { |
||||||
|
console.log(`[HighlightLayer] No highlights to render`); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// Clear existing highlights |
||||||
|
clearHighlights(); |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] Rendering ${highlights.length} highlights`); |
||||||
|
console.log(`[HighlightLayer] Container element:`, containerRef); |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Container has children:`, |
||||||
|
containerRef.children.length, |
||||||
|
); |
||||||
|
|
||||||
|
// Apply each highlight |
||||||
|
for (const highlight of highlights) { |
||||||
|
const content = highlight.content; |
||||||
|
const color = colorMap.get(highlight.pubkey) || "hsla(60, 70%, 60%, 0.3)"; |
||||||
|
|
||||||
|
// Extract the target address from the highlight's "a" tag |
||||||
|
const aTag = highlight.tags.find((tag) => tag[0] === "a"); |
||||||
|
const targetAddress = aTag ? aTag[1] : undefined; |
||||||
|
|
||||||
|
// Check for offset tags (position-based highlighting) |
||||||
|
const offsetTag = highlight.tags.find((tag) => tag[0] === "offset"); |
||||||
|
const hasOffset = |
||||||
|
offsetTag && offsetTag[1] !== undefined && offsetTag[2] !== undefined; |
||||||
|
|
||||||
|
console.log(`[HighlightLayer] Rendering highlight:`, { |
||||||
|
hasOffset, |
||||||
|
offsetTag, |
||||||
|
content: content.substring(0, 50), |
||||||
|
contentLength: content.length, |
||||||
|
targetAddress, |
||||||
|
color, |
||||||
|
allTags: highlight.tags, |
||||||
|
}); |
||||||
|
|
||||||
|
if (hasOffset) { |
||||||
|
// Use position-based highlighting |
||||||
|
const offsetStart = parseInt(offsetTag[1], 10); |
||||||
|
const offsetEnd = parseInt(offsetTag[2], 10); |
||||||
|
|
||||||
|
if (!isNaN(offsetStart) && !isNaN(offsetEnd)) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Using position-based highlighting: ${offsetStart}-${offsetEnd}`, |
||||||
|
); |
||||||
|
highlightByPosition(offsetStart, offsetEnd, color, targetAddress); |
||||||
|
} else { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Invalid offset values, falling back to text search`, |
||||||
|
); |
||||||
|
if (content && content.trim().length > 0) { |
||||||
|
findAndHighlightText(content, color, targetAddress); |
||||||
|
} |
||||||
|
} |
||||||
|
} else { |
||||||
|
// Fall back to text-based highlighting |
||||||
|
console.log(`[HighlightLayer] Using text-based highlighting`); |
||||||
|
if (content && content.trim().length > 0) { |
||||||
|
findAndHighlightText(content, color, targetAddress); |
||||||
|
} else { |
||||||
|
console.log(`[HighlightLayer] Skipping highlight - empty content`); |
||||||
|
} |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Check if any highlights were actually rendered |
||||||
|
const renderedHighlights = containerRef.querySelectorAll("mark.highlight"); |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Rendered ${renderedHighlights.length} highlight marks in DOM`, |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Clear all highlights from the page |
||||||
|
*/ |
||||||
|
function clearHighlights() { |
||||||
|
if (!containerRef) return; |
||||||
|
|
||||||
|
const highlightElements = containerRef.querySelectorAll("mark.highlight"); |
||||||
|
highlightElements.forEach((el) => { |
||||||
|
const parent = el.parentNode; |
||||||
|
if (parent) { |
||||||
|
// Replace highlight with plain text |
||||||
|
const textNode = document.createTextNode(el.textContent || ""); |
||||||
|
parent.replaceChild(textNode, el); |
||||||
|
|
||||||
|
// Normalize the parent to merge adjacent text nodes |
||||||
|
parent.normalize(); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Cleared ${highlightElements.length} highlights`, |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
// Track the last fetched event count to know when to refetch |
||||||
|
let lastFetchedCount = $state(0); |
||||||
|
let fetchTimeout: ReturnType<typeof setTimeout> | null = null; |
||||||
|
|
||||||
|
// Watch for changes to event data - debounce and fetch when data stabilizes |
||||||
|
$effect(() => { |
||||||
|
const currentCount = eventIds.length + eventAddresses.length; |
||||||
|
const hasEventData = currentCount > 0; |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Event data effect - count: ${currentCount}, lastFetched: ${lastFetchedCount}, loading: ${loading}`, |
||||||
|
); |
||||||
|
|
||||||
|
// Only fetch if: |
||||||
|
// 1. We have event data |
||||||
|
// 2. The count has changed since last fetch |
||||||
|
// 3. We're not already loading |
||||||
|
if (hasEventData && currentCount !== lastFetchedCount && !loading) { |
||||||
|
// Clear any existing timeout |
||||||
|
if (fetchTimeout) { |
||||||
|
clearTimeout(fetchTimeout); |
||||||
|
} |
||||||
|
|
||||||
|
// Debounce: wait 500ms for more events to arrive before fetching |
||||||
|
fetchTimeout = setTimeout(() => { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Event data stabilized at ${currentCount} events, fetching highlights...`, |
||||||
|
); |
||||||
|
lastFetchedCount = currentCount; |
||||||
|
fetchHighlights(); |
||||||
|
}, 500); |
||||||
|
} |
||||||
|
|
||||||
|
// Cleanup timeout on effect cleanup |
||||||
|
return () => { |
||||||
|
if (fetchTimeout) { |
||||||
|
clearTimeout(fetchTimeout); |
||||||
|
} |
||||||
|
}; |
||||||
|
}); |
||||||
|
|
||||||
|
// Watch for visibility AND highlights changes - render when both are ready |
||||||
|
$effect(() => { |
||||||
|
// This effect runs when either visible or highlights.length changes |
||||||
|
const highlightCount = highlights.length; |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Visibility/highlights effect - visible: ${visible}, highlights: ${highlightCount}`, |
||||||
|
); |
||||||
|
|
||||||
|
if (visible && highlightCount > 0) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Both visible and highlights ready, rendering...`, |
||||||
|
); |
||||||
|
renderHighlights(); |
||||||
|
} else if (!visible) { |
||||||
|
clearHighlights(); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// Fetch profiles when highlights change |
||||||
|
$effect(() => { |
||||||
|
const highlightCount = highlights.length; |
||||||
|
if (highlightCount > 0) { |
||||||
|
fetchAuthorProfiles(); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetch author profiles for all unique pubkeys in highlights |
||||||
|
*/ |
||||||
|
async function fetchAuthorProfiles() { |
||||||
|
const uniquePubkeys = Array.from(groupedHighlights.keys()); |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Fetching profiles for ${uniquePubkeys.length} authors`, |
||||||
|
); |
||||||
|
|
||||||
|
for (const pubkey of uniquePubkeys) { |
||||||
|
try { |
||||||
|
// Convert hex pubkey to npub for the profile cache |
||||||
|
const npub = nip19.npubEncode(pubkey); |
||||||
|
const profile = await unifiedProfileCache.getProfile(npub, ndk); |
||||||
|
if (profile) { |
||||||
|
authorProfiles.set(pubkey, profile); |
||||||
|
// Trigger reactivity |
||||||
|
authorProfiles = new Map(authorProfiles); |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
console.error( |
||||||
|
`[HighlightLayer] Error fetching profile for ${pubkey}:`, |
||||||
|
err, |
||||||
|
); |
||||||
|
} |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Toggle expansion state for an author's highlights |
||||||
|
*/ |
||||||
|
function toggleAuthor(pubkey: string) { |
||||||
|
if (expandedAuthors.has(pubkey)) { |
||||||
|
expandedAuthors.delete(pubkey); |
||||||
|
} else { |
||||||
|
expandedAuthors.add(pubkey); |
||||||
|
} |
||||||
|
// Trigger reactivity |
||||||
|
expandedAuthors = new Set(expandedAuthors); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Scroll to a specific highlight in the document |
||||||
|
*/ |
||||||
|
function scrollToHighlight(highlight: NDKEvent) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] scrollToHighlight called for:`, |
||||||
|
highlight.content.substring(0, 50), |
||||||
|
); |
||||||
|
|
||||||
|
if (!containerRef) { |
||||||
|
console.warn(`[HighlightLayer] No containerRef available`); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
const content = highlight.content; |
||||||
|
if (!content || content.trim().length === 0) { |
||||||
|
console.warn(`[HighlightLayer] No content in highlight`); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// Find the highlight mark element |
||||||
|
const highlightMarks = containerRef.querySelectorAll("mark.highlight"); |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Found ${highlightMarks.length} highlight marks in DOM`, |
||||||
|
); |
||||||
|
|
||||||
|
// Try exact match first |
||||||
|
for (const mark of highlightMarks) { |
||||||
|
const markText = mark.textContent?.toLowerCase() || ""; |
||||||
|
const searchText = content.toLowerCase(); |
||||||
|
|
||||||
|
if (markText === searchText) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Found exact match, scrolling and flashing`, |
||||||
|
); |
||||||
|
// Scroll to this element |
||||||
|
mark.scrollIntoView({ behavior: "smooth", block: "center" }); |
||||||
|
|
||||||
|
// Add a temporary flash effect |
||||||
|
mark.classList.add("highlight-flash"); |
||||||
|
setTimeout(() => { |
||||||
|
mark.classList.remove("highlight-flash"); |
||||||
|
}, 1500); |
||||||
|
return; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Try partial match (for position-based highlights that might be split) |
||||||
|
for (const mark of highlightMarks) { |
||||||
|
const markText = mark.textContent?.toLowerCase() || ""; |
||||||
|
const searchText = content.toLowerCase(); |
||||||
|
|
||||||
|
if (markText.includes(searchText) || searchText.includes(markText)) { |
||||||
|
console.log( |
||||||
|
`[HighlightLayer] Found partial match, scrolling and flashing`, |
||||||
|
); |
||||||
|
mark.scrollIntoView({ behavior: "smooth", block: "center" }); |
||||||
|
mark.classList.add("highlight-flash"); |
||||||
|
setTimeout(() => { |
||||||
|
mark.classList.remove("highlight-flash"); |
||||||
|
}, 1500); |
||||||
|
return; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.warn( |
||||||
|
`[HighlightLayer] Could not find highlight mark for:`, |
||||||
|
content.substring(0, 50), |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Copy highlight naddr to clipboard |
||||||
|
*/ |
||||||
|
async function copyHighlightNaddr(highlight: NDKEvent) { |
||||||
|
const relays = getRelaysFromHighlight(highlight); |
||||||
|
const naddr = encodeHighlightNaddr(highlight, relays); |
||||||
|
|
||||||
|
try { |
||||||
|
await navigator.clipboard.writeText(naddr); |
||||||
|
copyFeedback = highlight.id; |
||||||
|
console.log(`[HighlightLayer] Copied naddr to clipboard:`, naddr); |
||||||
|
|
||||||
|
// Clear feedback after 2 seconds |
||||||
|
setTimeout(() => { |
||||||
|
copyFeedback = null; |
||||||
|
}, 2000); |
||||||
|
} catch (err) { |
||||||
|
console.error(`[HighlightLayer] Error copying to clipboard:`, err); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Bind to parent container element |
||||||
|
*/ |
||||||
|
export function setContainer(element: HTMLElement | null) { |
||||||
|
containerRef = element; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Public method to refresh highlights (e.g., after creating a new one) |
||||||
|
*/ |
||||||
|
export function refresh() { |
||||||
|
console.log("[HighlightLayer] Manual refresh triggered"); |
||||||
|
|
||||||
|
// Clear existing highlights |
||||||
|
highlights = []; |
||||||
|
clearHighlights(); |
||||||
|
|
||||||
|
// Reset fetch count to force re-fetch |
||||||
|
lastFetchedCount = 0; |
||||||
|
fetchHighlights(); |
||||||
|
} |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if loading && visible} |
||||||
|
<div |
||||||
|
class="fixed top-40 right-4 z-50 bg-white dark:bg-gray-800 rounded-lg shadow-lg p-3" |
||||||
|
> |
||||||
|
<p class="text-sm text-gray-600 dark:text-gray-300"> |
||||||
|
Loading highlights... |
||||||
|
</p> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if visible && highlights.length > 0} |
||||||
|
<div |
||||||
|
class="fixed bottom-4 right-4 z-50 bg-white dark:bg-gray-800 rounded-lg shadow-lg p-4 max-w-sm w-80" |
||||||
|
> |
||||||
|
<h4 class="text-sm font-semibold mb-3 text-gray-900 dark:text-gray-100"> |
||||||
|
Highlights |
||||||
|
</h4> |
||||||
|
<div class="space-y-2 max-h-96 overflow-y-auto"> |
||||||
|
{#each Array.from(groupedHighlights.entries()) as [pubkey, authorHighlights]} |
||||||
|
{@const isExpanded = expandedAuthors.has(pubkey)} |
||||||
|
{@const profile = authorProfiles.get(pubkey)} |
||||||
|
{@const displayName = getAuthorDisplayName(profile, pubkey)} |
||||||
|
{@const color = colorMap.get(pubkey) || "hsla(60, 70%, 60%, 0.3)"} |
||||||
|
{@const sortedHighlights = sortHighlightsByTime(authorHighlights)} |
||||||
|
|
||||||
|
<div class="border-b border-gray-200 dark:border-gray-700 pb-2"> |
||||||
|
<!-- Author header --> |
||||||
|
<button |
||||||
|
class="w-full flex items-center gap-2 text-sm hover:bg-gray-50 dark:hover:bg-gray-700 p-2 rounded transition-colors" |
||||||
|
onclick={() => toggleAuthor(pubkey)} |
||||||
|
> |
||||||
|
<div |
||||||
|
class="w-3 h-3 rounded flex-shrink-0" |
||||||
|
style="background-color: {color};" |
||||||
|
></div> |
||||||
|
<span |
||||||
|
class="font-medium text-gray-900 dark:text-gray-100 flex-1 text-left truncate" |
||||||
|
> |
||||||
|
{displayName} |
||||||
|
</span> |
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
({authorHighlights.length}) |
||||||
|
</span> |
||||||
|
<svg |
||||||
|
class="w-4 h-4 text-gray-500 transition-transform {isExpanded |
||||||
|
? 'rotate-90' |
||||||
|
: ''}" |
||||||
|
fill="none" |
||||||
|
stroke="currentColor" |
||||||
|
viewBox="0 0 24 24" |
||||||
|
> |
||||||
|
<path |
||||||
|
stroke-linecap="round" |
||||||
|
stroke-linejoin="round" |
||||||
|
stroke-width="2" |
||||||
|
d="M9 5l7 7-7 7" |
||||||
|
/> |
||||||
|
</svg> |
||||||
|
</button> |
||||||
|
|
||||||
|
<!-- Expanded highlight list --> |
||||||
|
{#if isExpanded} |
||||||
|
<div class="mt-2 ml-5 space-y-2"> |
||||||
|
{#each sortedHighlights as highlight} |
||||||
|
{@const truncated = useMockHighlights |
||||||
|
? "test data" |
||||||
|
: truncateHighlight(highlight.content)} |
||||||
|
{@const showCopied = copyFeedback === highlight.id} |
||||||
|
|
||||||
|
<div class="flex items-start gap-2 group"> |
||||||
|
<button |
||||||
|
class="flex-1 text-left text-xs text-gray-600 dark:text-gray-300 hover:text-gray-900 dark:hover:text-gray-100 transition-colors" |
||||||
|
onclick={() => scrollToHighlight(highlight)} |
||||||
|
title={useMockHighlights |
||||||
|
? "Mock highlight" |
||||||
|
: highlight.content} |
||||||
|
> |
||||||
|
{truncated} |
||||||
|
</button> |
||||||
|
<button |
||||||
|
class="flex-shrink-0 p-1 hover:bg-gray-100 dark:hover:bg-gray-600 rounded transition-colors" |
||||||
|
onclick={() => copyHighlightNaddr(highlight)} |
||||||
|
title="Copy naddr" |
||||||
|
> |
||||||
|
{#if showCopied} |
||||||
|
<svg |
||||||
|
class="w-3 h-3 text-green-500" |
||||||
|
fill="currentColor" |
||||||
|
viewBox="0 0 20 20" |
||||||
|
> |
||||||
|
<path |
||||||
|
fill-rule="evenodd" |
||||||
|
d="M16.707 5.293a1 1 0 010 1.414l-8 8a1 1 0 01-1.414 0l-4-4a1 1 0 011.414-1.414L8 12.586l7.293-7.293a1 1 0 011.414 0z" |
||||||
|
clip-rule="evenodd" |
||||||
|
/> |
||||||
|
</svg> |
||||||
|
{:else} |
||||||
|
<svg |
||||||
|
class="w-3 h-3 text-gray-400 group-hover:text-gray-600 dark:group-hover:text-gray-300" |
||||||
|
fill="none" |
||||||
|
stroke="currentColor" |
||||||
|
viewBox="0 0 24 24" |
||||||
|
> |
||||||
|
<path |
||||||
|
stroke-linecap="round" |
||||||
|
stroke-linejoin="round" |
||||||
|
stroke-width="2" |
||||||
|
d="M8 16H6a2 2 0 01-2-2V6a2 2 0 012-2h8a2 2 0 012 2v2m-6 12h8a2 2 0 002-2v-8a2 2 0 00-2-2h-8a2 2 0 00-2 2v8a2 2 0 002 2z" |
||||||
|
/> |
||||||
|
</svg> |
||||||
|
{/if} |
||||||
|
</button> |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<style> |
||||||
|
:global(mark.highlight) { |
||||||
|
transition: background-color 0.2s ease; |
||||||
|
} |
||||||
|
|
||||||
|
:global(mark.highlight:hover) { |
||||||
|
filter: brightness(1.1); |
||||||
|
} |
||||||
|
|
||||||
|
:global(mark.highlight.highlight-flash) { |
||||||
|
animation: flash 1.5s ease-in-out; |
||||||
|
} |
||||||
|
|
||||||
|
@keyframes -global-flash { |
||||||
|
0%, |
||||||
|
100% { |
||||||
|
filter: brightness(1); |
||||||
|
} |
||||||
|
50% { |
||||||
|
filter: brightness(0.4); |
||||||
|
box-shadow: 0 0 12px rgba(0, 0, 0, 0.5); |
||||||
|
} |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,472 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import { getContext, onMount, onDestroy } from "svelte"; |
||||||
|
import { Button, Modal, Textarea, P } from "flowbite-svelte"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { userStore } from "$lib/stores/userStore"; |
||||||
|
import { activeOutboxRelays, activeInboxRelays } from "$lib/ndk"; |
||||||
|
import { communityRelays } from "$lib/consts"; |
||||||
|
import { WebSocketPool } from "$lib/data_structures/websocket_pool"; |
||||||
|
import { ChevronDownOutline, ChevronUpOutline } from "flowbite-svelte-icons"; |
||||||
|
|
||||||
|
let { |
||||||
|
isActive = false, |
||||||
|
publicationEvent, |
||||||
|
onHighlightCreated, |
||||||
|
}: { |
||||||
|
isActive: boolean; |
||||||
|
publicationEvent: NDKEvent; |
||||||
|
onHighlightCreated?: () => void; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk: NDK = getContext("ndk"); |
||||||
|
|
||||||
|
let showConfirmModal = $state(false); |
||||||
|
let selectedText = $state(""); |
||||||
|
let selectionContext = $state(""); |
||||||
|
let comment = $state(""); |
||||||
|
let isSubmitting = $state(false); |
||||||
|
let feedbackMessage = $state(""); |
||||||
|
let showFeedback = $state(false); |
||||||
|
let showJsonPreview = $state(false); |
||||||
|
|
||||||
|
// Store the selection range and section info for creating highlight |
||||||
|
let currentSelection: Selection | null = null; |
||||||
|
let selectedSectionAddress = $state<string | undefined>(undefined); |
||||||
|
let selectedSectionEventId = $state<string | undefined>(undefined); |
||||||
|
|
||||||
|
// Build preview JSON for the highlight event |
||||||
|
let previewJson = $derived.by(() => { |
||||||
|
if (!selectedText) return null; |
||||||
|
|
||||||
|
const useAddress = selectedSectionAddress || publicationEvent.tagAddress(); |
||||||
|
const useEventId = selectedSectionEventId || publicationEvent.id; |
||||||
|
|
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (useAddress) { |
||||||
|
tags.push(["a", useAddress, ""]); |
||||||
|
} else if (useEventId) { |
||||||
|
tags.push(["e", useEventId, ""]); |
||||||
|
} |
||||||
|
|
||||||
|
if (selectionContext) { |
||||||
|
tags.push(["context", selectionContext]); |
||||||
|
} |
||||||
|
|
||||||
|
let authorPubkey = publicationEvent.pubkey; |
||||||
|
if (useAddress && useAddress.includes(":")) { |
||||||
|
authorPubkey = useAddress.split(":")[1]; |
||||||
|
} |
||||||
|
if (authorPubkey) { |
||||||
|
tags.push(["p", authorPubkey, "", "author"]); |
||||||
|
} |
||||||
|
|
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment.trim()]); |
||||||
|
} |
||||||
|
|
||||||
|
return { |
||||||
|
kind: 9802, |
||||||
|
pubkey: $userStore.pubkey || "<your-pubkey>", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
tags: tags, |
||||||
|
content: selectedText, |
||||||
|
id: "<calculated-on-signing>", |
||||||
|
sig: "<calculated-on-signing>", |
||||||
|
}; |
||||||
|
}); |
||||||
|
|
||||||
|
function handleMouseUp(event: MouseEvent) { |
||||||
|
if (!isActive) return; |
||||||
|
if (!$userStore.signedIn) { |
||||||
|
showFeedbackMessage("Please sign in to create highlights", "error"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
const selection = window.getSelection(); |
||||||
|
if (!selection || selection.isCollapsed) return; |
||||||
|
|
||||||
|
const text = selection.toString().trim(); |
||||||
|
if (!text || text.length < 3) return; |
||||||
|
|
||||||
|
// Check if the selection is within the publication content |
||||||
|
const target = event.target as HTMLElement; |
||||||
|
|
||||||
|
// Find the closest section element with an id (PublicationSection) |
||||||
|
// Don't use closest('.publication-leather') as Details also has that class |
||||||
|
const publicationSection = target.closest("section[id]") as HTMLElement; |
||||||
|
if (!publicationSection) { |
||||||
|
console.log("[HighlightSelectionHandler] No section[id] found, aborting"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
// Get the specific section's event address and ID from data attributes |
||||||
|
const sectionAddress = publicationSection.dataset.eventAddress; |
||||||
|
const sectionEventId = publicationSection.dataset.eventId; |
||||||
|
|
||||||
|
console.log("[HighlightSelectionHandler] Selection in section:", { |
||||||
|
element: publicationSection, |
||||||
|
address: sectionAddress, |
||||||
|
eventId: sectionEventId, |
||||||
|
allDataAttrs: publicationSection.dataset, |
||||||
|
sectionId: publicationSection.id, |
||||||
|
}); |
||||||
|
|
||||||
|
currentSelection = selection; |
||||||
|
selectedText = text; |
||||||
|
selectedSectionAddress = sectionAddress; |
||||||
|
selectedSectionEventId = sectionEventId; |
||||||
|
selectionContext = ""; // Will be set below |
||||||
|
|
||||||
|
// Get surrounding context (the paragraph or section) |
||||||
|
const parentElement = selection.anchorNode?.parentElement; |
||||||
|
if (parentElement) { |
||||||
|
const contextElement = parentElement.closest("p, section, div"); |
||||||
|
if (contextElement) { |
||||||
|
selectionContext = contextElement.textContent?.trim() || ""; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
showConfirmModal = true; |
||||||
|
} |
||||||
|
|
||||||
|
async function createHighlight() { |
||||||
|
if (!$userStore.signer || !ndk) { |
||||||
|
showFeedbackMessage("Please sign in to create highlights", "error"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
if (!$userStore.pubkey) { |
||||||
|
showFeedbackMessage("User pubkey not available", "error"); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
isSubmitting = true; |
||||||
|
|
||||||
|
try { |
||||||
|
const event = new NDKEvent(ndk); |
||||||
|
event.kind = 9802; |
||||||
|
event.content = selectedText; |
||||||
|
event.pubkey = $userStore.pubkey; // Set pubkey from user store |
||||||
|
|
||||||
|
// Use the specific section's address/ID if available, otherwise fall back to publication event |
||||||
|
const useAddress = |
||||||
|
selectedSectionAddress || publicationEvent.tagAddress(); |
||||||
|
const useEventId = selectedSectionEventId || publicationEvent.id; |
||||||
|
|
||||||
|
console.log("[HighlightSelectionHandler] Creating highlight with:", { |
||||||
|
address: useAddress, |
||||||
|
eventId: useEventId, |
||||||
|
fallbackUsed: !selectedSectionAddress, |
||||||
|
}); |
||||||
|
|
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
// Always prefer addressable events for publications |
||||||
|
if (useAddress) { |
||||||
|
// Addressable event - use "a" tag |
||||||
|
tags.push(["a", useAddress, ""]); |
||||||
|
} else if (useEventId) { |
||||||
|
// Regular event - use "e" tag |
||||||
|
tags.push(["e", useEventId, ""]); |
||||||
|
} |
||||||
|
|
||||||
|
// Add context tag |
||||||
|
if (selectionContext) { |
||||||
|
tags.push(["context", selectionContext]); |
||||||
|
} |
||||||
|
|
||||||
|
// Add author tag - extract from address or use publication event |
||||||
|
let authorPubkey = publicationEvent.pubkey; |
||||||
|
if (useAddress && useAddress.includes(":")) { |
||||||
|
// Extract pubkey from address format "kind:pubkey:identifier" |
||||||
|
authorPubkey = useAddress.split(":")[1]; |
||||||
|
} |
||||||
|
if (authorPubkey) { |
||||||
|
tags.push(["p", authorPubkey, "", "author"]); |
||||||
|
} |
||||||
|
|
||||||
|
// Add comment tag if user provided a comment (quote highlight) |
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment.trim()]); |
||||||
|
} |
||||||
|
|
||||||
|
event.tags = tags; |
||||||
|
|
||||||
|
// Sign the event - create plain object to avoid proxy issues |
||||||
|
const plainEvent = { |
||||||
|
kind: Number(event.kind), |
||||||
|
pubkey: String(event.pubkey), |
||||||
|
created_at: Number(event.created_at ?? Math.floor(Date.now() / 1000)), |
||||||
|
tags: event.tags.map((tag) => tag.map(String)), |
||||||
|
content: String(event.content), |
||||||
|
}; |
||||||
|
|
||||||
|
if ( |
||||||
|
typeof window !== "undefined" && |
||||||
|
window.nostr && |
||||||
|
window.nostr.signEvent |
||||||
|
) { |
||||||
|
const signed = await window.nostr.signEvent(plainEvent); |
||||||
|
event.sig = signed.sig; |
||||||
|
if ("id" in signed) { |
||||||
|
event.id = signed.id as string; |
||||||
|
} |
||||||
|
} else { |
||||||
|
await event.sign($userStore.signer); |
||||||
|
} |
||||||
|
|
||||||
|
// Build relay list following the same pattern as eventServices |
||||||
|
const relays = [ |
||||||
|
...communityRelays, |
||||||
|
...$activeOutboxRelays, |
||||||
|
...$activeInboxRelays, |
||||||
|
]; |
||||||
|
|
||||||
|
// Remove duplicates |
||||||
|
const uniqueRelays = Array.from(new Set(relays)); |
||||||
|
|
||||||
|
console.log( |
||||||
|
"[HighlightSelectionHandler] Publishing to relays:", |
||||||
|
uniqueRelays, |
||||||
|
); |
||||||
|
|
||||||
|
const signedEvent = { |
||||||
|
...plainEvent, |
||||||
|
id: event.id, |
||||||
|
sig: event.sig, |
||||||
|
}; |
||||||
|
|
||||||
|
// Publish to relays using WebSocketPool |
||||||
|
let publishedCount = 0; |
||||||
|
for (const relayUrl of uniqueRelays) { |
||||||
|
try { |
||||||
|
const ws = await WebSocketPool.instance.acquire(relayUrl); |
||||||
|
|
||||||
|
await new Promise<void>((resolve, reject) => { |
||||||
|
const timeout = setTimeout(() => { |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
reject(new Error("Timeout")); |
||||||
|
}, 5000); |
||||||
|
|
||||||
|
ws.onmessage = (e) => { |
||||||
|
const [type, id, ok, message] = JSON.parse(e.data); |
||||||
|
if (type === "OK" && id === signedEvent.id) { |
||||||
|
clearTimeout(timeout); |
||||||
|
if (ok) { |
||||||
|
publishedCount++; |
||||||
|
console.log( |
||||||
|
`[HighlightSelectionHandler] Published to ${relayUrl}`, |
||||||
|
); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
resolve(); |
||||||
|
} else { |
||||||
|
console.warn( |
||||||
|
`[HighlightSelectionHandler] ${relayUrl} rejected: ${message}`, |
||||||
|
); |
||||||
|
WebSocketPool.instance.release(ws); |
||||||
|
reject(new Error(message)); |
||||||
|
} |
||||||
|
} |
||||||
|
}; |
||||||
|
|
||||||
|
// Send the event to the relay |
||||||
|
ws.send(JSON.stringify(["EVENT", signedEvent])); |
||||||
|
}); |
||||||
|
} catch (e) { |
||||||
|
console.error( |
||||||
|
`[HighlightSelectionHandler] Failed to publish to ${relayUrl}:`, |
||||||
|
e, |
||||||
|
); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
if (publishedCount === 0) { |
||||||
|
throw new Error("Failed to publish to any relays"); |
||||||
|
} |
||||||
|
|
||||||
|
showFeedbackMessage( |
||||||
|
`Highlight created and published to ${publishedCount} relay(s)!`, |
||||||
|
"success", |
||||||
|
); |
||||||
|
|
||||||
|
// Clear the selection |
||||||
|
if (currentSelection) { |
||||||
|
currentSelection.removeAllRanges(); |
||||||
|
} |
||||||
|
|
||||||
|
// Reset state |
||||||
|
showConfirmModal = false; |
||||||
|
selectedText = ""; |
||||||
|
selectionContext = ""; |
||||||
|
comment = ""; |
||||||
|
selectedSectionAddress = undefined; |
||||||
|
selectedSectionEventId = undefined; |
||||||
|
showJsonPreview = false; |
||||||
|
currentSelection = null; |
||||||
|
|
||||||
|
// Notify parent component |
||||||
|
if (onHighlightCreated) { |
||||||
|
onHighlightCreated(); |
||||||
|
} |
||||||
|
} catch (error) { |
||||||
|
console.error("Failed to create highlight:", error); |
||||||
|
showFeedbackMessage( |
||||||
|
"Failed to create highlight. Please try again.", |
||||||
|
"error", |
||||||
|
); |
||||||
|
} finally { |
||||||
|
isSubmitting = false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
function cancelHighlight() { |
||||||
|
showConfirmModal = false; |
||||||
|
selectedText = ""; |
||||||
|
selectionContext = ""; |
||||||
|
comment = ""; |
||||||
|
selectedSectionAddress = undefined; |
||||||
|
selectedSectionEventId = undefined; |
||||||
|
showJsonPreview = false; |
||||||
|
|
||||||
|
// Clear the selection |
||||||
|
if (currentSelection) { |
||||||
|
currentSelection.removeAllRanges(); |
||||||
|
} |
||||||
|
currentSelection = null; |
||||||
|
} |
||||||
|
|
||||||
|
function showFeedbackMessage(message: string, type: "success" | "error") { |
||||||
|
feedbackMessage = message; |
||||||
|
showFeedback = true; |
||||||
|
setTimeout(() => { |
||||||
|
showFeedback = false; |
||||||
|
}, 3000); |
||||||
|
} |
||||||
|
|
||||||
|
onMount(() => { |
||||||
|
// Only listen to mouseup on the document |
||||||
|
document.addEventListener("mouseup", handleMouseUp); |
||||||
|
}); |
||||||
|
|
||||||
|
onDestroy(() => { |
||||||
|
document.removeEventListener("mouseup", handleMouseUp); |
||||||
|
}); |
||||||
|
|
||||||
|
// Add visual indicator when highlight mode is active |
||||||
|
$effect(() => { |
||||||
|
if (isActive) { |
||||||
|
document.body.classList.add("highlight-mode-active"); |
||||||
|
} else { |
||||||
|
document.body.classList.remove("highlight-mode-active"); |
||||||
|
} |
||||||
|
|
||||||
|
// Cleanup when component unmounts |
||||||
|
return () => { |
||||||
|
document.body.classList.remove("highlight-mode-active"); |
||||||
|
}; |
||||||
|
}); |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if showConfirmModal} |
||||||
|
<Modal |
||||||
|
title="Create Highlight" |
||||||
|
bind:open={showConfirmModal} |
||||||
|
autoclose={false} |
||||||
|
size="md" |
||||||
|
> |
||||||
|
<div class="space-y-4"> |
||||||
|
<div> |
||||||
|
<P class="text-sm font-semibold mb-2">Selected Text:</P> |
||||||
|
<div |
||||||
|
class="bg-gray-100 dark:bg-gray-800 p-3 rounded-lg max-h-32 overflow-y-auto" |
||||||
|
> |
||||||
|
<P class="text-sm italic">"{selectedText}"</P> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
|
||||||
|
<div> |
||||||
|
<label for="comment" class="block text-sm font-semibold mb-2"> |
||||||
|
Add a Comment (Optional): |
||||||
|
</label> |
||||||
|
<Textarea |
||||||
|
id="comment" |
||||||
|
bind:value={comment} |
||||||
|
placeholder="Share your thoughts about this highlight..." |
||||||
|
rows={3} |
||||||
|
class="w-full" |
||||||
|
/> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- JSON Preview Section --> |
||||||
|
{#if showJsonPreview && previewJson} |
||||||
|
<div |
||||||
|
class="border border-gray-300 dark:border-gray-600 rounded-lg p-3 bg-gray-50 dark:bg-gray-900" |
||||||
|
> |
||||||
|
<P class="text-sm font-semibold mb-2">Event JSON Preview:</P> |
||||||
|
<pre |
||||||
|
class="text-xs bg-white dark:bg-gray-800 p-3 rounded overflow-x-auto border border-gray-200 dark:border-gray-700"><code |
||||||
|
>{JSON.stringify(previewJson, null, 2)}</code |
||||||
|
></pre> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<div class="flex justify-between items-center"> |
||||||
|
<Button |
||||||
|
color="light" |
||||||
|
size="sm" |
||||||
|
onclick={() => (showJsonPreview = !showJsonPreview)} |
||||||
|
class="flex items-center gap-1" |
||||||
|
> |
||||||
|
{#if showJsonPreview} |
||||||
|
<ChevronUpOutline class="w-4 h-4" /> |
||||||
|
{:else} |
||||||
|
<ChevronDownOutline class="w-4 h-4" /> |
||||||
|
{/if} |
||||||
|
{showJsonPreview ? "Hide" : "Show"} JSON |
||||||
|
</Button> |
||||||
|
|
||||||
|
<div class="flex space-x-2"> |
||||||
|
<Button |
||||||
|
color="alternative" |
||||||
|
onclick={cancelHighlight} |
||||||
|
disabled={isSubmitting} |
||||||
|
> |
||||||
|
Cancel |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
color="primary" |
||||||
|
onclick={createHighlight} |
||||||
|
disabled={isSubmitting} |
||||||
|
> |
||||||
|
{isSubmitting ? "Creating..." : "Create Highlight"} |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</Modal> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if showFeedback} |
||||||
|
<div |
||||||
|
class="fixed bottom-4 right-4 z-50 p-4 rounded-lg shadow-lg {feedbackMessage.includes( |
||||||
|
'success', |
||||||
|
) |
||||||
|
? 'bg-green-500 text-white' |
||||||
|
: 'bg-red-500 text-white'}" |
||||||
|
> |
||||||
|
{feedbackMessage} |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<style> |
||||||
|
:global(body.highlight-mode-active .publication-leather) { |
||||||
|
cursor: text; |
||||||
|
user-select: text; |
||||||
|
} |
||||||
|
|
||||||
|
:global(body.highlight-mode-active .publication-leather *) { |
||||||
|
cursor: text; |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,928 @@ |
|||||||
|
<script lang="ts"> |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { getUserMetadata, toNpub } from "$lib/utils/nostrUtils"; |
||||||
|
import { getNdkContext } from "$lib/ndk"; |
||||||
|
import { basicMarkup } from "$lib/snippets/MarkupSnippets.svelte"; |
||||||
|
import { ChevronDownOutline, ChevronRightOutline, DotsVerticalOutline, TrashBinOutline, ClipboardCleanOutline, EyeOutline } from "flowbite-svelte-icons"; |
||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
import { Button, Popover, Modal, Textarea, P } from "flowbite-svelte"; |
||||||
|
import { deleteEvent, canDeleteEvent } from "$lib/services/deletion"; |
||||||
|
import { userStore } from "$lib/stores/userStore"; |
||||||
|
import { goto } from "$app/navigation"; |
||||||
|
|
||||||
|
let { |
||||||
|
sectionAddress, |
||||||
|
comments = [], |
||||||
|
visible = true, |
||||||
|
}: { |
||||||
|
sectionAddress: string; |
||||||
|
comments: NDKEvent[]; |
||||||
|
visible?: boolean; |
||||||
|
} = $props(); |
||||||
|
|
||||||
|
const ndk = getNdkContext(); |
||||||
|
|
||||||
|
// State management |
||||||
|
let profiles = $state(new Map<string, any>()); |
||||||
|
let expandedThreads = $state(new Set<string>()); |
||||||
|
let detailsModalOpen = $state<string | null>(null); |
||||||
|
let deletingComments = $state(new Set<string>()); |
||||||
|
let replyingTo = $state<string | null>(null); |
||||||
|
let replyContent = $state(""); |
||||||
|
let isSubmittingReply = $state(false); |
||||||
|
let replyError = $state<string | null>(null); |
||||||
|
let replySuccess = $state<string | null>(null); |
||||||
|
|
||||||
|
// Subscribe to userStore |
||||||
|
let user = $derived($userStore); |
||||||
|
|
||||||
|
/** |
||||||
|
* Parse comment threading structure |
||||||
|
* Root comments have no 'e' tag with 'reply' marker |
||||||
|
*/ |
||||||
|
function buildThreadStructure(allComments: NDKEvent[]) { |
||||||
|
const rootComments: NDKEvent[] = []; |
||||||
|
const repliesByParent = new Map<string, NDKEvent[]>(); |
||||||
|
|
||||||
|
for (const comment of allComments) { |
||||||
|
// Check if this is a reply by looking for 'e' tags with 'reply' marker |
||||||
|
const replyTag = comment.tags.find(t => t[0] === 'e' && t[3] === 'reply'); |
||||||
|
|
||||||
|
if (replyTag) { |
||||||
|
const parentId = replyTag[1]; |
||||||
|
if (!repliesByParent.has(parentId)) { |
||||||
|
repliesByParent.set(parentId, []); |
||||||
|
} |
||||||
|
repliesByParent.get(parentId)!.push(comment); |
||||||
|
} else { |
||||||
|
// This is a root comment (no reply tag) |
||||||
|
rootComments.push(comment); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
return { rootComments, repliesByParent }; |
||||||
|
} |
||||||
|
|
||||||
|
let threadStructure = $derived(buildThreadStructure(comments)); |
||||||
|
|
||||||
|
/** |
||||||
|
* Count replies for a comment thread |
||||||
|
*/ |
||||||
|
function countReplies(commentId: string, repliesMap: Map<string, NDKEvent[]>): number { |
||||||
|
const directReplies = repliesMap.get(commentId) || []; |
||||||
|
let count = directReplies.length; |
||||||
|
|
||||||
|
// Recursively count nested replies |
||||||
|
for (const reply of directReplies) { |
||||||
|
count += countReplies(reply.id, repliesMap); |
||||||
|
} |
||||||
|
|
||||||
|
return count; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Get display name for a pubkey |
||||||
|
*/ |
||||||
|
function getDisplayName(pubkey: string): string { |
||||||
|
const profile = profiles.get(pubkey); |
||||||
|
if (profile) { |
||||||
|
return profile.displayName || profile.name || profile.pubkey || pubkey; |
||||||
|
} |
||||||
|
const npub = toNpub(pubkey) || pubkey; |
||||||
|
return `${npub.slice(0, 12)}...`; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Format timestamp |
||||||
|
*/ |
||||||
|
function formatTimestamp(timestamp: number): string { |
||||||
|
const date = new Date(timestamp * 1000); |
||||||
|
const now = new Date(); |
||||||
|
const diffMs = now.getTime() - date.getTime(); |
||||||
|
const diffMins = Math.floor(diffMs / 60000); |
||||||
|
const diffHours = Math.floor(diffMs / 3600000); |
||||||
|
const diffDays = Math.floor(diffMs / 86400000); |
||||||
|
|
||||||
|
if (diffMins < 60) { |
||||||
|
return `${diffMins}m ago`; |
||||||
|
} else if (diffHours < 24) { |
||||||
|
return `${diffHours}h ago`; |
||||||
|
} else if (diffDays < 7) { |
||||||
|
return `${diffDays}d ago`; |
||||||
|
} else { |
||||||
|
return date.toLocaleDateString(); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetch profile for a pubkey |
||||||
|
*/ |
||||||
|
async function fetchProfile(pubkey: string) { |
||||||
|
if (profiles.has(pubkey)) return; |
||||||
|
|
||||||
|
try { |
||||||
|
const npub = toNpub(pubkey); |
||||||
|
if (!npub) { |
||||||
|
setFallbackProfile(pubkey); |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
const profile = await getUserMetadata(npub, ndk, true); |
||||||
|
const newProfiles = new Map(profiles); |
||||||
|
newProfiles.set(pubkey, profile); |
||||||
|
profiles = newProfiles; |
||||||
|
} catch (err) { |
||||||
|
setFallbackProfile(pubkey); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
function setFallbackProfile(pubkey: string) { |
||||||
|
const npub = toNpub(pubkey) || pubkey; |
||||||
|
const truncated = `${npub.slice(0, 12)}...`; |
||||||
|
const fallbackProfile = { |
||||||
|
name: truncated, |
||||||
|
displayName: truncated, |
||||||
|
picture: null |
||||||
|
}; |
||||||
|
const newProfiles = new Map(profiles); |
||||||
|
newProfiles.set(pubkey, fallbackProfile); |
||||||
|
profiles = newProfiles; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Toggle thread expansion |
||||||
|
*/ |
||||||
|
function toggleThread(commentId: string) { |
||||||
|
const newExpanded = new Set(expandedThreads); |
||||||
|
if (newExpanded.has(commentId)) { |
||||||
|
newExpanded.delete(commentId); |
||||||
|
} else { |
||||||
|
newExpanded.add(commentId); |
||||||
|
} |
||||||
|
expandedThreads = newExpanded; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Render nested replies recursively |
||||||
|
*/ |
||||||
|
function renderReplies(parentId: string, repliesMap: Map<string, NDKEvent[]>, level: number = 0) { |
||||||
|
const replies = repliesMap.get(parentId) || []; |
||||||
|
return replies; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Copy nevent to clipboard |
||||||
|
*/ |
||||||
|
async function copyNevent(event: NDKEvent) { |
||||||
|
try { |
||||||
|
const nevent = nip19.neventEncode({ |
||||||
|
id: event.id, |
||||||
|
author: event.pubkey, |
||||||
|
kind: event.kind, |
||||||
|
}); |
||||||
|
await navigator.clipboard.writeText(nevent); |
||||||
|
console.log('Copied nevent to clipboard:', nevent); |
||||||
|
} catch (err) { |
||||||
|
console.error('Failed to copy nevent:', err); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Navigate to event details page |
||||||
|
*/ |
||||||
|
function viewEventDetails(comment: NDKEvent) { |
||||||
|
const nevent = nip19.neventEncode({ |
||||||
|
id: comment.id, |
||||||
|
author: comment.pubkey, |
||||||
|
kind: comment.kind, |
||||||
|
}); |
||||||
|
goto(`/events?id=${encodeURIComponent(nevent)}`); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Check if user can delete a comment |
||||||
|
*/ |
||||||
|
function canDelete(comment: NDKEvent): boolean { |
||||||
|
return canDeleteEvent(comment, ndk); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Submit a reply to a comment |
||||||
|
*/ |
||||||
|
async function submitReply(parentComment: NDKEvent) { |
||||||
|
if (!replyContent.trim()) { |
||||||
|
replyError = "Reply cannot be empty"; |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
if (!user.signedIn || !user.signer) { |
||||||
|
replyError = "You must be signed in to reply"; |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
isSubmittingReply = true; |
||||||
|
replyError = null; |
||||||
|
replySuccess = null; |
||||||
|
|
||||||
|
try { |
||||||
|
const { NDKEvent: NDKEventClass } = await import("@nostr-dev-kit/ndk"); |
||||||
|
const { activeOutboxRelays } = await import("$lib/ndk"); |
||||||
|
|
||||||
|
// Get relay hint |
||||||
|
const relays = activeOutboxRelays; |
||||||
|
let relayHint = ""; |
||||||
|
relays.subscribe((r) => { relayHint = r[0] || ""; })(); |
||||||
|
|
||||||
|
// Create reply event (kind 1111) |
||||||
|
const replyEvent = new NDKEventClass(ndk); |
||||||
|
replyEvent.kind = 1111; |
||||||
|
replyEvent.content = replyContent; |
||||||
|
|
||||||
|
// Parse section address to get root event details |
||||||
|
const rootParts = sectionAddress.split(":"); |
||||||
|
if (rootParts.length !== 3) { |
||||||
|
throw new Error("Invalid section address format"); |
||||||
|
} |
||||||
|
const [rootKindStr, rootAuthorPubkey, rootDTag] = rootParts; |
||||||
|
const rootKind = parseInt(rootKindStr); |
||||||
|
|
||||||
|
// NIP-22 reply tags structure: |
||||||
|
// - Root tags (A, K, P) point to the section/article |
||||||
|
// - Parent tags (a, k, p) point to the parent comment |
||||||
|
// - Add 'e' tag with 'reply' marker for the parent comment |
||||||
|
replyEvent.tags = [ |
||||||
|
// Root scope - uppercase tags (point to section) |
||||||
|
["A", sectionAddress, relayHint, rootAuthorPubkey], |
||||||
|
["K", rootKind.toString()], |
||||||
|
["P", rootAuthorPubkey, relayHint], |
||||||
|
|
||||||
|
// Parent scope - lowercase tags (point to parent comment) |
||||||
|
["a", `1111:${parentComment.pubkey}:`, relayHint], |
||||||
|
["k", "1111"], |
||||||
|
["p", parentComment.pubkey, relayHint], |
||||||
|
|
||||||
|
// Reply marker |
||||||
|
["e", parentComment.id, relayHint, "reply"], |
||||||
|
]; |
||||||
|
|
||||||
|
console.log("[SectionComments] Creating reply with tags:", replyEvent.tags); |
||||||
|
|
||||||
|
// Sign and publish |
||||||
|
await replyEvent.sign(); |
||||||
|
await replyEvent.publish(); |
||||||
|
|
||||||
|
console.log("[SectionComments] Reply published:", replyEvent.id); |
||||||
|
|
||||||
|
replySuccess = parentComment.id; |
||||||
|
replyContent = ""; |
||||||
|
|
||||||
|
// Close reply UI after a delay |
||||||
|
setTimeout(() => { |
||||||
|
replyingTo = null; |
||||||
|
replySuccess = null; |
||||||
|
}, 2000); |
||||||
|
|
||||||
|
} catch (err) { |
||||||
|
console.error("[SectionComments] Error submitting reply:", err); |
||||||
|
replyError = err instanceof Error ? err.message : "Failed to submit reply"; |
||||||
|
} finally { |
||||||
|
isSubmittingReply = false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Delete a comment |
||||||
|
*/ |
||||||
|
async function handleDeleteComment(comment: NDKEvent) { |
||||||
|
if (!canDelete(comment)) return; |
||||||
|
|
||||||
|
if (!confirm('Are you sure you want to delete this comment?')) { |
||||||
|
return; |
||||||
|
} |
||||||
|
|
||||||
|
const newDeleting = new Set(deletingComments); |
||||||
|
newDeleting.add(comment.id); |
||||||
|
deletingComments = newDeleting; |
||||||
|
|
||||||
|
try { |
||||||
|
const result = await deleteEvent({ |
||||||
|
eventId: comment.id, |
||||||
|
eventKind: comment.kind, |
||||||
|
reason: 'User deleted comment', |
||||||
|
}, ndk); |
||||||
|
|
||||||
|
if (result.success) { |
||||||
|
console.log('[SectionComments] Comment deleted successfully'); |
||||||
|
// Note: The comment will still show in the UI until the page is refreshed |
||||||
|
// or the parent component refetches comments |
||||||
|
} else { |
||||||
|
alert(`Failed to delete comment: ${result.error}`); |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
console.error('[SectionComments] Error deleting comment:', err); |
||||||
|
alert('Failed to delete comment'); |
||||||
|
} finally { |
||||||
|
const newDeleting = new Set(deletingComments); |
||||||
|
newDeleting.delete(comment.id); |
||||||
|
deletingComments = newDeleting; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Pre-fetch profiles for all comment authors |
||||||
|
*/ |
||||||
|
$effect(() => { |
||||||
|
const uniquePubkeys = new Set(comments.map(c => c.pubkey)); |
||||||
|
for (const pubkey of uniquePubkeys) { |
||||||
|
fetchProfile(pubkey); |
||||||
|
} |
||||||
|
}); |
||||||
|
</script> |
||||||
|
|
||||||
|
{#if visible && threadStructure.rootComments.length > 0} |
||||||
|
<div class="space-y-1"> |
||||||
|
{#each threadStructure.rootComments as rootComment (rootComment.id)} |
||||||
|
{@const replyCount = countReplies(rootComment.id, threadStructure.repliesByParent)} |
||||||
|
{@const isExpanded = expandedThreads.has(rootComment.id)} |
||||||
|
|
||||||
|
<div class="border border-gray-300 dark:border-gray-600 rounded-lg overflow-hidden bg-white dark:bg-gray-800 shadow-sm"> |
||||||
|
<!-- Multi-row collapsed view --> |
||||||
|
{#if !isExpanded} |
||||||
|
<div class="flex gap-2 px-3 py-2 text-sm"> |
||||||
|
<button |
||||||
|
class="flex-shrink-0 mt-1" |
||||||
|
onclick={() => toggleThread(rootComment.id)} |
||||||
|
aria-label="Expand comment" |
||||||
|
> |
||||||
|
<ChevronRightOutline class="w-3 h-3 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
|
||||||
|
<div class="flex-1 min-w-0"> |
||||||
|
<p class="line-clamp-3 text-gray-700 dark:text-gray-300 mb-1"> |
||||||
|
{rootComment.content} |
||||||
|
</p> |
||||||
|
<div class="flex items-center gap-2 text-xs"> |
||||||
|
<button |
||||||
|
class="text-gray-600 dark:text-gray-400 hover:text-gray-900 dark:hover:text-gray-100 transition-colors" |
||||||
|
onclick={(e) => { e.stopPropagation(); copyNevent(rootComment); }} |
||||||
|
title="Copy nevent to clipboard" |
||||||
|
> |
||||||
|
{getDisplayName(rootComment.pubkey)} |
||||||
|
</button> |
||||||
|
{#if replyCount > 0} |
||||||
|
<span class="text-gray-400 dark:text-gray-500">•</span> |
||||||
|
<span class="text-blue-600 dark:text-blue-400"> |
||||||
|
{replyCount} {replyCount === 1 ? 'reply' : 'replies'} |
||||||
|
</span> |
||||||
|
{/if} |
||||||
|
<span class="text-gray-400 dark:text-gray-500">•</span> |
||||||
|
<button |
||||||
|
class="text-blue-600 dark:text-blue-400 hover:text-blue-800 dark:hover:text-blue-300 transition-colors" |
||||||
|
onclick={(e) => { |
||||||
|
e.stopPropagation(); |
||||||
|
replyingTo = replyingTo === rootComment.id ? null : rootComment.id; |
||||||
|
replyError = null; |
||||||
|
replySuccess = null; |
||||||
|
// Auto-expand when replying from collapsed view |
||||||
|
if (!expandedThreads.has(rootComment.id)) { |
||||||
|
toggleThread(rootComment.id); |
||||||
|
} |
||||||
|
}} |
||||||
|
> |
||||||
|
Reply |
||||||
|
</button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Actions menu in collapsed view --> |
||||||
|
<div class="flex-shrink-0 mt-1"> |
||||||
|
<button |
||||||
|
id="comment-actions-collapsed-{rootComment.id}" |
||||||
|
class="p-1 hover:bg-gray-200 dark:hover:bg-gray-600 rounded transition-colors" |
||||||
|
aria-label="Comment actions" |
||||||
|
onclick={(e) => { e.stopPropagation(); }} |
||||||
|
> |
||||||
|
<DotsVerticalOutline class="w-4 h-4 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
<Popover |
||||||
|
triggeredBy="#comment-actions-collapsed-{rootComment.id}" |
||||||
|
placement="bottom-end" |
||||||
|
class="w-48 text-sm" |
||||||
|
> |
||||||
|
<ul class="space-y-1"> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={() => { |
||||||
|
detailsModalOpen = rootComment.id; |
||||||
|
}} |
||||||
|
> |
||||||
|
<EyeOutline class="w-4 h-4" /> |
||||||
|
View details |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={async () => { |
||||||
|
await copyNevent(rootComment); |
||||||
|
}} |
||||||
|
> |
||||||
|
<ClipboardCleanOutline class="w-4 h-4" /> |
||||||
|
Copy nevent |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{#if canDelete(rootComment)} |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-red-50 dark:hover:bg-red-900/20 rounded flex items-center gap-2 text-red-600 dark:text-red-400" |
||||||
|
onclick={() => { |
||||||
|
handleDeleteComment(rootComment); |
||||||
|
}} |
||||||
|
disabled={deletingComments.has(rootComment.id)} |
||||||
|
> |
||||||
|
<TrashBinOutline class="w-4 h-4" /> |
||||||
|
{deletingComments.has(rootComment.id) ? 'Deleting...' : 'Delete comment'} |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{/if} |
||||||
|
</ul> |
||||||
|
</Popover> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{:else} |
||||||
|
<!-- Expanded view --> |
||||||
|
<div class="flex flex-col"> |
||||||
|
<!-- Expanded header row --> |
||||||
|
<div class="flex items-center gap-2 px-3 py-2 text-sm border-b border-gray-200 dark:border-gray-700"> |
||||||
|
<button |
||||||
|
class="flex-shrink-0" |
||||||
|
onclick={() => toggleThread(rootComment.id)} |
||||||
|
aria-label="Collapse comment" |
||||||
|
> |
||||||
|
<ChevronDownOutline class="w-3 h-3 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
|
||||||
|
<button |
||||||
|
class="flex-shrink-0 font-medium text-gray-900 dark:text-gray-100 hover:text-gray-600 dark:hover:text-gray-400 transition-colors" |
||||||
|
onclick={(e) => { e.stopPropagation(); copyNevent(rootComment); }} |
||||||
|
title="Copy nevent to clipboard" |
||||||
|
> |
||||||
|
{getDisplayName(rootComment.pubkey)} |
||||||
|
</button> |
||||||
|
|
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
{formatTimestamp(rootComment.created_at || 0)} |
||||||
|
</span> |
||||||
|
|
||||||
|
{#if replyCount > 0} |
||||||
|
<span class="text-xs text-blue-600 dark:text-blue-400"> |
||||||
|
{replyCount} {replyCount === 1 ? 'reply' : 'replies'} |
||||||
|
</span> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<!-- Actions menu --> |
||||||
|
<div class="ml-auto"> |
||||||
|
<button |
||||||
|
id="comment-actions-{rootComment.id}" |
||||||
|
class="p-1 hover:bg-gray-200 dark:hover:bg-gray-600 rounded transition-colors" |
||||||
|
aria-label="Comment actions" |
||||||
|
> |
||||||
|
<DotsVerticalOutline class="w-4 h-4 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
<Popover |
||||||
|
triggeredBy="#comment-actions-{rootComment.id}" |
||||||
|
placement="bottom-end" |
||||||
|
class="w-48 text-sm" |
||||||
|
> |
||||||
|
<ul class="space-y-1"> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={() => { |
||||||
|
detailsModalOpen = rootComment.id; |
||||||
|
}} |
||||||
|
> |
||||||
|
<EyeOutline class="w-4 h-4" /> |
||||||
|
View details |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={async () => { |
||||||
|
await copyNevent(rootComment); |
||||||
|
}} |
||||||
|
> |
||||||
|
<ClipboardCleanOutline class="w-4 h-4" /> |
||||||
|
Copy nevent |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{#if canDelete(rootComment)} |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-red-50 dark:hover:bg-red-900/20 rounded flex items-center gap-2 text-red-600 dark:text-red-400" |
||||||
|
onclick={() => { |
||||||
|
handleDeleteComment(rootComment); |
||||||
|
}} |
||||||
|
disabled={deletingComments.has(rootComment.id)} |
||||||
|
> |
||||||
|
<TrashBinOutline class="w-4 h-4" /> |
||||||
|
{deletingComments.has(rootComment.id) ? 'Deleting...' : 'Delete comment'} |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{/if} |
||||||
|
</ul> |
||||||
|
</Popover> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Full content --> |
||||||
|
<div class="px-3 py-3"> |
||||||
|
<div class="text-sm text-gray-700 dark:text-gray-300 prose prose-sm dark:prose-invert max-w-none mb-3"> |
||||||
|
{@render basicMarkup(rootComment.content)} |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply button --> |
||||||
|
<div class="mb-3"> |
||||||
|
<Button |
||||||
|
size="xs" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = replyingTo === rootComment.id ? null : rootComment.id; |
||||||
|
replyError = null; |
||||||
|
replySuccess = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
{replyingTo === rootComment.id ? 'Cancel Reply' : 'Reply'} |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply UI --> |
||||||
|
{#if replyingTo === rootComment.id} |
||||||
|
<div class="mb-4 border border-gray-300 dark:border-gray-600 rounded-lg p-3 bg-gray-50 dark:bg-gray-700"> |
||||||
|
<Textarea |
||||||
|
bind:value={replyContent} |
||||||
|
placeholder="Write your reply..." |
||||||
|
rows={3} |
||||||
|
disabled={isSubmittingReply} |
||||||
|
class="mb-2" |
||||||
|
/> |
||||||
|
|
||||||
|
{#if replyError} |
||||||
|
<P class="text-red-600 dark:text-red-400 text-sm mb-2">{replyError}</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if replySuccess === rootComment.id} |
||||||
|
<P class="text-green-600 dark:text-green-400 text-sm mb-2">Reply posted successfully!</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<div class="flex gap-2"> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
onclick={() => submitReply(rootComment)} |
||||||
|
disabled={isSubmittingReply || !replyContent.trim()} |
||||||
|
> |
||||||
|
{isSubmittingReply ? 'Posting...' : 'Post Reply'} |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = null; |
||||||
|
replyContent = ""; |
||||||
|
replyError = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
Cancel |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<!-- Replies --> |
||||||
|
{#if replyCount > 0} |
||||||
|
<div class="pl-4 border-l-2 border-gray-200 dark:border-gray-600 space-y-2"> |
||||||
|
{#each renderReplies(rootComment.id, threadStructure.repliesByParent) as reply (reply.id)} |
||||||
|
<div class="bg-gray-50 dark:bg-gray-700/30 rounded p-3"> |
||||||
|
<div class="flex items-center gap-2 mb-2"> |
||||||
|
<button |
||||||
|
class="text-sm font-medium text-gray-900 dark:text-gray-100 hover:text-gray-600 dark:hover:text-gray-400 transition-colors" |
||||||
|
onclick={(e) => { e.stopPropagation(); copyNevent(reply); }} |
||||||
|
title="Copy nevent to clipboard" |
||||||
|
> |
||||||
|
{getDisplayName(reply.pubkey)} |
||||||
|
</button> |
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
{formatTimestamp(reply.created_at || 0)} |
||||||
|
</span> |
||||||
|
|
||||||
|
<!-- Three-dot menu for reply --> |
||||||
|
<div class="ml-auto flex items-center gap-2"> |
||||||
|
<button |
||||||
|
id="reply-actions-{reply.id}" |
||||||
|
class="p-1 hover:bg-gray-200 dark:hover:bg-gray-600 rounded transition-colors" |
||||||
|
aria-label="Reply actions" |
||||||
|
onclick={(e) => { e.stopPropagation(); }} |
||||||
|
> |
||||||
|
<DotsVerticalOutline class="w-3 h-3 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
<Popover |
||||||
|
triggeredBy="#reply-actions-{reply.id}" |
||||||
|
placement="bottom-end" |
||||||
|
class="w-48 text-sm" |
||||||
|
> |
||||||
|
<ul class="space-y-1"> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={() => { |
||||||
|
detailsModalOpen = reply.id; |
||||||
|
}} |
||||||
|
> |
||||||
|
<EyeOutline class="w-4 h-4" /> |
||||||
|
View details |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={async () => { |
||||||
|
await copyNevent(reply); |
||||||
|
}} |
||||||
|
> |
||||||
|
<ClipboardCleanOutline class="w-4 h-4" /> |
||||||
|
Copy nevent |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{#if canDelete(reply)} |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-red-50 dark:hover:bg-red-900/20 rounded flex items-center gap-2 text-red-600 dark:text-red-400" |
||||||
|
onclick={() => { |
||||||
|
handleDeleteComment(reply); |
||||||
|
}} |
||||||
|
disabled={deletingComments.has(reply.id)} |
||||||
|
> |
||||||
|
<TrashBinOutline class="w-4 h-4" /> |
||||||
|
{deletingComments.has(reply.id) ? 'Deleting...' : 'Delete comment'} |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{/if} |
||||||
|
</ul> |
||||||
|
</Popover> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
<div class="text-sm text-gray-700 dark:text-gray-300 prose prose-sm dark:prose-invert max-w-none mb-2"> |
||||||
|
{@render basicMarkup(reply.content)} |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply button for first-level reply --> |
||||||
|
<div class="mb-2"> |
||||||
|
<Button |
||||||
|
size="xs" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = replyingTo === reply.id ? null : reply.id; |
||||||
|
replyError = null; |
||||||
|
replySuccess = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
{replyingTo === reply.id ? 'Cancel Reply' : 'Reply'} |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply UI for first-level reply --> |
||||||
|
{#if replyingTo === reply.id} |
||||||
|
<div class="mb-3 border border-gray-300 dark:border-gray-600 rounded-lg p-3 bg-white dark:bg-gray-800"> |
||||||
|
<Textarea |
||||||
|
bind:value={replyContent} |
||||||
|
placeholder="Write your reply..." |
||||||
|
rows={3} |
||||||
|
disabled={isSubmittingReply} |
||||||
|
class="mb-2" |
||||||
|
/> |
||||||
|
|
||||||
|
{#if replyError} |
||||||
|
<P class="text-red-600 dark:text-red-400 text-sm mb-2">{replyError}</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if replySuccess === reply.id} |
||||||
|
<P class="text-green-600 dark:text-green-400 text-sm mb-2">Reply posted successfully!</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<div class="flex gap-2"> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
onclick={() => submitReply(reply)} |
||||||
|
disabled={isSubmittingReply || !replyContent.trim()} |
||||||
|
> |
||||||
|
{isSubmittingReply ? 'Posting...' : 'Post Reply'} |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
size="sm" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = null; |
||||||
|
replyContent = ""; |
||||||
|
replyError = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
Cancel |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<!-- Nested replies (one level deep) --> |
||||||
|
{#each renderReplies(reply.id, threadStructure.repliesByParent) as nestedReply (nestedReply.id)} |
||||||
|
<div class="ml-4 mt-2 bg-gray-100 dark:bg-gray-600/30 rounded p-2"> |
||||||
|
<div class="flex items-center gap-2 mb-1"> |
||||||
|
<button |
||||||
|
class="text-xs font-medium text-gray-900 dark:text-gray-100 hover:text-gray-600 dark:hover:text-gray-400 transition-colors" |
||||||
|
onclick={(e) => { e.stopPropagation(); copyNevent(nestedReply); }} |
||||||
|
title="Copy nevent to clipboard" |
||||||
|
> |
||||||
|
{getDisplayName(nestedReply.pubkey)} |
||||||
|
</button> |
||||||
|
<span class="text-xs text-gray-500 dark:text-gray-400"> |
||||||
|
{formatTimestamp(nestedReply.created_at || 0)} |
||||||
|
</span> |
||||||
|
|
||||||
|
<!-- Three-dot menu for nested reply --> |
||||||
|
<div class="ml-auto flex items-center gap-2"> |
||||||
|
<button |
||||||
|
id="nested-reply-actions-{nestedReply.id}" |
||||||
|
class="p-1 hover:bg-gray-200 dark:hover:bg-gray-600 rounded transition-colors" |
||||||
|
aria-label="Nested reply actions" |
||||||
|
onclick={(e) => { e.stopPropagation(); }} |
||||||
|
> |
||||||
|
<DotsVerticalOutline class="w-3 h-3 text-gray-600 dark:text-gray-400" /> |
||||||
|
</button> |
||||||
|
<Popover |
||||||
|
triggeredBy="#nested-reply-actions-{nestedReply.id}" |
||||||
|
placement="bottom-end" |
||||||
|
class="w-48 text-sm" |
||||||
|
> |
||||||
|
<ul class="space-y-1"> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={() => { |
||||||
|
detailsModalOpen = nestedReply.id; |
||||||
|
}} |
||||||
|
> |
||||||
|
<EyeOutline class="w-4 h-4" /> |
||||||
|
View details |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-gray-100 dark:hover:bg-gray-700 rounded flex items-center gap-2" |
||||||
|
onclick={async () => { |
||||||
|
await copyNevent(nestedReply); |
||||||
|
}} |
||||||
|
> |
||||||
|
<ClipboardCleanOutline class="w-4 h-4" /> |
||||||
|
Copy nevent |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{#if canDelete(nestedReply)} |
||||||
|
<li> |
||||||
|
<button |
||||||
|
class="w-full text-left px-3 py-2 hover:bg-red-50 dark:hover:bg-red-900/20 rounded flex items-center gap-2 text-red-600 dark:text-red-400" |
||||||
|
onclick={() => { |
||||||
|
handleDeleteComment(nestedReply); |
||||||
|
}} |
||||||
|
disabled={deletingComments.has(nestedReply.id)} |
||||||
|
> |
||||||
|
<TrashBinOutline class="w-4 h-4" /> |
||||||
|
{deletingComments.has(nestedReply.id) ? 'Deleting...' : 'Delete comment'} |
||||||
|
</button> |
||||||
|
</li> |
||||||
|
{/if} |
||||||
|
</ul> |
||||||
|
</Popover> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
<div class="text-xs text-gray-700 dark:text-gray-300 mb-2"> |
||||||
|
{@render basicMarkup(nestedReply.content)} |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply button for nested reply --> |
||||||
|
<div class="mb-1"> |
||||||
|
<Button |
||||||
|
size="xs" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = replyingTo === nestedReply.id ? null : nestedReply.id; |
||||||
|
replyError = null; |
||||||
|
replySuccess = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
{replyingTo === nestedReply.id ? 'Cancel Reply' : 'Reply'} |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
|
||||||
|
<!-- Reply UI for nested reply --> |
||||||
|
{#if replyingTo === nestedReply.id} |
||||||
|
<div class="mb-2 border border-gray-300 dark:border-gray-600 rounded-lg p-2 bg-white dark:bg-gray-800"> |
||||||
|
<Textarea |
||||||
|
bind:value={replyContent} |
||||||
|
placeholder="Write your reply..." |
||||||
|
rows={2} |
||||||
|
disabled={isSubmittingReply} |
||||||
|
class="mb-2 text-xs" |
||||||
|
/> |
||||||
|
|
||||||
|
{#if replyError} |
||||||
|
<P class="text-red-600 dark:text-red-400 text-xs mb-1">{replyError}</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
{#if replySuccess === nestedReply.id} |
||||||
|
<P class="text-green-600 dark:text-green-400 text-xs mb-1">Reply posted successfully!</P> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<div class="flex gap-2"> |
||||||
|
<Button |
||||||
|
size="xs" |
||||||
|
onclick={() => submitReply(nestedReply)} |
||||||
|
disabled={isSubmittingReply || !replyContent.trim()} |
||||||
|
> |
||||||
|
{isSubmittingReply ? 'Posting...' : 'Post Reply'} |
||||||
|
</Button> |
||||||
|
<Button |
||||||
|
size="xs" |
||||||
|
color="light" |
||||||
|
onclick={() => { |
||||||
|
replyingTo = null; |
||||||
|
replyContent = ""; |
||||||
|
replyError = null; |
||||||
|
}} |
||||||
|
> |
||||||
|
Cancel |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
</div> |
||||||
|
{/each} |
||||||
|
</div> |
||||||
|
{/if} |
||||||
|
|
||||||
|
<!-- Details Modal --> |
||||||
|
{#if detailsModalOpen} |
||||||
|
{@const comment = comments.find(c => c.id === detailsModalOpen)} |
||||||
|
{#if comment} |
||||||
|
<Modal |
||||||
|
title="Comment Details" |
||||||
|
open={true} |
||||||
|
autoclose |
||||||
|
outsideclose |
||||||
|
size="lg" |
||||||
|
class="modal-leather" |
||||||
|
onclose={() => detailsModalOpen = null} |
||||||
|
> |
||||||
|
<div class="space-y-4"> |
||||||
|
<div class="flex justify-center pb-2"> |
||||||
|
<Button |
||||||
|
color="primary" |
||||||
|
onclick={() => { |
||||||
|
viewEventDetails(comment); |
||||||
|
}} |
||||||
|
> |
||||||
|
View on Event Page |
||||||
|
</Button> |
||||||
|
</div> |
||||||
|
<div> |
||||||
|
<pre class="text-xs bg-gray-100 dark:bg-gray-800 p-3 rounded overflow-x-auto max-h-[500px] overflow-y-auto">{JSON.stringify({ |
||||||
|
id: comment.id, |
||||||
|
pubkey: comment.pubkey, |
||||||
|
created_at: comment.created_at, |
||||||
|
kind: comment.kind, |
||||||
|
tags: comment.tags, |
||||||
|
content: comment.content, |
||||||
|
sig: comment.sig |
||||||
|
}, null, 2)}</pre> |
||||||
|
</div> |
||||||
|
</div> |
||||||
|
</Modal> |
||||||
|
{/if} |
||||||
|
{/if} |
||||||
|
|
||||||
|
<style> |
||||||
|
/* Ensure proper text wrapping */ |
||||||
|
.prose { |
||||||
|
word-wrap: break-word; |
||||||
|
overflow-wrap: break-word; |
||||||
|
} |
||||||
|
</style> |
||||||
@ -0,0 +1,119 @@ |
|||||||
|
import NDK, { NDKEvent, NDKRelaySet } from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
export interface DeletionOptions { |
||||||
|
eventId?: string; |
||||||
|
eventAddress?: string; |
||||||
|
eventKind?: number; |
||||||
|
reason?: string; |
||||||
|
onSuccess?: (deletionEventId: string) => void; |
||||||
|
onError?: (error: string) => void; |
||||||
|
} |
||||||
|
|
||||||
|
export interface DeletionResult { |
||||||
|
success: boolean; |
||||||
|
deletionEventId?: string; |
||||||
|
error?: string; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Deletes a Nostr event by publishing a kind 5 deletion request (NIP-09) |
||||||
|
* @param options - Deletion options |
||||||
|
* @param ndk - NDK instance |
||||||
|
* @returns Promise resolving to deletion result |
||||||
|
*/ |
||||||
|
export async function deleteEvent( |
||||||
|
options: DeletionOptions, |
||||||
|
ndk: NDK, |
||||||
|
): Promise<DeletionResult> { |
||||||
|
const { eventId, eventAddress, eventKind, reason = "", onSuccess, onError } = |
||||||
|
options; |
||||||
|
|
||||||
|
if (!eventId && !eventAddress) { |
||||||
|
const error = "Either eventId or eventAddress must be provided"; |
||||||
|
onError?.(error); |
||||||
|
return { success: false, error }; |
||||||
|
} |
||||||
|
|
||||||
|
if (!ndk?.activeUser) { |
||||||
|
const error = "Please log in first"; |
||||||
|
onError?.(error); |
||||||
|
return { success: false, error }; |
||||||
|
} |
||||||
|
|
||||||
|
try { |
||||||
|
// Create deletion event (kind 5)
|
||||||
|
const deletionEvent = new NDKEvent(ndk); |
||||||
|
deletionEvent.kind = 5; |
||||||
|
deletionEvent.created_at = Math.floor(Date.now() / 1000); |
||||||
|
deletionEvent.content = reason; |
||||||
|
deletionEvent.pubkey = ndk.activeUser.pubkey; |
||||||
|
|
||||||
|
// Build tags based on what we have
|
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (eventId) { |
||||||
|
// Add 'e' tag for event ID
|
||||||
|
tags.push(["e", eventId]); |
||||||
|
} |
||||||
|
|
||||||
|
if (eventAddress) { |
||||||
|
// Add 'a' tag for replaceable event address
|
||||||
|
tags.push(["a", eventAddress]); |
||||||
|
} |
||||||
|
|
||||||
|
if (eventKind) { |
||||||
|
// Add 'k' tag for event kind (recommended by NIP-09)
|
||||||
|
tags.push(["k", eventKind.toString()]); |
||||||
|
} |
||||||
|
|
||||||
|
deletionEvent.tags = tags; |
||||||
|
|
||||||
|
// Sign the deletion event
|
||||||
|
await deletionEvent.sign(); |
||||||
|
|
||||||
|
// Publish to all available relays
|
||||||
|
const allRelayUrls = Array.from(ndk.pool?.relays.values() || []).map( |
||||||
|
(r) => r.url, |
||||||
|
); |
||||||
|
|
||||||
|
if (allRelayUrls.length === 0) { |
||||||
|
throw new Error("No relays available in NDK pool"); |
||||||
|
} |
||||||
|
|
||||||
|
const relaySet = NDKRelaySet.fromRelayUrls(allRelayUrls, ndk); |
||||||
|
const publishedToRelays = await deletionEvent.publish(relaySet); |
||||||
|
|
||||||
|
if (publishedToRelays.size > 0) { |
||||||
|
console.log( |
||||||
|
`[deletion.ts] Published deletion request to ${publishedToRelays.size} relays`, |
||||||
|
); |
||||||
|
const result = { success: true, deletionEventId: deletionEvent.id }; |
||||||
|
onSuccess?.(deletionEvent.id); |
||||||
|
return result; |
||||||
|
} else { |
||||||
|
throw new Error("Failed to publish deletion request to any relays"); |
||||||
|
} |
||||||
|
} catch (error) { |
||||||
|
const errorMessage = error instanceof Error |
||||||
|
? error.message |
||||||
|
: "Unknown error"; |
||||||
|
console.error(`[deletion.ts] Error deleting event: ${errorMessage}`); |
||||||
|
onError?.(errorMessage); |
||||||
|
return { success: false, error: errorMessage }; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Checks if the current user has permission to delete an event |
||||||
|
* @param event - The event to check |
||||||
|
* @param ndk - NDK instance |
||||||
|
* @returns True if the user can delete the event |
||||||
|
*/ |
||||||
|
export function canDeleteEvent(event: NDKEvent | null, ndk: NDK): boolean { |
||||||
|
if (!event || !ndk?.activeUser) { |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
// User can only delete their own events
|
||||||
|
return event.pubkey === ndk.activeUser.pubkey; |
||||||
|
} |
||||||
@ -0,0 +1,380 @@ |
|||||||
|
/** |
||||||
|
* AST-based AsciiDoc parsing using Asciidoctor's native document structure |
||||||
|
* |
||||||
|
* This replaces the manual regex parsing in asciidoc_metadata.ts with proper |
||||||
|
* AST traversal, leveraging Asciidoctor's built-in parsing capabilities. |
||||||
|
*/ |
||||||
|
|
||||||
|
import Processor from "asciidoctor"; |
||||||
|
import type { Document } from "asciidoctor"; |
||||||
|
import { PublicationTree } from "../data_structures/publication_tree"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { getMimeTags } from "./mime"; |
||||||
|
|
||||||
|
export interface ASTSection { |
||||||
|
title: string; |
||||||
|
content: string; |
||||||
|
level: number; |
||||||
|
attributes: Record<string, string>; |
||||||
|
subsections: ASTSection[]; |
||||||
|
} |
||||||
|
|
||||||
|
export interface ASTParsedDocument { |
||||||
|
title: string; |
||||||
|
content: string; |
||||||
|
attributes: Record<string, string>; |
||||||
|
sections: ASTSection[]; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Parse AsciiDoc content using Asciidoctor's AST instead of manual regex |
||||||
|
*/ |
||||||
|
export function parseAsciiDocAST( |
||||||
|
content: string, |
||||||
|
parseLevel: number = 2, |
||||||
|
): ASTParsedDocument { |
||||||
|
const asciidoctor = Processor(); |
||||||
|
const document = asciidoctor.load(content, { standalone: false }) as Document; |
||||||
|
|
||||||
|
return { |
||||||
|
title: document.getTitle() || "", |
||||||
|
content: document.getContent() || "", |
||||||
|
attributes: document.getAttributes(), |
||||||
|
sections: extractSectionsFromAST(document, parseLevel), |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Extract sections from Asciidoctor AST based on parse level |
||||||
|
*/ |
||||||
|
function extractSectionsFromAST( |
||||||
|
document: Document, |
||||||
|
parseLevel: number, |
||||||
|
): ASTSection[] { |
||||||
|
const directSections = document.getSections(); |
||||||
|
|
||||||
|
// Collect all sections at all levels up to parseLevel
|
||||||
|
const allSections: ASTSection[] = []; |
||||||
|
|
||||||
|
function collectSections(sections: any[]) { |
||||||
|
for (const section of sections) { |
||||||
|
const asciidoctorLevel = section.getLevel(); |
||||||
|
// Convert Asciidoctor's internal level to our application level
|
||||||
|
// Asciidoctor: == is level 1, === is level 2, etc.
|
||||||
|
// Our app: == is level 2, === is level 3, etc.
|
||||||
|
const appLevel = asciidoctorLevel + 1; |
||||||
|
|
||||||
|
if (appLevel <= parseLevel) { |
||||||
|
allSections.push({ |
||||||
|
title: section.getTitle() || "", |
||||||
|
content: section.getContent() || "", |
||||||
|
level: appLevel, |
||||||
|
attributes: section.getAttributes() || {}, |
||||||
|
subsections: [], |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
// Recursively collect subsections
|
||||||
|
const subsections = section.getSections?.() || []; |
||||||
|
if (subsections.length > 0) { |
||||||
|
collectSections(subsections); |
||||||
|
} |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
collectSections(directSections); |
||||||
|
|
||||||
|
return allSections; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Extract subsections from a section (recursive helper) |
||||||
|
*/ |
||||||
|
function extractSubsections(section: any, parseLevel: number): ASTSection[] { |
||||||
|
const subsections = section.getSections?.() || []; |
||||||
|
|
||||||
|
return subsections |
||||||
|
.filter((sub: any) => (sub.getLevel() + 1) <= parseLevel) |
||||||
|
.map((sub: any) => ({ |
||||||
|
title: sub.getTitle() || "", |
||||||
|
content: sub.getContent() || "", |
||||||
|
level: sub.getLevel() + 1, // Convert to app level
|
||||||
|
attributes: sub.getAttributes() || {}, |
||||||
|
subsections: extractSubsections(sub, parseLevel), |
||||||
|
})); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a PublicationTree directly from Asciidoctor AST |
||||||
|
* This integrates with Michael's PublicationTree architecture |
||||||
|
*/ |
||||||
|
export async function createPublicationTreeFromAST( |
||||||
|
content: string, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number = 2, |
||||||
|
): Promise<PublicationTree> { |
||||||
|
const parsed = parseAsciiDocAST(content, parseLevel); |
||||||
|
|
||||||
|
// Create root 30040 index event from document metadata
|
||||||
|
const rootEvent = createIndexEventFromAST(parsed, ndk); |
||||||
|
const tree = new PublicationTree(rootEvent, ndk); |
||||||
|
|
||||||
|
// Add sections as 30041 events with proper namespacing
|
||||||
|
for (const section of parsed.sections) { |
||||||
|
const contentEvent = createContentEventFromSection( |
||||||
|
section, |
||||||
|
ndk, |
||||||
|
parsed.title, |
||||||
|
); |
||||||
|
await tree.addEvent(contentEvent, rootEvent); |
||||||
|
} |
||||||
|
|
||||||
|
return tree; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a 30040 index event from AST document metadata |
||||||
|
*/ |
||||||
|
function createIndexEventFromAST( |
||||||
|
parsed: ASTParsedDocument, |
||||||
|
ndk: NDK, |
||||||
|
): NDKEvent { |
||||||
|
const event = new NDKEvent(ndk); |
||||||
|
event.kind = 30040; |
||||||
|
event.created_at = Math.floor(Date.now() / 1000); |
||||||
|
|
||||||
|
// Generate d-tag from title
|
||||||
|
const dTag = generateDTag(parsed.title); |
||||||
|
const [mTag, MTag] = getMimeTags(30040); |
||||||
|
|
||||||
|
const tags: string[][] = [ |
||||||
|
["d", dTag], |
||||||
|
mTag, |
||||||
|
MTag, |
||||||
|
["title", parsed.title], |
||||||
|
]; |
||||||
|
|
||||||
|
// Add document attributes as tags
|
||||||
|
addAttributesAsTags(tags, parsed.attributes); |
||||||
|
|
||||||
|
// Generate publication abbreviation for namespacing sections
|
||||||
|
const pubAbbrev = generateTitleAbbreviation(parsed.title); |
||||||
|
|
||||||
|
// Add a-tags for each section (30041 content events)
|
||||||
|
// Using new format: kind:pubkey:{abbv}-{section-d-tag}
|
||||||
|
parsed.sections.forEach((section) => { |
||||||
|
const sectionDTag = generateDTag(section.title); |
||||||
|
const namespacedDTag = `${pubAbbrev}-${sectionDTag}`; |
||||||
|
tags.push([ |
||||||
|
"a", |
||||||
|
`30041:${ndk.activeUser?.pubkey || "pubkey"}:${namespacedDTag}`, |
||||||
|
]); |
||||||
|
}); |
||||||
|
|
||||||
|
event.tags = tags; |
||||||
|
event.content = parsed.content; |
||||||
|
|
||||||
|
return event; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a 30041 content event from an AST section |
||||||
|
* Note: This function needs the publication title for proper namespacing |
||||||
|
* but the current implementation doesn't have access to it. |
||||||
|
* Consider using createPublicationTreeFromAST instead which handles this correctly. |
||||||
|
*/ |
||||||
|
function createContentEventFromSection( |
||||||
|
section: ASTSection, |
||||||
|
ndk: NDK, |
||||||
|
publicationTitle?: string, |
||||||
|
): NDKEvent { |
||||||
|
const event = new NDKEvent(ndk); |
||||||
|
event.kind = 30041; |
||||||
|
event.created_at = Math.floor(Date.now() / 1000); |
||||||
|
|
||||||
|
// Generate namespaced d-tag if publication title is provided
|
||||||
|
const sectionDTag = generateDTag(section.title); |
||||||
|
let dTag = sectionDTag; |
||||||
|
|
||||||
|
if (publicationTitle) { |
||||||
|
const pubAbbrev = generateTitleAbbreviation(publicationTitle); |
||||||
|
dTag = `${pubAbbrev}-${sectionDTag}`; |
||||||
|
} |
||||||
|
|
||||||
|
const [mTag, MTag] = getMimeTags(30041); |
||||||
|
|
||||||
|
const tags: string[][] = [ |
||||||
|
["d", dTag], |
||||||
|
mTag, |
||||||
|
MTag, |
||||||
|
["title", section.title], |
||||||
|
]; |
||||||
|
|
||||||
|
// Add section attributes as tags
|
||||||
|
addAttributesAsTags(tags, section.attributes); |
||||||
|
|
||||||
|
event.tags = tags; |
||||||
|
event.content = section.content; |
||||||
|
|
||||||
|
return event; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate a deterministic d-tag from title |
||||||
|
*/ |
||||||
|
function generateDTag(title: string): string { |
||||||
|
return title |
||||||
|
.toLowerCase() |
||||||
|
.replace(/[^\p{L}\p{N}]/gu, "-") |
||||||
|
.replace(/-+/g, "-") |
||||||
|
.replace(/^-|-$/g, ""); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate title abbreviation from first letters of each word |
||||||
|
* Used for namespacing section a-tags |
||||||
|
* @param title - The publication title |
||||||
|
* @returns Abbreviation string (e.g., "My Test Article" → "mta") |
||||||
|
*/ |
||||||
|
function generateTitleAbbreviation(title: string): string { |
||||||
|
if (!title || !title.trim()) { |
||||||
|
return "u"; // "untitled"
|
||||||
|
} |
||||||
|
|
||||||
|
// Split on non-alphanumeric characters and filter out empty strings
|
||||||
|
const words = title |
||||||
|
.split(/[^\p{L}\p{N}]+/u) |
||||||
|
.filter((word) => word.length > 0); |
||||||
|
|
||||||
|
if (words.length === 0) { |
||||||
|
return "u"; |
||||||
|
} |
||||||
|
|
||||||
|
// Take first letter of each word and join
|
||||||
|
return words |
||||||
|
.map((word) => word.charAt(0).toLowerCase()) |
||||||
|
.join(""); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Add AsciiDoc attributes as Nostr event tags, filtering out system attributes |
||||||
|
*/ |
||||||
|
function addAttributesAsTags( |
||||||
|
tags: string[][], |
||||||
|
attributes: Record<string, string>, |
||||||
|
) { |
||||||
|
const systemAttributes = [ |
||||||
|
"attribute-undefined", |
||||||
|
"attribute-missing", |
||||||
|
"appendix-caption", |
||||||
|
"appendix-refsig", |
||||||
|
"caution-caption", |
||||||
|
"chapter-refsig", |
||||||
|
"example-caption", |
||||||
|
"figure-caption", |
||||||
|
"important-caption", |
||||||
|
"last-update-label", |
||||||
|
"manname-title", |
||||||
|
"note-caption", |
||||||
|
"part-refsig", |
||||||
|
"preface-title", |
||||||
|
"section-refsig", |
||||||
|
"table-caption", |
||||||
|
"tip-caption", |
||||||
|
"toc-title", |
||||||
|
"untitled-label", |
||||||
|
"version-label", |
||||||
|
"warning-caption", |
||||||
|
"asciidoctor", |
||||||
|
"asciidoctor-version", |
||||||
|
"safe-mode-name", |
||||||
|
"backend", |
||||||
|
"doctype", |
||||||
|
"basebackend", |
||||||
|
"filetype", |
||||||
|
"outfilesuffix", |
||||||
|
"stylesdir", |
||||||
|
"iconsdir", |
||||||
|
"localdate", |
||||||
|
"localyear", |
||||||
|
"localtime", |
||||||
|
"localdatetime", |
||||||
|
"docdate", |
||||||
|
"docyear", |
||||||
|
"doctime", |
||||||
|
"docdatetime", |
||||||
|
"doctitle", |
||||||
|
"embedded", |
||||||
|
"notitle", |
||||||
|
]; |
||||||
|
|
||||||
|
// Add standard metadata tags
|
||||||
|
if (attributes.author) tags.push(["author", attributes.author]); |
||||||
|
if (attributes.version) tags.push(["version", attributes.version]); |
||||||
|
if (attributes.description) tags.push(["summary", attributes.description]); |
||||||
|
if (attributes.tags) { |
||||||
|
attributes.tags.split(",").forEach((tag) => tags.push(["t", tag.trim()])); |
||||||
|
} |
||||||
|
|
||||||
|
// Add custom attributes (non-system)
|
||||||
|
Object.entries(attributes).forEach(([key, value]) => { |
||||||
|
if (!systemAttributes.includes(key) && value) { |
||||||
|
tags.push([key, value]); |
||||||
|
} |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Tree processor extension for Asciidoctor |
||||||
|
* This can be registered to automatically populate PublicationTree during parsing |
||||||
|
*/ |
||||||
|
export function createPublicationTreeProcessor( |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number = 2, |
||||||
|
) { |
||||||
|
return function (extensions: any) { |
||||||
|
extensions.treeProcessor(function (this: any) { |
||||||
|
const dsl = this; |
||||||
|
dsl.process(function (this: any, document: Document) { |
||||||
|
// Create PublicationTree and store on document for later retrieval
|
||||||
|
const publicationTree = createPublicationTreeFromDocument( |
||||||
|
document, |
||||||
|
ndk, |
||||||
|
parseLevel, |
||||||
|
); |
||||||
|
document.setAttribute("publicationTree", publicationTree); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Helper function to create PublicationTree from Asciidoctor Document |
||||||
|
*/ |
||||||
|
async function createPublicationTreeFromDocument( |
||||||
|
document: Document, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number, |
||||||
|
): Promise<PublicationTree> { |
||||||
|
const parsed: ASTParsedDocument = { |
||||||
|
title: document.getTitle() || "", |
||||||
|
content: document.getContent() || "", |
||||||
|
attributes: document.getAttributes(), |
||||||
|
sections: extractSectionsFromAST(document, parseLevel), |
||||||
|
}; |
||||||
|
|
||||||
|
const rootEvent = createIndexEventFromAST(parsed, ndk); |
||||||
|
const tree = new PublicationTree(rootEvent, ndk); |
||||||
|
|
||||||
|
for (const section of parsed.sections) { |
||||||
|
const contentEvent = createContentEventFromSection( |
||||||
|
section, |
||||||
|
ndk, |
||||||
|
parsed.title, |
||||||
|
); |
||||||
|
await tree.addEvent(contentEvent, rootEvent); |
||||||
|
} |
||||||
|
|
||||||
|
return tree; |
||||||
|
} |
||||||
@ -0,0 +1,576 @@ |
|||||||
|
/** |
||||||
|
* AsciiDoc Content Parsing Service |
||||||
|
* |
||||||
|
* Handles parsing AsciiDoc content into hierarchical structures for publication. |
||||||
|
* Separated from metadata extraction to maintain single responsibility principle. |
||||||
|
*/ |
||||||
|
|
||||||
|
// @ts-ignore
|
||||||
|
import Processor from "asciidoctor"; |
||||||
|
import type { Document } from "asciidoctor"; |
||||||
|
import { |
||||||
|
extractDocumentMetadata, |
||||||
|
extractSectionMetadata, |
||||||
|
parseSimpleAttributes, |
||||||
|
} from "./asciidoc_metadata.ts"; |
||||||
|
|
||||||
|
export interface ParsedAsciiDoc { |
||||||
|
metadata: { |
||||||
|
title?: string; |
||||||
|
authors?: string[]; |
||||||
|
version?: string; |
||||||
|
edition?: string; |
||||||
|
publicationDate?: string; |
||||||
|
publisher?: string; |
||||||
|
summary?: string; |
||||||
|
coverImage?: string; |
||||||
|
isbn?: string; |
||||||
|
tags?: string[]; |
||||||
|
source?: string; |
||||||
|
publishedBy?: string; |
||||||
|
type?: string; |
||||||
|
autoUpdate?: "yes" | "ask" | "no"; |
||||||
|
customAttributes?: Record<string, string>; |
||||||
|
}; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
sections: Array<{ |
||||||
|
metadata: { |
||||||
|
title?: string; |
||||||
|
authors?: string[]; |
||||||
|
version?: string; |
||||||
|
edition?: string; |
||||||
|
publicationDate?: string; |
||||||
|
publisher?: string; |
||||||
|
summary?: string; |
||||||
|
coverImage?: string; |
||||||
|
isbn?: string; |
||||||
|
tags?: string[]; |
||||||
|
source?: string; |
||||||
|
publishedBy?: string; |
||||||
|
type?: string; |
||||||
|
autoUpdate?: "yes" | "ask" | "no"; |
||||||
|
customAttributes?: Record<string, string>; |
||||||
|
}; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
}>; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Creates an Asciidoctor processor instance |
||||||
|
*/ |
||||||
|
function createProcessor() { |
||||||
|
return Processor(); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Helper function to determine the header level of a section |
||||||
|
*/ |
||||||
|
function getSectionLevel(sectionContent: string): number { |
||||||
|
const lines = sectionContent.split(/\r?\n/); |
||||||
|
for (const line of lines) { |
||||||
|
const match = line.match(/^(=+)\s+/); |
||||||
|
if (match) { |
||||||
|
return match[1].length; |
||||||
|
} |
||||||
|
} |
||||||
|
return 0; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Helper function to extract just the intro content (before first subsection) |
||||||
|
*/ |
||||||
|
function extractIntroContent( |
||||||
|
sectionContent: string, |
||||||
|
currentLevel: number, |
||||||
|
): string { |
||||||
|
const lines = sectionContent.split(/\r?\n/); |
||||||
|
const introLines: string[] = []; |
||||||
|
let foundHeader = false; |
||||||
|
|
||||||
|
for (const line of lines) { |
||||||
|
const headerMatch = line.match(/^(=+)\s+/); |
||||||
|
if (headerMatch) { |
||||||
|
const level = headerMatch[1].length; |
||||||
|
if (level === currentLevel && !foundHeader) { |
||||||
|
// This is the section header itself
|
||||||
|
foundHeader = true; |
||||||
|
continue; // Skip the header line itself for intro content
|
||||||
|
} else if (level > currentLevel) { |
||||||
|
// This is a subsection, stop collecting intro content
|
||||||
|
break; |
||||||
|
} |
||||||
|
} else if (foundHeader) { |
||||||
|
// This is intro content after the header
|
||||||
|
introLines.push(line); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
return introLines.join("\n").trim(); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Parses AsciiDoc content into sections with metadata |
||||||
|
*/ |
||||||
|
export function parseAsciiDocWithMetadata(content: string): ParsedAsciiDoc { |
||||||
|
const asciidoctor = createProcessor(); |
||||||
|
const document = asciidoctor.load(content, { standalone: false }) as Document; |
||||||
|
const { metadata: docMetadata } = extractDocumentMetadata(content); |
||||||
|
|
||||||
|
// Parse the original content to find section attributes
|
||||||
|
const lines = content.split(/\r?\n/); |
||||||
|
const sectionsWithMetadata: Array<{ |
||||||
|
metadata: ParsedAsciiDoc["sections"][0]["metadata"]; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
}> = []; |
||||||
|
let currentSection: string | null = null; |
||||||
|
let currentSectionContent: string[] = []; |
||||||
|
|
||||||
|
for (const line of lines) { |
||||||
|
if (line.match(/^==\s+/)) { |
||||||
|
// Save previous section if exists
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
sectionsWithMetadata.push(extractSectionMetadata(sectionContent)); |
||||||
|
} |
||||||
|
|
||||||
|
// Start new section
|
||||||
|
currentSection = line; |
||||||
|
currentSectionContent = [line]; |
||||||
|
} else if (currentSection) { |
||||||
|
currentSectionContent.push(line); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Save the last section
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
sectionsWithMetadata.push(extractSectionMetadata(sectionContent)); |
||||||
|
} |
||||||
|
|
||||||
|
return { |
||||||
|
metadata: docMetadata, |
||||||
|
content: document.getSource(), |
||||||
|
title: docMetadata.title || "", |
||||||
|
sections: sectionsWithMetadata, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Iterative AsciiDoc parsing based on specified level |
||||||
|
* Level 2: Only == sections become content events (containing all subsections) |
||||||
|
* Level 3: == sections become indices + content events, === sections become content events |
||||||
|
* Level 4: === sections become indices + content events, ==== sections become content events, etc. |
||||||
|
*/ |
||||||
|
export function parseAsciiDocIterative( |
||||||
|
content: string, |
||||||
|
parseLevel: number = 2, |
||||||
|
): ParsedAsciiDoc { |
||||||
|
const asciidoctor = createProcessor(); |
||||||
|
const document = asciidoctor.load(content, { standalone: false }) as Document; |
||||||
|
|
||||||
|
// Extract document metadata using the metadata extraction functions
|
||||||
|
const { metadata: docMetadata } = extractDocumentMetadata(content); |
||||||
|
|
||||||
|
const lines = content.split(/\r?\n/); |
||||||
|
const sections: Array<{ |
||||||
|
metadata: ParsedAsciiDoc["sections"][0]["metadata"]; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
}> = []; |
||||||
|
|
||||||
|
if (parseLevel === 2) { |
||||||
|
// Level 2: Only == sections become events
|
||||||
|
const level2Pattern = /^==\s+/; |
||||||
|
let currentSection: string | null = null; |
||||||
|
let currentSectionContent: string[] = []; |
||||||
|
let documentContent: string[] = []; |
||||||
|
let inDocumentHeader = true; |
||||||
|
|
||||||
|
for (const line of lines) { |
||||||
|
if (line.match(level2Pattern)) { |
||||||
|
inDocumentHeader = false; |
||||||
|
|
||||||
|
// Save previous section if exists
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
const sectionMeta = extractSectionMetadata(sectionContent); |
||||||
|
// For level 2, preserve the full content including the header
|
||||||
|
sections.push({ |
||||||
|
...sectionMeta, |
||||||
|
content: sectionContent, // Use full content, not stripped
|
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
// Start new section
|
||||||
|
currentSection = line; |
||||||
|
currentSectionContent = [line]; |
||||||
|
} else if (currentSection) { |
||||||
|
currentSectionContent.push(line); |
||||||
|
} else if (inDocumentHeader) { |
||||||
|
documentContent.push(line); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Save the last section
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
const sectionMeta = extractSectionMetadata(sectionContent); |
||||||
|
// For level 2, preserve the full content including the header
|
||||||
|
sections.push({ |
||||||
|
...sectionMeta, |
||||||
|
content: sectionContent, // Use full content, not stripped
|
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
const docContent = documentContent.join("\n"); |
||||||
|
return { |
||||||
|
metadata: docMetadata, |
||||||
|
content: docContent, |
||||||
|
title: docMetadata.title || "", |
||||||
|
sections: sections, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
// Level 3+: Parse hierarchically
|
||||||
|
// All levels from 2 to parseLevel-1 are indices (title only)
|
||||||
|
// Level parseLevel are content sections (full content)
|
||||||
|
|
||||||
|
// First, collect all sections at the content level (parseLevel)
|
||||||
|
const contentLevelPattern = new RegExp(`^${"=".repeat(parseLevel)}\\s+`); |
||||||
|
let currentSection: string | null = null; |
||||||
|
let currentSectionContent: string[] = []; |
||||||
|
let documentContent: string[] = []; |
||||||
|
let inDocumentHeader = true; |
||||||
|
|
||||||
|
for (const line of lines) { |
||||||
|
if (line.match(contentLevelPattern)) { |
||||||
|
inDocumentHeader = false; |
||||||
|
|
||||||
|
// Save previous section if exists
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
const sectionMeta = extractSectionMetadata(sectionContent); |
||||||
|
sections.push({ |
||||||
|
...sectionMeta, |
||||||
|
content: sectionContent, // Full content including headers
|
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
// Start new content section
|
||||||
|
currentSection = line; |
||||||
|
currentSectionContent = [line]; |
||||||
|
} else if (currentSection) { |
||||||
|
// Continue collecting content for current section
|
||||||
|
currentSectionContent.push(line); |
||||||
|
} else if (inDocumentHeader) { |
||||||
|
documentContent.push(line); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Save the last section
|
||||||
|
if (currentSection) { |
||||||
|
const sectionContent = currentSectionContent.join("\n"); |
||||||
|
const sectionMeta = extractSectionMetadata(sectionContent); |
||||||
|
sections.push({ |
||||||
|
...sectionMeta, |
||||||
|
content: sectionContent, // Full content including headers
|
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
// Now collect index sections (all levels from 2 to parseLevel-1)
|
||||||
|
// These should be shown as navigation/structure but not full content
|
||||||
|
const indexSections: Array<{ |
||||||
|
metadata: ParsedAsciiDoc["sections"][0]["metadata"]; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
level: number; |
||||||
|
}> = []; |
||||||
|
|
||||||
|
for (let level = 2; level < parseLevel; level++) { |
||||||
|
const levelPattern = new RegExp(`^${"=".repeat(level)}\\s+(.+)$`, "gm"); |
||||||
|
const matches = content.matchAll(levelPattern); |
||||||
|
|
||||||
|
for (const match of matches) { |
||||||
|
const title = match[1].trim(); |
||||||
|
indexSections.push({ |
||||||
|
metadata: { title }, |
||||||
|
content: `${"=".repeat(level)} ${title}`, // Just the header line for index sections
|
||||||
|
title, |
||||||
|
level, |
||||||
|
}); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Add actual level to content sections based on their content
|
||||||
|
const contentSectionsWithLevel = sections.map((s) => ({ |
||||||
|
...s, |
||||||
|
level: getSectionLevel(s.content), |
||||||
|
})); |
||||||
|
|
||||||
|
// Combine index sections and content sections
|
||||||
|
// Sort by position in original content to maintain order
|
||||||
|
const allSections = [...indexSections, ...contentSectionsWithLevel]; |
||||||
|
|
||||||
|
// Sort sections by their appearance in the original content
|
||||||
|
allSections.sort((a, b) => { |
||||||
|
const posA = content.indexOf(a.content.split("\n")[0]); |
||||||
|
const posB = content.indexOf(b.content.split("\n")[0]); |
||||||
|
return posA - posB; |
||||||
|
}); |
||||||
|
|
||||||
|
const docContent = documentContent.join("\n"); |
||||||
|
return { |
||||||
|
metadata: docMetadata, |
||||||
|
content: docContent, |
||||||
|
title: docMetadata.title || "", |
||||||
|
sections: allSections, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generates Nostr events from parsed AsciiDoc with proper hierarchical structure |
||||||
|
* Based on docreference.md specifications |
||||||
|
*/ |
||||||
|
export function generateNostrEvents( |
||||||
|
parsed: ParsedAsciiDoc, |
||||||
|
parseLevel: number = 2, |
||||||
|
pubkey?: string, |
||||||
|
maxDepth: number = 6, |
||||||
|
): { |
||||||
|
indexEvent?: any; |
||||||
|
contentEvents: any[]; |
||||||
|
} { |
||||||
|
const allEvents: any[] = []; |
||||||
|
const actualPubkey = pubkey || "pubkey"; |
||||||
|
|
||||||
|
// Helper function to generate section ID
|
||||||
|
const generateSectionId = (title: string): string => { |
||||||
|
return title |
||||||
|
.toLowerCase() |
||||||
|
.replace(/[^\p{L}\p{N}]/gu, "-") |
||||||
|
.replace(/-+/g, "-") |
||||||
|
.replace(/^-|-$/g, ""); |
||||||
|
}; |
||||||
|
|
||||||
|
// Build hierarchical tree structure
|
||||||
|
interface TreeNode { |
||||||
|
section: { |
||||||
|
metadata: any; |
||||||
|
content: string; |
||||||
|
title: string; |
||||||
|
}; |
||||||
|
level: number; |
||||||
|
sectionId: string; |
||||||
|
tags: [string, string][]; |
||||||
|
children: TreeNode[]; |
||||||
|
parent?: TreeNode; |
||||||
|
} |
||||||
|
|
||||||
|
// Convert flat sections to tree structure
|
||||||
|
const buildTree = (): TreeNode[] => { |
||||||
|
const roots: TreeNode[] = []; |
||||||
|
const stack: TreeNode[] = []; |
||||||
|
|
||||||
|
for (const section of parsed.sections) { |
||||||
|
const level = getSectionLevel(section.content); |
||||||
|
const sectionId = generateSectionId(section.title); |
||||||
|
const tags = parseSimpleAttributes(section.content); |
||||||
|
|
||||||
|
const node: TreeNode = { |
||||||
|
section, |
||||||
|
level, |
||||||
|
sectionId, |
||||||
|
tags, |
||||||
|
children: [], |
||||||
|
}; |
||||||
|
|
||||||
|
// Find the correct parent based on header hierarchy
|
||||||
|
while (stack.length > 0 && stack[stack.length - 1].level >= level) { |
||||||
|
stack.pop(); |
||||||
|
} |
||||||
|
|
||||||
|
if (stack.length === 0) { |
||||||
|
// This is a root level section
|
||||||
|
roots.push(node); |
||||||
|
} else { |
||||||
|
// This is a child of the last item in stack
|
||||||
|
const parent = stack[stack.length - 1]; |
||||||
|
parent.children.push(node); |
||||||
|
node.parent = parent; |
||||||
|
} |
||||||
|
|
||||||
|
stack.push(node); |
||||||
|
} |
||||||
|
|
||||||
|
return roots; |
||||||
|
}; |
||||||
|
|
||||||
|
const tree = buildTree(); |
||||||
|
|
||||||
|
// Recursively create events from tree
|
||||||
|
const createEventsFromNode = (node: TreeNode): void => { |
||||||
|
const { section, level, sectionId, tags, children } = node; |
||||||
|
|
||||||
|
// Determine if this node should become an index
|
||||||
|
const hasChildrenAtTargetLevel = children.some( |
||||||
|
(child) => child.level === parseLevel, |
||||||
|
); |
||||||
|
const shouldBeIndex = level < parseLevel && |
||||||
|
(hasChildrenAtTargetLevel || |
||||||
|
children.some((child) => child.level <= parseLevel)); |
||||||
|
|
||||||
|
if (shouldBeIndex) { |
||||||
|
// Create content event for intro text (30041)
|
||||||
|
const introContent = extractIntroContent(section.content, level); |
||||||
|
if (introContent.trim()) { |
||||||
|
const contentEvent = { |
||||||
|
id: "", |
||||||
|
pubkey: "", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
kind: 30041, |
||||||
|
tags: [ |
||||||
|
["d", `${sectionId}-content`], |
||||||
|
["title", section.title], |
||||||
|
...tags, |
||||||
|
], |
||||||
|
content: introContent, |
||||||
|
sig: "", |
||||||
|
}; |
||||||
|
allEvents.push(contentEvent); |
||||||
|
} |
||||||
|
|
||||||
|
// Create index event (30040)
|
||||||
|
const childATags: string[][] = []; |
||||||
|
|
||||||
|
// Add a-tag for intro content if it exists
|
||||||
|
if (introContent.trim()) { |
||||||
|
childATags.push([ |
||||||
|
"a", |
||||||
|
`30041:${actualPubkey}:${sectionId}-content`, |
||||||
|
"", |
||||||
|
"", |
||||||
|
]); |
||||||
|
} |
||||||
|
|
||||||
|
// Add a-tags for direct children
|
||||||
|
for (const child of children) { |
||||||
|
const childHasSubChildren = child.children.some( |
||||||
|
(grandchild) => grandchild.level <= parseLevel, |
||||||
|
); |
||||||
|
const childShouldBeIndex = child.level < parseLevel && |
||||||
|
childHasSubChildren; |
||||||
|
const childKind = childShouldBeIndex ? 30040 : 30041; |
||||||
|
childATags.push([ |
||||||
|
"a", |
||||||
|
`${childKind}:${actualPubkey}:${child.sectionId}`, |
||||||
|
"", |
||||||
|
"", |
||||||
|
]); |
||||||
|
} |
||||||
|
|
||||||
|
const indexEvent = { |
||||||
|
id: "", |
||||||
|
pubkey: "", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
kind: 30040, |
||||||
|
tags: [ |
||||||
|
["d", sectionId], |
||||||
|
["title", section.title], |
||||||
|
...tags, |
||||||
|
...childATags, |
||||||
|
], |
||||||
|
content: "", |
||||||
|
sig: "", |
||||||
|
}; |
||||||
|
allEvents.push(indexEvent); |
||||||
|
} else { |
||||||
|
// Create regular content event (30041)
|
||||||
|
const contentEvent = { |
||||||
|
id: "", |
||||||
|
pubkey: "", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
kind: 30041, |
||||||
|
tags: [["d", sectionId], ["title", section.title], ...tags], |
||||||
|
content: section.content, |
||||||
|
sig: "", |
||||||
|
}; |
||||||
|
allEvents.push(contentEvent); |
||||||
|
} |
||||||
|
|
||||||
|
// Recursively process children
|
||||||
|
for (const child of children) { |
||||||
|
createEventsFromNode(child); |
||||||
|
} |
||||||
|
}; |
||||||
|
|
||||||
|
// Process all root level sections
|
||||||
|
for (const rootNode of tree) { |
||||||
|
createEventsFromNode(rootNode); |
||||||
|
} |
||||||
|
|
||||||
|
// Create main document index if we have a document title (article format)
|
||||||
|
if (parsed.title && parsed.title.trim() !== "") { |
||||||
|
const documentId = generateSectionId(parsed.title); |
||||||
|
const documentTags = parseSimpleAttributes(parsed.content); |
||||||
|
|
||||||
|
// Create a-tags for all root level sections (level 2)
|
||||||
|
const mainIndexATags = tree.map((rootNode) => { |
||||||
|
const hasSubChildren = rootNode.children.some( |
||||||
|
(child) => child.level <= parseLevel, |
||||||
|
); |
||||||
|
const shouldBeIndex = rootNode.level < parseLevel && hasSubChildren; |
||||||
|
const kind = shouldBeIndex ? 30040 : 30041; |
||||||
|
return ["a", `${kind}:${actualPubkey}:${rootNode.sectionId}`, "", ""]; |
||||||
|
}); |
||||||
|
|
||||||
|
console.log("Debug: Root sections found:", tree.length); |
||||||
|
console.log("Debug: Main index a-tags:", mainIndexATags); |
||||||
|
|
||||||
|
const mainIndexEvent = { |
||||||
|
id: "", |
||||||
|
pubkey: "", |
||||||
|
created_at: Math.floor(Date.now() / 1000), |
||||||
|
kind: 30040, |
||||||
|
tags: [ |
||||||
|
["d", documentId], |
||||||
|
["title", parsed.title], |
||||||
|
...documentTags, |
||||||
|
...mainIndexATags, |
||||||
|
], |
||||||
|
content: "", |
||||||
|
sig: "", |
||||||
|
}; |
||||||
|
|
||||||
|
return { |
||||||
|
indexEvent: mainIndexEvent, |
||||||
|
contentEvents: allEvents, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
// For scattered notes, return only content events
|
||||||
|
return { |
||||||
|
contentEvents: allEvents, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Detects content type for smart publishing |
||||||
|
*/ |
||||||
|
export function detectContentType( |
||||||
|
content: string, |
||||||
|
): "article" | "scattered-notes" | "none" { |
||||||
|
const hasDocTitle = content.trim().startsWith("=") && |
||||||
|
!content.trim().startsWith("=="); |
||||||
|
const hasSections = content.includes("=="); |
||||||
|
|
||||||
|
if (hasDocTitle) { |
||||||
|
return "article"; |
||||||
|
} else if (hasSections) { |
||||||
|
return "scattered-notes"; |
||||||
|
} else { |
||||||
|
return "none"; |
||||||
|
} |
||||||
|
} |
||||||
@ -0,0 +1,158 @@ |
|||||||
|
/** |
||||||
|
* Unified AsciiDoc Publication Parser |
||||||
|
* |
||||||
|
* Single entry point for parsing AsciiDoc content into NKBIP-01 compliant |
||||||
|
* publication trees using proper Asciidoctor tree processor extensions. |
||||||
|
* |
||||||
|
* This implements Michael's vision of using PublicationTree as the primary |
||||||
|
* data structure for organizing hierarchical Nostr events. |
||||||
|
*/ |
||||||
|
|
||||||
|
import Asciidoctor from "asciidoctor"; |
||||||
|
import { |
||||||
|
type ProcessorResult, |
||||||
|
registerPublicationTreeProcessor, |
||||||
|
} from "./publication_tree_processor"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
export type PublicationTreeResult = ProcessorResult; |
||||||
|
|
||||||
|
/** |
||||||
|
* Parse AsciiDoc content into a PublicationTree using tree processor extension |
||||||
|
* This is the main entry point for all parsing operations |
||||||
|
*/ |
||||||
|
export async function parseAsciiDocWithTree( |
||||||
|
content: string, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number = 2, |
||||||
|
): Promise<PublicationTreeResult> { |
||||||
|
console.log(`[Parser] Starting parse at level ${parseLevel}`); |
||||||
|
|
||||||
|
// Create fresh Asciidoctor instance
|
||||||
|
const asciidoctor = Asciidoctor(); |
||||||
|
const registry = asciidoctor.Extensions.create(); |
||||||
|
|
||||||
|
// Register our tree processor extension
|
||||||
|
const processorAccessor = registerPublicationTreeProcessor( |
||||||
|
registry, |
||||||
|
ndk, |
||||||
|
parseLevel, |
||||||
|
content, |
||||||
|
); |
||||||
|
|
||||||
|
try { |
||||||
|
// Parse the document with our extension
|
||||||
|
const doc = asciidoctor.load(content, { |
||||||
|
extension_registry: registry, |
||||||
|
standalone: false, |
||||||
|
attributes: { |
||||||
|
sectids: false, |
||||||
|
}, |
||||||
|
}); |
||||||
|
|
||||||
|
console.log(`[Parser] Document converted successfully`); |
||||||
|
|
||||||
|
// Get the result from our processor
|
||||||
|
const result = processorAccessor.getResult(); |
||||||
|
|
||||||
|
if (!result) { |
||||||
|
throw new Error("Tree processor failed to generate result"); |
||||||
|
} |
||||||
|
|
||||||
|
// Build async relationships in the PublicationTree
|
||||||
|
await buildTreeRelationships(result); |
||||||
|
|
||||||
|
console.log(`[Parser] Tree relationships built successfully`); |
||||||
|
|
||||||
|
return result; |
||||||
|
} catch (error) { |
||||||
|
console.error("[Parser] Error during parsing:", error); |
||||||
|
throw new Error( |
||||||
|
`Failed to parse AsciiDoc content: ${ |
||||||
|
error instanceof Error ? error.message : "Unknown error" |
||||||
|
}`,
|
||||||
|
); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Build async relationships in the PublicationTree |
||||||
|
* This adds content events to the tree structure as Michael envisioned |
||||||
|
*/ |
||||||
|
async function buildTreeRelationships(result: ProcessorResult): Promise<void> { |
||||||
|
const { tree, indexEvent, contentEvents } = result; |
||||||
|
|
||||||
|
if (!tree) { |
||||||
|
throw new Error("No tree available to build relationships"); |
||||||
|
} |
||||||
|
|
||||||
|
try { |
||||||
|
// Add content events to the tree
|
||||||
|
if (indexEvent && contentEvents.length > 0) { |
||||||
|
// Article structure: add all content events to index
|
||||||
|
for (const contentEvent of contentEvents) { |
||||||
|
await tree.addEvent(contentEvent, indexEvent); |
||||||
|
} |
||||||
|
} else if (contentEvents.length > 1) { |
||||||
|
// Scattered notes: add remaining events to first event
|
||||||
|
const rootEvent = contentEvents[0]; |
||||||
|
for (let i = 1; i < contentEvents.length; i++) { |
||||||
|
await tree.addEvent(contentEvents[i], rootEvent); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[Parser] Added ${contentEvents.length} events to tree`); |
||||||
|
} catch (error) { |
||||||
|
console.error("[Parser] Error building tree relationships:", error); |
||||||
|
throw error; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Export events from PublicationTree for publishing workflow compatibility |
||||||
|
*/ |
||||||
|
export function exportEventsFromTree(result: PublicationTreeResult) { |
||||||
|
return { |
||||||
|
indexEvent: result.indexEvent |
||||||
|
? eventToPublishableObject(result.indexEvent) |
||||||
|
: undefined, |
||||||
|
contentEvents: result.contentEvents.map(eventToPublishableObject), |
||||||
|
// Note: Deliberately omitting 'tree' to ensure the object is serializable for postMessage
|
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Convert NDKEvent to publishable object format |
||||||
|
* Ensures all properties are serializable for postMessage |
||||||
|
*/ |
||||||
|
function eventToPublishableObject(event: any) { |
||||||
|
// Extract only primitive values to ensure serializability
|
||||||
|
return { |
||||||
|
kind: Number(event.kind), |
||||||
|
content: String(event.content || ""), |
||||||
|
tags: Array.isArray(event.tags) |
||||||
|
? event.tags.map((tag: any) => |
||||||
|
Array.isArray(tag) ? tag.map((t) => String(t)) : [] |
||||||
|
) |
||||||
|
: [], |
||||||
|
created_at: Number(event.created_at || Math.floor(Date.now() / 1000)), |
||||||
|
pubkey: String(event.pubkey || ""), |
||||||
|
id: String(event.id || ""), |
||||||
|
title: event.tags?.find?.((t: string[]) => t[0] === "title")?.[1] || |
||||||
|
"Untitled", |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Validate parse level parameter |
||||||
|
*/ |
||||||
|
export function validateParseLevel(level: number): boolean { |
||||||
|
return Number.isInteger(level) && level >= 2 && level <= 5; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Get supported parse levels |
||||||
|
*/ |
||||||
|
export function getSupportedParseLevels(): number[] { |
||||||
|
return [2, 3, 4, 5]; |
||||||
|
} |
||||||
@ -0,0 +1,70 @@ |
|||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
/** |
||||||
|
* Fetches all highlight events (kind 9802) for sections referenced in a publication event (kind 30040). |
||||||
|
* |
||||||
|
* @param publicationEvent - The kind 30040 event containing "a" tags referencing sections (kind 30041) |
||||||
|
* @param ndk - The NDK instance to use for fetching events |
||||||
|
* @returns A Map of section addresses to arrays of highlight events |
||||||
|
* |
||||||
|
* @example |
||||||
|
* ```typescript
|
||||||
|
* const highlights = await fetchHighlightsForPublication(publicationEvent, ndk); |
||||||
|
* // Returns: Map {
|
||||||
|
* // "30041:pubkey:section-id" => [highlightEvent1, highlightEvent2],
|
||||||
|
* // "30041:pubkey:another-section" => [highlightEvent3]
|
||||||
|
* // }
|
||||||
|
* ``` |
||||||
|
*/ |
||||||
|
export async function fetchHighlightsForPublication( |
||||||
|
publicationEvent: NDKEvent, |
||||||
|
ndk: NDK, |
||||||
|
): Promise<Map<string, NDKEvent[]>> { |
||||||
|
// Extract all "a" tags from the publication event
|
||||||
|
const aTags = publicationEvent.getMatchingTags("a"); |
||||||
|
|
||||||
|
// Filter for only 30041 (section) references
|
||||||
|
const sectionAddresses: string[] = []; |
||||||
|
aTags.forEach((tag: string[]) => { |
||||||
|
if (tag[1]) { |
||||||
|
const parts = tag[1].split(":"); |
||||||
|
// Check if it's a 30041 kind reference and has the correct format
|
||||||
|
if (parts.length >= 3 && parts[0] === "30041") { |
||||||
|
// Handle d-tags with colons by joining everything after the pubkey
|
||||||
|
const sectionAddress = tag[1]; |
||||||
|
sectionAddresses.push(sectionAddress); |
||||||
|
} |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
// If no section references found, return empty map
|
||||||
|
if (sectionAddresses.length === 0) { |
||||||
|
return new Map(); |
||||||
|
} |
||||||
|
|
||||||
|
// Fetch all highlight events (kind 9802) that reference these sections
|
||||||
|
const highlightEvents = await ndk.fetchEvents({ |
||||||
|
kinds: [9802], |
||||||
|
"#a": sectionAddresses, |
||||||
|
}); |
||||||
|
|
||||||
|
// Group highlights by section address
|
||||||
|
const highlightsBySection = new Map<string, NDKEvent[]>(); |
||||||
|
|
||||||
|
highlightEvents.forEach((highlight: NDKEvent) => { |
||||||
|
const highlightATags = highlight.getMatchingTags("a"); |
||||||
|
highlightATags.forEach((tag: string[]) => { |
||||||
|
const sectionAddress = tag[1]; |
||||||
|
// Only include if this section is in our original list
|
||||||
|
if (sectionAddress && sectionAddresses.includes(sectionAddress)) { |
||||||
|
if (!highlightsBySection.has(sectionAddress)) { |
||||||
|
highlightsBySection.set(sectionAddress, []); |
||||||
|
} |
||||||
|
highlightsBySection.get(sectionAddress)!.push(highlight); |
||||||
|
} |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
return highlightsBySection; |
||||||
|
} |
||||||
@ -0,0 +1,241 @@ |
|||||||
|
/** |
||||||
|
* Utility for position-based text highlighting in the DOM |
||||||
|
* |
||||||
|
* Highlights text by character offset rather than text search, |
||||||
|
* making highlights resilient to minor content changes. |
||||||
|
*/ |
||||||
|
|
||||||
|
/** |
||||||
|
* Get all text nodes within an element, excluding script/style tags |
||||||
|
*/ |
||||||
|
function getTextNodes(element: HTMLElement): Text[] { |
||||||
|
const textNodes: Text[] = []; |
||||||
|
const walker = document.createTreeWalker( |
||||||
|
element, |
||||||
|
NodeFilter.SHOW_TEXT, |
||||||
|
{ |
||||||
|
acceptNode: (node) => { |
||||||
|
// Skip text in script/style tags
|
||||||
|
const parent = node.parentElement; |
||||||
|
if ( |
||||||
|
parent && (parent.tagName === "SCRIPT" || parent.tagName === "STYLE") |
||||||
|
) { |
||||||
|
return NodeFilter.FILTER_REJECT; |
||||||
|
} |
||||||
|
// Skip empty text nodes
|
||||||
|
if (!node.textContent || node.textContent.trim().length === 0) { |
||||||
|
return NodeFilter.FILTER_REJECT; |
||||||
|
} |
||||||
|
return NodeFilter.FILTER_ACCEPT; |
||||||
|
}, |
||||||
|
}, |
||||||
|
); |
||||||
|
|
||||||
|
let node: Node | null; |
||||||
|
while ((node = walker.nextNode())) { |
||||||
|
textNodes.push(node as Text); |
||||||
|
} |
||||||
|
|
||||||
|
return textNodes; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Calculate the total text length from text nodes |
||||||
|
*/ |
||||||
|
function getTotalTextLength(textNodes: Text[]): number { |
||||||
|
return textNodes.reduce( |
||||||
|
(total, node) => total + (node.textContent?.length || 0), |
||||||
|
0, |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Find text node and local offset for a given global character position |
||||||
|
*/ |
||||||
|
function findNodeAtOffset( |
||||||
|
textNodes: Text[], |
||||||
|
globalOffset: number, |
||||||
|
): { node: Text; localOffset: number } | null { |
||||||
|
let currentOffset = 0; |
||||||
|
|
||||||
|
for (const node of textNodes) { |
||||||
|
const nodeLength = node.textContent?.length || 0; |
||||||
|
|
||||||
|
if (globalOffset < currentOffset + nodeLength) { |
||||||
|
return { |
||||||
|
node, |
||||||
|
localOffset: globalOffset - currentOffset, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
currentOffset += nodeLength; |
||||||
|
} |
||||||
|
|
||||||
|
return null; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Highlight text by character offset within a container element |
||||||
|
* |
||||||
|
* @param container - The root element to search within |
||||||
|
* @param startOffset - Character position where highlight starts (0-indexed) |
||||||
|
* @param endOffset - Character position where highlight ends (exclusive) |
||||||
|
* @param color - Background color for the highlight |
||||||
|
* @returns true if highlight was applied, false otherwise |
||||||
|
*/ |
||||||
|
export function highlightByOffset( |
||||||
|
container: HTMLElement, |
||||||
|
startOffset: number, |
||||||
|
endOffset: number, |
||||||
|
color: string, |
||||||
|
): boolean { |
||||||
|
console.log( |
||||||
|
`[highlightByOffset] Attempting to highlight chars ${startOffset}-${endOffset}`, |
||||||
|
); |
||||||
|
|
||||||
|
// Validate inputs
|
||||||
|
if (startOffset < 0 || endOffset <= startOffset) { |
||||||
|
console.warn( |
||||||
|
`[highlightByOffset] Invalid offsets: ${startOffset}-${endOffset}`, |
||||||
|
); |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
// Get all text nodes
|
||||||
|
const textNodes = getTextNodes(container); |
||||||
|
if (textNodes.length === 0) { |
||||||
|
console.warn(`[highlightByOffset] No text nodes found in container`); |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
const totalLength = getTotalTextLength(textNodes); |
||||||
|
console.log( |
||||||
|
`[highlightByOffset] Total text length: ${totalLength}, nodes: ${textNodes.length}`, |
||||||
|
); |
||||||
|
|
||||||
|
// Validate offsets are within bounds
|
||||||
|
if (startOffset >= totalLength) { |
||||||
|
console.warn( |
||||||
|
`[highlightByOffset] Start offset ${startOffset} exceeds total length ${totalLength}`, |
||||||
|
); |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
// Adjust end offset if it exceeds content
|
||||||
|
const adjustedEndOffset = Math.min(endOffset, totalLength); |
||||||
|
|
||||||
|
// Find the nodes containing start and end positions
|
||||||
|
const startPos = findNodeAtOffset(textNodes, startOffset); |
||||||
|
const endPos = findNodeAtOffset(textNodes, adjustedEndOffset); |
||||||
|
|
||||||
|
if (!startPos || !endPos) { |
||||||
|
console.warn(`[highlightByOffset] Could not locate positions in DOM`); |
||||||
|
return false; |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[highlightByOffset] Found positions:`, { |
||||||
|
startNode: startPos.node.textContent?.substring(0, 20), |
||||||
|
startLocal: startPos.localOffset, |
||||||
|
endNode: endPos.node.textContent?.substring(0, 20), |
||||||
|
endLocal: endPos.localOffset, |
||||||
|
}); |
||||||
|
|
||||||
|
// Create the highlight mark element
|
||||||
|
const createHighlightMark = (text: string): HTMLElement => { |
||||||
|
const mark = document.createElement("mark"); |
||||||
|
mark.className = "highlight"; |
||||||
|
mark.style.backgroundColor = color; |
||||||
|
mark.style.borderRadius = "2px"; |
||||||
|
mark.style.padding = "2px 0"; |
||||||
|
mark.textContent = text; |
||||||
|
return mark; |
||||||
|
}; |
||||||
|
|
||||||
|
try { |
||||||
|
// Case 1: Highlight is within a single text node
|
||||||
|
if (startPos.node === endPos.node) { |
||||||
|
const text = startPos.node.textContent || ""; |
||||||
|
const before = text.substring(0, startPos.localOffset); |
||||||
|
const highlighted = text.substring( |
||||||
|
startPos.localOffset, |
||||||
|
endPos.localOffset, |
||||||
|
); |
||||||
|
const after = text.substring(endPos.localOffset); |
||||||
|
|
||||||
|
const parent = startPos.node.parentNode; |
||||||
|
if (!parent) return false; |
||||||
|
|
||||||
|
// Create fragment with before + highlight + after
|
||||||
|
const fragment = document.createDocumentFragment(); |
||||||
|
if (before) fragment.appendChild(document.createTextNode(before)); |
||||||
|
fragment.appendChild(createHighlightMark(highlighted)); |
||||||
|
if (after) fragment.appendChild(document.createTextNode(after)); |
||||||
|
|
||||||
|
parent.replaceChild(fragment, startPos.node); |
||||||
|
console.log( |
||||||
|
`[highlightByOffset] Applied single-node highlight: "${highlighted}"`, |
||||||
|
); |
||||||
|
return true; |
||||||
|
} |
||||||
|
|
||||||
|
// Case 2: Highlight spans multiple text nodes
|
||||||
|
let currentNode: Text | null = startPos.node; |
||||||
|
let isFirstNode = true; |
||||||
|
let nodeIndex = textNodes.indexOf(currentNode); |
||||||
|
|
||||||
|
while (currentNode && nodeIndex <= textNodes.indexOf(endPos.node)) { |
||||||
|
const parent = currentNode.parentNode; |
||||||
|
if (!parent) break; |
||||||
|
|
||||||
|
const text = currentNode.textContent || ""; |
||||||
|
let fragment = document.createDocumentFragment(); |
||||||
|
|
||||||
|
if (isFirstNode) { |
||||||
|
// First node: split at start offset
|
||||||
|
const before = text.substring(0, startPos.localOffset); |
||||||
|
const highlighted = text.substring(startPos.localOffset); |
||||||
|
|
||||||
|
if (before) fragment.appendChild(document.createTextNode(before)); |
||||||
|
fragment.appendChild(createHighlightMark(highlighted)); |
||||||
|
isFirstNode = false; |
||||||
|
} else if (currentNode === endPos.node) { |
||||||
|
// Last node: split at end offset
|
||||||
|
const highlighted = text.substring(0, endPos.localOffset); |
||||||
|
const after = text.substring(endPos.localOffset); |
||||||
|
|
||||||
|
fragment.appendChild(createHighlightMark(highlighted)); |
||||||
|
if (after) fragment.appendChild(document.createTextNode(after)); |
||||||
|
} else { |
||||||
|
// Middle node: highlight entirely
|
||||||
|
fragment.appendChild(createHighlightMark(text)); |
||||||
|
} |
||||||
|
|
||||||
|
parent.replaceChild(fragment, currentNode); |
||||||
|
|
||||||
|
nodeIndex++; |
||||||
|
currentNode = textNodes[nodeIndex] || null; |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`[highlightByOffset] Applied multi-node highlight`); |
||||||
|
return true; |
||||||
|
} catch (err) { |
||||||
|
console.error(`[highlightByOffset] Error applying highlight:`, err); |
||||||
|
return false; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Get the plain text content of an element (without HTML tags) |
||||||
|
* Useful for debugging and validation |
||||||
|
*/ |
||||||
|
export function getPlainText(element: HTMLElement): string { |
||||||
|
const textNodes = getTextNodes(element); |
||||||
|
return textNodes.map((node) => node.textContent).join(""); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Get the character count of visible text in an element |
||||||
|
*/ |
||||||
|
export function getTextLength(element: HTMLElement): number { |
||||||
|
return getPlainText(element).length; |
||||||
|
} |
||||||
@ -0,0 +1,167 @@ |
|||||||
|
/** |
||||||
|
* Utility functions for highlight management |
||||||
|
*/ |
||||||
|
|
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
|
||||||
|
export interface GroupedHighlight { |
||||||
|
pubkey: string; |
||||||
|
highlights: NDKEvent[]; |
||||||
|
count: number; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Groups highlights by author pubkey |
||||||
|
* Returns a Map with pubkey as key and array of highlights as value |
||||||
|
*/ |
||||||
|
export function groupHighlightsByAuthor( |
||||||
|
highlights: NDKEvent[], |
||||||
|
): Map<string, NDKEvent[]> { |
||||||
|
const grouped = new Map<string, NDKEvent[]>(); |
||||||
|
|
||||||
|
for (const highlight of highlights) { |
||||||
|
const pubkey = highlight.pubkey; |
||||||
|
const existing = grouped.get(pubkey) || []; |
||||||
|
existing.push(highlight); |
||||||
|
grouped.set(pubkey, existing); |
||||||
|
} |
||||||
|
|
||||||
|
return grouped; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Truncates highlight text to specified length, breaking at word boundaries |
||||||
|
* @param text - The text to truncate |
||||||
|
* @param maxLength - Maximum length (default: 50) |
||||||
|
* @returns Truncated text with ellipsis if needed |
||||||
|
*/ |
||||||
|
export function truncateHighlight( |
||||||
|
text: string, |
||||||
|
maxLength: number = 50, |
||||||
|
): string { |
||||||
|
if (!text || text.length <= maxLength) { |
||||||
|
return text; |
||||||
|
} |
||||||
|
|
||||||
|
// Find the last space before maxLength
|
||||||
|
const truncated = text.slice(0, maxLength); |
||||||
|
const lastSpace = truncated.lastIndexOf(" "); |
||||||
|
|
||||||
|
// If there's a space, break there; otherwise use the full maxLength
|
||||||
|
if (lastSpace > 0) { |
||||||
|
return truncated.slice(0, lastSpace) + "..."; |
||||||
|
} |
||||||
|
|
||||||
|
return truncated + "..."; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Encodes a highlight event as an naddr with relay hints |
||||||
|
* @param event - The highlight event (kind 9802) |
||||||
|
* @param relays - Array of relay URLs to include as hints |
||||||
|
* @returns naddr string |
||||||
|
*/ |
||||||
|
export function encodeHighlightNaddr( |
||||||
|
event: NDKEvent, |
||||||
|
relays: string[] = [], |
||||||
|
): string { |
||||||
|
try { |
||||||
|
// For kind 9802 highlights, we need the event's unique identifier
|
||||||
|
// Since highlights don't have a d-tag, we'll use the event id as nevent instead
|
||||||
|
// But per NIP-19, naddr is for addressable events (with d-tag)
|
||||||
|
// For non-addressable events like kind 9802, we should use nevent
|
||||||
|
|
||||||
|
const nevent = nip19.neventEncode({ |
||||||
|
id: event.id, |
||||||
|
relays: relays.length > 0 ? relays : undefined, |
||||||
|
author: event.pubkey, |
||||||
|
kind: event.kind, |
||||||
|
}); |
||||||
|
|
||||||
|
return nevent; |
||||||
|
} catch (error) { |
||||||
|
console.error("Error encoding highlight naddr:", error); |
||||||
|
// Fallback to just the event id
|
||||||
|
return event.id; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Creates a shortened npub for display |
||||||
|
* @param pubkey - The hex pubkey |
||||||
|
* @param length - Number of characters to show from start (default: 8) |
||||||
|
* @returns Shortened npub like "npub1abc...xyz" |
||||||
|
*/ |
||||||
|
export function shortenNpub(pubkey: string, length: number = 8): string { |
||||||
|
try { |
||||||
|
const npub = nip19.npubEncode(pubkey); |
||||||
|
// npub format: "npub1" + bech32 encoded data
|
||||||
|
// Show first part and last part
|
||||||
|
if (npub.length <= length + 10) { |
||||||
|
return npub; |
||||||
|
} |
||||||
|
|
||||||
|
const start = npub.slice(0, length + 5); // "npub1" + first chars
|
||||||
|
const end = npub.slice(-4); // last chars
|
||||||
|
return `${start}...${end}`; |
||||||
|
} catch (error) { |
||||||
|
console.error("Error creating shortened npub:", error); |
||||||
|
// Fallback to shortened hex
|
||||||
|
return `${pubkey.slice(0, 8)}...${pubkey.slice(-4)}`; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Extracts relay URLs from a highlight event's tags or metadata |
||||||
|
* @param event - The highlight event |
||||||
|
* @returns Array of relay URLs |
||||||
|
*/ |
||||||
|
export function getRelaysFromHighlight(event: NDKEvent): string[] { |
||||||
|
const relays: string[] = []; |
||||||
|
|
||||||
|
// Check for relay hints in tags (e.g., ["a", "30041:pubkey:id", "relay-url"])
|
||||||
|
for (const tag of event.tags) { |
||||||
|
if ((tag[0] === "a" || tag[0] === "e" || tag[0] === "p") && tag[2]) { |
||||||
|
relays.push(tag[2]); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
// Also include relay from the event if available
|
||||||
|
if (event.relay?.url) { |
||||||
|
relays.push(event.relay.url); |
||||||
|
} |
||||||
|
|
||||||
|
// Deduplicate
|
||||||
|
return [...new Set(relays)]; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Sorts highlights within a group by creation time (newest first) |
||||||
|
* @param highlights - Array of highlight events |
||||||
|
* @returns Sorted array |
||||||
|
*/ |
||||||
|
export function sortHighlightsByTime(highlights: NDKEvent[]): NDKEvent[] { |
||||||
|
return [...highlights].sort((a, b) => { |
||||||
|
const timeA = a.created_at || 0; |
||||||
|
const timeB = b.created_at || 0; |
||||||
|
return timeB - timeA; // Newest first
|
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Gets the display name for a highlight author |
||||||
|
* Priority: displayName > name > shortened npub |
||||||
|
*/ |
||||||
|
export function getAuthorDisplayName( |
||||||
|
profile: |
||||||
|
| { name?: string; displayName?: string; display_name?: string } |
||||||
|
| null, |
||||||
|
pubkey: string, |
||||||
|
): string { |
||||||
|
if (profile) { |
||||||
|
return profile.displayName || profile.display_name || profile.name || |
||||||
|
shortenNpub(pubkey); |
||||||
|
} |
||||||
|
return shortenNpub(pubkey); |
||||||
|
} |
||||||
@ -0,0 +1,179 @@ |
|||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate mock comment data for testing comment UI and threading |
||||||
|
* Creates realistic thread structures with root comments and nested replies |
||||||
|
*/ |
||||||
|
|
||||||
|
const loremIpsumComments = [ |
||||||
|
"Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.", |
||||||
|
"Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat.", |
||||||
|
"Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur.", |
||||||
|
"Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.", |
||||||
|
"Sed ut perspiciatis unde omnis iste natus error sit voluptatem accusantium doloremque laudantium.", |
||||||
|
"Nemo enim ipsam voluptatem quia voluptas sit aspernatur aut odit aut fugit, sed quia consequuntur magni dolores.", |
||||||
|
"Neque porro quisquam est, qui dolorem ipsum quia dolor sit amet, consectetur, adipisci velit.", |
||||||
|
"At vero eos et accusamus et iusto odio dignissimos ducimus qui blanditiis praesentium voluptatum deleniti atque corrupti.", |
||||||
|
"Et harum quidem rerum facilis est et expedita distinctio. Nam libero tempore, cum soluta nobis est eligendi optio.", |
||||||
|
"Temporibus autem quibusdam et aut officiis debitis aut rerum necessitatibus saepe eveniet ut et voluptates repudiandae.", |
||||||
|
]; |
||||||
|
|
||||||
|
const loremIpsumReplies = [ |
||||||
|
"Quis autem vel eum iure reprehenderit qui in ea voluptate velit esse quam nihil molestiae consequatur.", |
||||||
|
"Vel illum qui dolorem eum fugiat quo voluptas nulla pariatur.", |
||||||
|
"Nam libero tempore, cum soluta nobis est eligendi optio cumque nihil impedit quo minus id quod maxime placeat.", |
||||||
|
"Omnis voluptas assumenda est, omnis dolor repellendus.", |
||||||
|
"Itaque earum rerum hic tenetur a sapiente delectus, ut aut reiciendis voluptatibus maiores alias consequatur.", |
||||||
|
"Facere possimus, omnis voluptas assumenda est.", |
||||||
|
"Sed ut perspiciatis unde omnis iste natus error.", |
||||||
|
"Accusantium doloremque laudantium, totam rem aperiam.", |
||||||
|
]; |
||||||
|
|
||||||
|
const mockPubkeys = [ |
||||||
|
"a1b2c3d4e5f6789012345678901234567890123456789012345678901234abcd", |
||||||
|
"b2c3d4e5f6789012345678901234567890123456789012345678901234abcde", |
||||||
|
"c3d4e5f6789012345678901234567890123456789012345678901234abcdef0", |
||||||
|
"d4e5f6789012345678901234567890123456789012345678901234abcdef01", |
||||||
|
]; |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a mock NDKEvent that looks like a real comment |
||||||
|
*/ |
||||||
|
function createMockComment( |
||||||
|
id: string, |
||||||
|
content: string, |
||||||
|
pubkey: string, |
||||||
|
targetAddress: string, |
||||||
|
createdAt: number, |
||||||
|
replyToId?: string, |
||||||
|
replyToAuthor?: string, |
||||||
|
): any { |
||||||
|
const tags: string[][] = [ |
||||||
|
["A", targetAddress, "wss://relay.damus.io", pubkey], |
||||||
|
["K", "30041"], |
||||||
|
["P", pubkey, "wss://relay.damus.io"], |
||||||
|
["a", targetAddress, "wss://relay.damus.io"], |
||||||
|
["k", "30041"], |
||||||
|
["p", pubkey, "wss://relay.damus.io"], |
||||||
|
]; |
||||||
|
|
||||||
|
if (replyToId && replyToAuthor) { |
||||||
|
tags.push(["e", replyToId, "wss://relay.damus.io", "reply"]); |
||||||
|
tags.push(["p", replyToAuthor, "wss://relay.damus.io"]); |
||||||
|
} |
||||||
|
|
||||||
|
// Return a plain object that matches NDKEvent structure
|
||||||
|
return { |
||||||
|
id, |
||||||
|
kind: 1111, |
||||||
|
pubkey, |
||||||
|
created_at: createdAt, |
||||||
|
content, |
||||||
|
tags, |
||||||
|
sig: "mock-signature-" + id, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate mock comment thread structure |
||||||
|
* @param sectionAddress - The section address to attach comments to |
||||||
|
* @param numRootComments - Number of root comments to generate (default: 3) |
||||||
|
* @param numRepliesPerThread - Number of replies per thread (default: 2) |
||||||
|
* @returns Array of mock comment objects |
||||||
|
*/ |
||||||
|
export function generateMockComments( |
||||||
|
sectionAddress: string, |
||||||
|
numRootComments: number = 3, |
||||||
|
numRepliesPerThread: number = 2, |
||||||
|
): any[] { |
||||||
|
const comments: any[] = []; |
||||||
|
const now = Math.floor(Date.now() / 1000); |
||||||
|
let commentIndex = 0; |
||||||
|
|
||||||
|
// Generate root comments
|
||||||
|
for (let i = 0; i < numRootComments; i++) { |
||||||
|
const rootId = `mock-root-${i}-${Date.now()}`; |
||||||
|
const rootPubkey = mockPubkeys[i % mockPubkeys.length]; |
||||||
|
const rootContent = loremIpsumComments[i % loremIpsumComments.length]; |
||||||
|
const rootCreatedAt = now - (numRootComments - i) * 3600; // Stagger by hours
|
||||||
|
|
||||||
|
const rootComment = createMockComment( |
||||||
|
rootId, |
||||||
|
rootContent, |
||||||
|
rootPubkey, |
||||||
|
sectionAddress, |
||||||
|
rootCreatedAt, |
||||||
|
); |
||||||
|
|
||||||
|
comments.push(rootComment); |
||||||
|
|
||||||
|
// Generate replies to this root comment
|
||||||
|
for (let j = 0; j < numRepliesPerThread; j++) { |
||||||
|
const replyId = `mock-reply-${i}-${j}-${Date.now()}`; |
||||||
|
const replyPubkey = mockPubkeys[(i + j + 1) % mockPubkeys.length]; |
||||||
|
const replyContent = |
||||||
|
loremIpsumReplies[commentIndex % loremIpsumReplies.length]; |
||||||
|
const replyCreatedAt = rootCreatedAt + (j + 1) * 1800; // 30 min after each
|
||||||
|
|
||||||
|
const reply = createMockComment( |
||||||
|
replyId, |
||||||
|
replyContent, |
||||||
|
replyPubkey, |
||||||
|
sectionAddress, |
||||||
|
replyCreatedAt, |
||||||
|
rootId, |
||||||
|
rootPubkey, |
||||||
|
); |
||||||
|
|
||||||
|
comments.push(reply); |
||||||
|
|
||||||
|
// Optionally add a nested reply (reply to reply)
|
||||||
|
if (j === 0 && i < 2) { |
||||||
|
const nestedId = `mock-nested-${i}-${j}-${Date.now()}`; |
||||||
|
const nestedPubkey = mockPubkeys[(i + j + 2) % mockPubkeys.length]; |
||||||
|
const nestedContent = |
||||||
|
loremIpsumReplies[(commentIndex + 1) % loremIpsumReplies.length]; |
||||||
|
const nestedCreatedAt = replyCreatedAt + 900; // 15 min after reply
|
||||||
|
|
||||||
|
const nested = createMockComment( |
||||||
|
nestedId, |
||||||
|
nestedContent, |
||||||
|
nestedPubkey, |
||||||
|
sectionAddress, |
||||||
|
nestedCreatedAt, |
||||||
|
replyId, |
||||||
|
replyPubkey, |
||||||
|
); |
||||||
|
|
||||||
|
comments.push(nested); |
||||||
|
} |
||||||
|
|
||||||
|
commentIndex++; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
return comments; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate mock comments for multiple sections |
||||||
|
* @param sectionAddresses - Array of section addresses |
||||||
|
* @returns Array of all mock comments across all sections |
||||||
|
*/ |
||||||
|
export function generateMockCommentsForSections( |
||||||
|
sectionAddresses: string[], |
||||||
|
): any[] { |
||||||
|
const allComments: any[] = []; |
||||||
|
|
||||||
|
sectionAddresses.forEach((address, index) => { |
||||||
|
// Vary the number of comments per section
|
||||||
|
const numRoot = 2 + (index % 3); // 2-4 root comments
|
||||||
|
const numReplies = 1 + (index % 2); // 1-2 replies per thread
|
||||||
|
|
||||||
|
const sectionComments = generateMockComments(address, numRoot, numReplies); |
||||||
|
allComments.push(...sectionComments); |
||||||
|
}); |
||||||
|
|
||||||
|
return allComments; |
||||||
|
} |
||||||
@ -0,0 +1,200 @@ |
|||||||
|
/** |
||||||
|
* Generate mock highlight data (kind 9802) for testing highlight UI |
||||||
|
* Creates realistic highlight events with context and optional annotations |
||||||
|
*/ |
||||||
|
|
||||||
|
// Sample highlighted text snippets (things users might actually highlight)
|
||||||
|
const highlightedTexts = [ |
||||||
|
"Knowledge that tries to stay put inevitably becomes ossified", |
||||||
|
"The attempt to hold knowledge still is like trying to photograph a river", |
||||||
|
"Understanding emerges not from rigid frameworks but from fluid engagement", |
||||||
|
"Traditional institutions struggle with the natural promiscuity of ideas", |
||||||
|
"Thinking without permission means refusing predetermined categories", |
||||||
|
"The most valuable insights often come from unexpected juxtapositions", |
||||||
|
"Anarchistic knowledge rejects the notion of authorized interpreters", |
||||||
|
"Every act of reading is an act of creative interpretation", |
||||||
|
"Hierarchy in knowledge systems serves power, not understanding", |
||||||
|
"The boundary between creator and consumer is an artificial construction", |
||||||
|
]; |
||||||
|
|
||||||
|
// Context strings (surrounding text to help locate the highlight)
|
||||||
|
const contexts = [ |
||||||
|
"This is the fundamental paradox of institutionalized knowledge. Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice.", |
||||||
|
"The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow. What remains is a static representation, not the dynamic reality.", |
||||||
|
"Understanding emerges not from rigid frameworks but from fluid engagement with ideas, people, and contexts. This fluidity is precisely what traditional systems attempt to eliminate.", |
||||||
|
"Traditional institutions struggle with the natural promiscuity of ideas—the way concepts naturally migrate, mutate, and merge across boundaries that were meant to contain them.", |
||||||
|
"Thinking without permission means refusing predetermined categories and challenging the gatekeepers who claim authority over legitimate thought.", |
||||||
|
"The most valuable insights often come from unexpected juxtapositions, from bringing together ideas that were never meant to meet.", |
||||||
|
"Anarchistic knowledge rejects the notion of authorized interpreters, asserting instead that meaning-making is a fundamentally distributed and democratic process.", |
||||||
|
"Every act of reading is an act of creative interpretation, a collaboration between text and reader that produces something new each time.", |
||||||
|
"Hierarchy in knowledge systems serves power, not understanding. It determines who gets to speak, who must listen, and what counts as legitimate knowledge.", |
||||||
|
"The boundary between creator and consumer is an artificial construction, one that digital networks make increasingly untenable and obsolete.", |
||||||
|
]; |
||||||
|
|
||||||
|
// Optional annotations (user comments on their highlights)
|
||||||
|
const annotations = [ |
||||||
|
"This perfectly captures the institutional problem", |
||||||
|
"Key insight - worth revisiting", |
||||||
|
"Reminds me of Deleuze on rhizomatic structures", |
||||||
|
"Fundamental critique of academic gatekeeping", |
||||||
|
"The core argument in one sentence", |
||||||
|
null, // Some highlights have no annotation
|
||||||
|
"Important for understanding the broader thesis", |
||||||
|
null, |
||||||
|
"Connects to earlier discussion on page 12", |
||||||
|
null, |
||||||
|
]; |
||||||
|
|
||||||
|
// Mock pubkeys - MUST be exactly 64 hex characters
|
||||||
|
const mockPubkeys = [ |
||||||
|
"a1b2c3d4e5f67890123456789012345678901234567890123456789012345678", |
||||||
|
"b2c3d4e5f67890123456789012345678901234567890123456789012345678ab", |
||||||
|
"c3d4e5f67890123456789012345678901234567890123456789012345678abcd", |
||||||
|
"d4e5f67890123456789012345678901234567890123456789012345678abcdef", |
||||||
|
"e5f6789012345678901234567890123456789012345678901234567890abcdef", |
||||||
|
]; |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a mock highlight event (kind 9802) |
||||||
|
* |
||||||
|
* AI-NOTE: Unlike comments (kind 1111), highlights have: |
||||||
|
* - content field = the highlighted text itself (NOT a user comment) |
||||||
|
* - ["context", ...] tag with surrounding text to help locate the highlight |
||||||
|
* - Optional ["comment", ...] tag for user annotations |
||||||
|
* - Optional ["offset", start, end] tag for position-based highlighting |
||||||
|
* - Single lowercase ["a", targetAddress] tag (not uppercase/lowercase pairs) |
||||||
|
*/ |
||||||
|
function createMockHighlight( |
||||||
|
id: string, |
||||||
|
highlightedText: string, |
||||||
|
context: string, |
||||||
|
targetAddress: string, |
||||||
|
pubkey: string, |
||||||
|
createdAt: number, |
||||||
|
authorPubkey: string, |
||||||
|
annotation?: string | null, |
||||||
|
offsetStart?: number, |
||||||
|
offsetEnd?: number, |
||||||
|
): any { |
||||||
|
const tags: string[][] = [ |
||||||
|
["a", targetAddress, "wss://relay.damus.io"], |
||||||
|
["context", context], |
||||||
|
["p", authorPubkey, "wss://relay.damus.io", "author"], |
||||||
|
]; |
||||||
|
|
||||||
|
// Add optional annotation
|
||||||
|
if (annotation) { |
||||||
|
tags.push(["comment", annotation]); |
||||||
|
} |
||||||
|
|
||||||
|
// Add optional offset for position-based highlighting
|
||||||
|
if (offsetStart !== undefined && offsetEnd !== undefined) { |
||||||
|
tags.push(["offset", offsetStart.toString(), offsetEnd.toString()]); |
||||||
|
} |
||||||
|
|
||||||
|
return { |
||||||
|
id, |
||||||
|
kind: 9802, |
||||||
|
pubkey, |
||||||
|
created_at: createdAt, |
||||||
|
content: highlightedText, // The highlighted text itself
|
||||||
|
tags, |
||||||
|
sig: "mock-signature-" + id, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate mock highlights for a section |
||||||
|
* @param sectionAddress - The section address to attach highlights to |
||||||
|
* @param authorPubkey - The author's pubkey (for the "p" tag) |
||||||
|
* @param numHighlights - Number of highlights to generate (default: 3-5 random) |
||||||
|
* @returns Array of mock highlight objects |
||||||
|
*/ |
||||||
|
export function generateMockHighlights( |
||||||
|
sectionAddress: string, |
||||||
|
authorPubkey: string, |
||||||
|
numHighlights: number = Math.floor(Math.random() * 2) + 2, // 2-3 highlights
|
||||||
|
): any[] { |
||||||
|
const highlights: any[] = []; |
||||||
|
const now = Math.floor(Date.now() / 1000); |
||||||
|
|
||||||
|
// Generate position-based highlights at the beginning of each section
|
||||||
|
// For test mode, we use simple placeholder text and rely on offset-based highlighting
|
||||||
|
// The offset tags will highlight the ACTUAL text at those positions in the section
|
||||||
|
|
||||||
|
for (let i = 0; i < numHighlights; i++) { |
||||||
|
const id = `mock-highlight-${i}-${Date.now()}-${ |
||||||
|
Math.random().toString(36).substring(7) |
||||||
|
}`;
|
||||||
|
const highlighterPubkey = mockPubkeys[i % mockPubkeys.length]; |
||||||
|
const annotation = annotations[i % annotations.length]; |
||||||
|
const createdAt = now - (numHighlights - i) * 7200; // Stagger by 2 hours
|
||||||
|
|
||||||
|
// Create sequential highlights at the beginning of the section
|
||||||
|
// Each highlight is exactly 100 characters
|
||||||
|
const highlightLength = 100; |
||||||
|
const offsetStart = i * 120; // Space between highlights (120 chars apart)
|
||||||
|
const offsetEnd = offsetStart + highlightLength; |
||||||
|
|
||||||
|
// Use placeholder text - the actual highlighted text will be determined by the offsets
|
||||||
|
const placeholderText = `Test highlight ${i + 1}`; |
||||||
|
const placeholderContext = `This is test highlight ${ |
||||||
|
i + 1 |
||||||
|
} at position ${offsetStart}-${offsetEnd}`;
|
||||||
|
|
||||||
|
const highlight = createMockHighlight( |
||||||
|
id, |
||||||
|
placeholderText, |
||||||
|
placeholderContext, |
||||||
|
sectionAddress, |
||||||
|
highlighterPubkey, |
||||||
|
createdAt, |
||||||
|
authorPubkey, |
||||||
|
annotation, |
||||||
|
offsetStart, |
||||||
|
offsetEnd, |
||||||
|
); |
||||||
|
|
||||||
|
highlights.push(highlight); |
||||||
|
} |
||||||
|
|
||||||
|
return highlights; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate mock highlights for multiple sections |
||||||
|
* @param sectionAddresses - Array of section addresses |
||||||
|
* @param authorPubkey - The publication author's pubkey |
||||||
|
* @returns Array of all mock highlights across all sections |
||||||
|
*/ |
||||||
|
export function generateMockHighlightsForSections( |
||||||
|
sectionAddresses: string[], |
||||||
|
authorPubkey: string = |
||||||
|
"dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06", |
||||||
|
): any[] { |
||||||
|
const allHighlights: any[] = []; |
||||||
|
|
||||||
|
sectionAddresses.forEach((address, index) => { |
||||||
|
// Each section gets 2 highlights at the very beginning (positions 0-100 and 120-220)
|
||||||
|
const numHighlights = 2; |
||||||
|
const sectionHighlights = generateMockHighlights( |
||||||
|
address, |
||||||
|
authorPubkey, |
||||||
|
numHighlights, |
||||||
|
); |
||||||
|
console.log( |
||||||
|
`[MockHighlightData] Generated ${numHighlights} highlights for section ${ |
||||||
|
address.split(":")[2]?.substring(0, 20) |
||||||
|
}... at positions 0-100, 120-220`,
|
||||||
|
); |
||||||
|
allHighlights.push(...sectionHighlights); |
||||||
|
}); |
||||||
|
|
||||||
|
console.log( |
||||||
|
`[MockHighlightData] Total: ${allHighlights.length} highlights across ${sectionAddresses.length} sections`, |
||||||
|
); |
||||||
|
console.log( |
||||||
|
`[MockHighlightData] Each highlight is anchored to its section via "a" tag and uses offset tags for position`, |
||||||
|
); |
||||||
|
return allHighlights; |
||||||
|
} |
||||||
@ -0,0 +1,420 @@ |
|||||||
|
/** |
||||||
|
* Factory for creating PublicationTree instances from AsciiDoc content |
||||||
|
* |
||||||
|
* This integrates the AST parser with Michael's PublicationTree architecture, |
||||||
|
* providing a clean bridge between AsciiDoc parsing and Nostr event publishing. |
||||||
|
*/ |
||||||
|
|
||||||
|
import { PublicationTree } from "$lib/data_structures/publication_tree"; |
||||||
|
import { SveltePublicationTree } from "$lib/components/publications/svelte_publication_tree.svelte"; |
||||||
|
import { parseAsciiDocAST } from "$lib/utils/asciidoc_ast_parser"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { getMimeTags } from "$lib/utils/mime"; |
||||||
|
|
||||||
|
export interface PublicationTreeFactoryResult { |
||||||
|
tree: PublicationTree; |
||||||
|
svelteTree: SveltePublicationTree; |
||||||
|
indexEvent: NDKEvent | null; |
||||||
|
contentEvents: NDKEvent[]; |
||||||
|
metadata: { |
||||||
|
title: string; |
||||||
|
totalSections: number; |
||||||
|
contentType: "article" | "scattered-notes" | "none"; |
||||||
|
attributes: Record<string, string>; |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a PublicationTree from AsciiDoc content using AST parsing |
||||||
|
* This is the main integration point between AST parsing and PublicationTree |
||||||
|
*/ |
||||||
|
export async function createPublicationTreeFromContent( |
||||||
|
content: string, |
||||||
|
ndk: NDK, |
||||||
|
parseLevel: number = 2, |
||||||
|
): Promise<PublicationTreeFactoryResult> { |
||||||
|
// For preview purposes, we can work without authentication
|
||||||
|
// Authentication is only required for actual publishing
|
||||||
|
const hasActiveUser = !!ndk.activeUser; |
||||||
|
|
||||||
|
// Parse content using AST
|
||||||
|
const parsed = parseAsciiDocAST(content, parseLevel); |
||||||
|
|
||||||
|
// Determine content type
|
||||||
|
const contentType = detectContentType(parsed); |
||||||
|
|
||||||
|
let tree: PublicationTree; |
||||||
|
let indexEvent: NDKEvent | null = null; |
||||||
|
const contentEvents: NDKEvent[] = []; |
||||||
|
|
||||||
|
if (contentType === "article" && parsed.title) { |
||||||
|
// Create hierarchical structure: 30040 index + 30041 content events
|
||||||
|
indexEvent = createIndexEvent(parsed, ndk); |
||||||
|
tree = new PublicationTree(indexEvent, ndk); |
||||||
|
|
||||||
|
// Add content events to tree
|
||||||
|
for (const section of parsed.sections) { |
||||||
|
const contentEvent = createContentEvent(section, parsed, ndk); |
||||||
|
await tree.addEvent(contentEvent, indexEvent); |
||||||
|
contentEvents.push(contentEvent); |
||||||
|
} |
||||||
|
} else if (contentType === "scattered-notes") { |
||||||
|
// Create flat structure: only 30041 events
|
||||||
|
if (parsed.sections.length === 0) { |
||||||
|
throw new Error("No sections found for scattered notes"); |
||||||
|
} |
||||||
|
|
||||||
|
// Use first section as root for tree structure
|
||||||
|
const firstSection = parsed.sections[0]; |
||||||
|
const rootEvent = createContentEvent(firstSection, parsed, ndk); |
||||||
|
tree = new PublicationTree(rootEvent, ndk); |
||||||
|
contentEvents.push(rootEvent); |
||||||
|
|
||||||
|
// Add remaining sections
|
||||||
|
for (let i = 1; i < parsed.sections.length; i++) { |
||||||
|
const contentEvent = createContentEvent(parsed.sections[i], parsed, ndk); |
||||||
|
await tree.addEvent(contentEvent, rootEvent); |
||||||
|
contentEvents.push(contentEvent); |
||||||
|
} |
||||||
|
} else { |
||||||
|
throw new Error("No valid content found to create publication tree"); |
||||||
|
} |
||||||
|
|
||||||
|
// Create reactive Svelte wrapper
|
||||||
|
const svelteTree = new SveltePublicationTree( |
||||||
|
indexEvent || contentEvents[0], |
||||||
|
ndk, |
||||||
|
); |
||||||
|
|
||||||
|
return { |
||||||
|
tree, |
||||||
|
svelteTree, |
||||||
|
indexEvent, |
||||||
|
contentEvents, |
||||||
|
metadata: { |
||||||
|
title: parsed.title, |
||||||
|
totalSections: parsed.sections.length, |
||||||
|
contentType, |
||||||
|
attributes: parsed.attributes, |
||||||
|
}, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a 30040 index event from parsed document |
||||||
|
*/ |
||||||
|
function createIndexEvent(parsed: any, ndk: NDK): NDKEvent { |
||||||
|
const event = new NDKEvent(ndk); |
||||||
|
event.kind = 30040; |
||||||
|
event.created_at = Math.floor(Date.now() / 1000); |
||||||
|
// Use placeholder pubkey for preview if no active user
|
||||||
|
event.pubkey = ndk.activeUser?.pubkey || "preview-placeholder-pubkey"; |
||||||
|
|
||||||
|
// Generate d-tag from title
|
||||||
|
const dTag = generateDTag(parsed.title); |
||||||
|
const [mTag, MTag] = getMimeTags(30040); |
||||||
|
|
||||||
|
const tags: string[][] = [["d", dTag], mTag, MTag, ["title", parsed.title]]; |
||||||
|
|
||||||
|
// Add document attributes as tags
|
||||||
|
addDocumentAttributesToTags(tags, parsed.attributes, event.pubkey); |
||||||
|
|
||||||
|
// Generate publication abbreviation for namespacing sections
|
||||||
|
const pubAbbrev = generateTitleAbbreviation(parsed.title); |
||||||
|
|
||||||
|
// Add a-tags for each section (30041 references)
|
||||||
|
// Using new format: kind:pubkey:{abbv}-{section-d-tag}
|
||||||
|
parsed.sections.forEach((section: any) => { |
||||||
|
const sectionDTag = generateDTag(section.title); |
||||||
|
const namespacedDTag = `${pubAbbrev}-${sectionDTag}`; |
||||||
|
tags.push(["a", `30041:${event.pubkey}:${namespacedDTag}`]); |
||||||
|
}); |
||||||
|
|
||||||
|
event.tags = tags; |
||||||
|
event.content = parsed.content || generateIndexContent(parsed); |
||||||
|
|
||||||
|
return event; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Create a 30041 content event from parsed section |
||||||
|
*/ |
||||||
|
function createContentEvent( |
||||||
|
section: any, |
||||||
|
documentParsed: any, |
||||||
|
ndk: NDK, |
||||||
|
): NDKEvent { |
||||||
|
const event = new NDKEvent(ndk); |
||||||
|
event.kind = 30041; |
||||||
|
event.created_at = Math.floor(Date.now() / 1000); |
||||||
|
|
||||||
|
// Use placeholder pubkey for preview if no active user
|
||||||
|
event.pubkey = ndk.activeUser?.pubkey || "preview-placeholder-pubkey"; |
||||||
|
|
||||||
|
// Generate namespaced d-tag using publication abbreviation
|
||||||
|
const sectionDTag = generateDTag(section.title); |
||||||
|
const pubAbbrev = generateTitleAbbreviation(documentParsed.title); |
||||||
|
const namespacedDTag = `${pubAbbrev}-${sectionDTag}`; |
||||||
|
|
||||||
|
const [mTag, MTag] = getMimeTags(30041); |
||||||
|
|
||||||
|
const tags: string[][] = [ |
||||||
|
["d", namespacedDTag], |
||||||
|
mTag, |
||||||
|
MTag, |
||||||
|
["title", section.title], |
||||||
|
]; |
||||||
|
|
||||||
|
// Add section-specific attributes
|
||||||
|
addSectionAttributesToTags(tags, section.attributes); |
||||||
|
|
||||||
|
// Add document-level attributes that should be inherited
|
||||||
|
inheritDocumentAttributes(tags, documentParsed.attributes); |
||||||
|
|
||||||
|
event.tags = tags; |
||||||
|
event.content = section.content || ""; |
||||||
|
|
||||||
|
return event; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Detect content type based on parsed structure |
||||||
|
*/ |
||||||
|
function detectContentType( |
||||||
|
parsed: any, |
||||||
|
): "article" | "scattered-notes" | "none" { |
||||||
|
const hasDocTitle = !!parsed.title; |
||||||
|
const hasSections = parsed.sections.length > 0; |
||||||
|
|
||||||
|
// Check if the "title" is actually just the first section title
|
||||||
|
// This happens when AsciiDoc starts with == instead of =
|
||||||
|
const titleMatchesFirstSection = parsed.sections.length > 0 && |
||||||
|
parsed.title === parsed.sections[0].title; |
||||||
|
|
||||||
|
if (hasDocTitle && hasSections && !titleMatchesFirstSection) { |
||||||
|
return "article"; |
||||||
|
} else if (hasSections) { |
||||||
|
return "scattered-notes"; |
||||||
|
} |
||||||
|
|
||||||
|
return "none"; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate deterministic d-tag from title |
||||||
|
*/ |
||||||
|
function generateDTag(title: string): string { |
||||||
|
return ( |
||||||
|
title |
||||||
|
.toLowerCase() |
||||||
|
.replace(/[^\p{L}\p{N}]/gu, "-") |
||||||
|
.replace(/-+/g, "-") |
||||||
|
.replace(/^-|-$/g, "") || "untitled" |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate title abbreviation from first letters of each word |
||||||
|
* Used for namespacing section a-tags |
||||||
|
* @param title - The publication title |
||||||
|
* @returns Abbreviation string (e.g., "My Test Article" → "mta") |
||||||
|
*/ |
||||||
|
function generateTitleAbbreviation(title: string): string { |
||||||
|
if (!title || !title.trim()) { |
||||||
|
return "u"; // "untitled"
|
||||||
|
} |
||||||
|
|
||||||
|
// Split on non-alphanumeric characters and filter out empty strings
|
||||||
|
const words = title |
||||||
|
.split(/[^\p{L}\p{N}]+/u) |
||||||
|
.filter((word) => word.length > 0); |
||||||
|
|
||||||
|
if (words.length === 0) { |
||||||
|
return "u"; |
||||||
|
} |
||||||
|
|
||||||
|
// Take first letter of each word and join
|
||||||
|
return words |
||||||
|
.map((word) => word.charAt(0).toLowerCase()) |
||||||
|
.join(""); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Add document attributes as Nostr tags |
||||||
|
*/ |
||||||
|
function addDocumentAttributesToTags( |
||||||
|
tags: string[][], |
||||||
|
attributes: Record<string, string>, |
||||||
|
pubkey: string, |
||||||
|
) { |
||||||
|
// Standard metadata
|
||||||
|
if (attributes.author) tags.push(["author", attributes.author]); |
||||||
|
if (attributes.version) tags.push(["version", attributes.version]); |
||||||
|
if (attributes.published) tags.push(["published", attributes.published]); |
||||||
|
if (attributes.language) tags.push(["language", attributes.language]); |
||||||
|
if (attributes.image) tags.push(["image", attributes.image]); |
||||||
|
if (attributes.description) tags.push(["summary", attributes.description]); |
||||||
|
|
||||||
|
// Tags
|
||||||
|
if (attributes.tags) { |
||||||
|
attributes.tags.split(",").forEach((tag) => tags.push(["t", tag.trim()])); |
||||||
|
} |
||||||
|
|
||||||
|
// Add pubkey reference
|
||||||
|
tags.push(["p", pubkey]); |
||||||
|
|
||||||
|
// Custom attributes (filtered)
|
||||||
|
addCustomAttributes(tags, attributes); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Add section-specific attributes as tags |
||||||
|
*/ |
||||||
|
function addSectionAttributesToTags( |
||||||
|
tags: string[][], |
||||||
|
attributes: Record<string, string>, |
||||||
|
) { |
||||||
|
addCustomAttributes(tags, attributes); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Inherit relevant document attributes for content events |
||||||
|
*/ |
||||||
|
function inheritDocumentAttributes( |
||||||
|
tags: string[][], |
||||||
|
documentAttributes: Record<string, string>, |
||||||
|
) { |
||||||
|
// Inherit selected document attributes
|
||||||
|
if (documentAttributes.language) { |
||||||
|
tags.push(["language", documentAttributes.language]); |
||||||
|
} |
||||||
|
if (documentAttributes.type) tags.push(["type", documentAttributes.type]); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Add custom attributes, filtering out system ones |
||||||
|
*/ |
||||||
|
function addCustomAttributes( |
||||||
|
tags: string[][], |
||||||
|
attributes: Record<string, string>, |
||||||
|
) { |
||||||
|
const systemAttributes = [ |
||||||
|
"attribute-undefined", |
||||||
|
"attribute-missing", |
||||||
|
"appendix-caption", |
||||||
|
"appendix-refsig", |
||||||
|
"caution-caption", |
||||||
|
"chapter-refsig", |
||||||
|
"example-caption", |
||||||
|
"figure-caption", |
||||||
|
"important-caption", |
||||||
|
"last-update-label", |
||||||
|
"manname-title", |
||||||
|
"note-caption", |
||||||
|
"part-refsig", |
||||||
|
"preface-title", |
||||||
|
"section-refsig", |
||||||
|
"table-caption", |
||||||
|
"tip-caption", |
||||||
|
"toc-title", |
||||||
|
"untitled-label", |
||||||
|
"version-label", |
||||||
|
"warning-caption", |
||||||
|
"asciidoctor", |
||||||
|
"asciidoctor-version", |
||||||
|
"safe-mode-name", |
||||||
|
"backend", |
||||||
|
"doctype", |
||||||
|
"basebackend", |
||||||
|
"filetype", |
||||||
|
"outfilesuffix", |
||||||
|
"stylesdir", |
||||||
|
"iconsdir", |
||||||
|
"localdate", |
||||||
|
"localyear", |
||||||
|
"localtime", |
||||||
|
"localdatetime", |
||||||
|
"docdate", |
||||||
|
"docyear", |
||||||
|
"doctime", |
||||||
|
"docdatetime", |
||||||
|
"doctitle", |
||||||
|
"embedded", |
||||||
|
"notitle", |
||||||
|
// Already handled above
|
||||||
|
"author", |
||||||
|
"version", |
||||||
|
"published", |
||||||
|
"language", |
||||||
|
"image", |
||||||
|
"description", |
||||||
|
"tags", |
||||||
|
"title", |
||||||
|
"type", |
||||||
|
]; |
||||||
|
|
||||||
|
Object.entries(attributes).forEach(([key, value]) => { |
||||||
|
if (!systemAttributes.includes(key) && value && typeof value === "string") { |
||||||
|
tags.push([key, value]); |
||||||
|
} |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generate default index content if none provided |
||||||
|
*/ |
||||||
|
function generateIndexContent(parsed: any): string { |
||||||
|
return `# ${parsed.title} |
||||||
|
|
||||||
|
${parsed.sections.length} sections available: |
||||||
|
|
||||||
|
${ |
||||||
|
parsed.sections |
||||||
|
.map((section: any, i: number) => `${i + 1}. ${section.title}`) |
||||||
|
.join("\n") |
||||||
|
}`;
|
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Export events from PublicationTree for publishing |
||||||
|
* This provides compatibility with the current publishing workflow |
||||||
|
*/ |
||||||
|
export async function exportEventsFromTree( |
||||||
|
result: PublicationTreeFactoryResult, |
||||||
|
) { |
||||||
|
const events: any[] = []; |
||||||
|
|
||||||
|
// Add index event if it exists
|
||||||
|
if (result.indexEvent) { |
||||||
|
events.push(eventToPublishableObject(result.indexEvent)); |
||||||
|
} |
||||||
|
|
||||||
|
// Add content events
|
||||||
|
result.contentEvents.forEach((event) => { |
||||||
|
events.push(eventToPublishableObject(event)); |
||||||
|
}); |
||||||
|
|
||||||
|
return { |
||||||
|
indexEvent: result.indexEvent |
||||||
|
? eventToPublishableObject(result.indexEvent) |
||||||
|
: undefined, |
||||||
|
contentEvents: result.contentEvents.map(eventToPublishableObject), |
||||||
|
tree: result.tree, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Convert NDKEvent to publishable object format |
||||||
|
*/ |
||||||
|
function eventToPublishableObject(event: NDKEvent) { |
||||||
|
return { |
||||||
|
kind: event.kind, |
||||||
|
content: event.content, |
||||||
|
tags: event.tags, |
||||||
|
created_at: event.created_at, |
||||||
|
pubkey: event.pubkey, |
||||||
|
id: event.id, |
||||||
|
title: event.tags.find((t) => t[0] === "title")?.[1] || "Untitled", |
||||||
|
}; |
||||||
|
} |
||||||
@ -0,0 +1,144 @@ |
|||||||
|
/** |
||||||
|
* Wiki link parsing and tag generation utilities |
||||||
|
* Supports [[term]], [[w:term]], and [[d:term]] syntax |
||||||
|
*/ |
||||||
|
|
||||||
|
export interface WikiLink { |
||||||
|
fullMatch: string; |
||||||
|
type: "w" | "d" | "auto"; // auto means [[term]] without explicit prefix
|
||||||
|
term: string; |
||||||
|
displayText: string; |
||||||
|
startIndex: number; |
||||||
|
endIndex: number; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Extracts all wiki links from AsciiDoc content. |
||||||
|
* Supports three formats: |
||||||
|
* - [[term]] - Auto (will query both w and d tags) |
||||||
|
* - [[w:term]] - Explicit reference/mention (backward link) |
||||||
|
* - [[d:term]] - Explicit definition (forward link) |
||||||
|
*/ |
||||||
|
export function extractWikiLinks(content: string): WikiLink[] { |
||||||
|
const wikiLinks: WikiLink[] = []; |
||||||
|
|
||||||
|
// Match [[prefix:term]] or [[term]]
|
||||||
|
// Captures: optional prefix (w: or d:), term, optional display text after |
|
||||||
|
const regex = /\[\[(?:(w|d):)?([^\]|]+)(?:\|([^\]]+))?\]\]/g; |
||||||
|
|
||||||
|
let match; |
||||||
|
while ((match = regex.exec(content)) !== null) { |
||||||
|
const prefix = match[1]; // 'w', 'd', or undefined
|
||||||
|
const term = match[2].trim(); |
||||||
|
const customDisplay = match[3]?.trim(); |
||||||
|
|
||||||
|
wikiLinks.push({ |
||||||
|
fullMatch: match[0], |
||||||
|
type: prefix ? (prefix as "w" | "d") : "auto", |
||||||
|
term, |
||||||
|
displayText: customDisplay || term, |
||||||
|
startIndex: match.index, |
||||||
|
endIndex: match.index + match[0].length, |
||||||
|
}); |
||||||
|
} |
||||||
|
|
||||||
|
return wikiLinks; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Converts a term to a clean tag format (lowercase, hyphenated). |
||||||
|
* Example: "Knowledge Graphs" -> "knowledge-graphs" |
||||||
|
*/ |
||||||
|
export function termToTag(term: string): string { |
||||||
|
return term |
||||||
|
.toLowerCase() |
||||||
|
.trim() |
||||||
|
.replace(/\s+/g, "-") |
||||||
|
.replace(/[^a-z0-9-]/g, ""); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Generates Nostr event tags from wiki links. |
||||||
|
* Format: ['w', 'tag-slug', 'Display Text'] or ['d', 'tag-slug'] |
||||||
|
*/ |
||||||
|
export function wikiLinksToTags(wikiLinks: WikiLink[]): string[][] { |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
for (const link of wikiLinks) { |
||||||
|
const tagSlug = termToTag(link.term); |
||||||
|
|
||||||
|
if (link.type === "w" || link.type === "auto") { |
||||||
|
// Reference tag includes display text
|
||||||
|
tags.push(["w", tagSlug, link.displayText]); |
||||||
|
} |
||||||
|
|
||||||
|
if (link.type === "d") { |
||||||
|
// Definition tag (no display text, it IS the thing)
|
||||||
|
tags.push(["d", tagSlug]); |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
return tags; |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Replaces wiki link syntax with HTML for preview rendering. |
||||||
|
* Can be customized for different rendering styles. |
||||||
|
*/ |
||||||
|
export function renderWikiLinksToHtml( |
||||||
|
content: string, |
||||||
|
options: { |
||||||
|
linkClass?: string; |
||||||
|
wLinkClass?: string; |
||||||
|
dLinkClass?: string; |
||||||
|
onClickHandler?: (type: "w" | "d" | "auto", term: string) => string; |
||||||
|
} = {}, |
||||||
|
): string { |
||||||
|
const { |
||||||
|
linkClass = "wiki-link", |
||||||
|
wLinkClass = "wiki-link-reference", |
||||||
|
dLinkClass = "wiki-link-definition", |
||||||
|
onClickHandler, |
||||||
|
} = options; |
||||||
|
|
||||||
|
return content.replace( |
||||||
|
/\[\[(?:(w|d):)?([^\]|]+)(?:\|([^\]]+))?\]\]/g, |
||||||
|
(match, prefix, term, customDisplay) => { |
||||||
|
const displayText = customDisplay?.trim() || term.trim(); |
||||||
|
const type = prefix ? prefix : "auto"; |
||||||
|
const tagSlug = termToTag(term); |
||||||
|
|
||||||
|
// Determine CSS classes
|
||||||
|
let classes = linkClass; |
||||||
|
if (type === "w") classes += ` ${wLinkClass}`; |
||||||
|
else if (type === "d") classes += ` ${dLinkClass}`; |
||||||
|
|
||||||
|
// Generate href or onclick
|
||||||
|
const action = onClickHandler |
||||||
|
? `onclick="${onClickHandler(type, tagSlug)}"` |
||||||
|
: `href="#wiki/${type}/${encodeURIComponent(tagSlug)}"`; |
||||||
|
|
||||||
|
// Add title attribute showing the type
|
||||||
|
const title = type === "w" |
||||||
|
? "Wiki reference (mentions this concept)" |
||||||
|
: type === "d" |
||||||
|
? "Wiki definition (defines this concept)" |
||||||
|
: "Wiki link (searches both references and definitions)"; |
||||||
|
|
||||||
|
return `<a class="${classes}" ${action} title="${title}" data-wiki-type="${type}" data-wiki-term="${tagSlug}">${displayText}</a>`; |
||||||
|
}, |
||||||
|
); |
||||||
|
} |
||||||
|
|
||||||
|
/** |
||||||
|
* Converts wiki links to plain text (for content storage). |
||||||
|
* Preserves the display text if custom, otherwise uses the term. |
||||||
|
*/ |
||||||
|
export function wikiLinksToPlainText(content: string): string { |
||||||
|
return content.replace( |
||||||
|
/\[\[(?:w|d:)?([^\]|]+)(?:\|([^\]]+))?\]\]/g, |
||||||
|
(match, term, customDisplay) => { |
||||||
|
return customDisplay?.trim() || term.trim(); |
||||||
|
}, |
||||||
|
); |
||||||
|
} |
||||||
@ -0,0 +1,906 @@ |
|||||||
|
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; |
||||||
|
import { get, writable } from "svelte/store"; |
||||||
|
import type { UserState } from "../../src/lib/stores/userStore.ts"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
// Mock userStore
|
||||||
|
const createMockUserStore = (signedIn: boolean = false) => { |
||||||
|
const store = writable<UserState>({ |
||||||
|
pubkey: signedIn ? "a".repeat(64) : null, |
||||||
|
npub: signedIn ? "npub1test" : null, |
||||||
|
profile: signedIn |
||||||
|
? { |
||||||
|
name: "Test User", |
||||||
|
displayName: "Test User", |
||||||
|
picture: "https://example.com/avatar.jpg", |
||||||
|
} |
||||||
|
: null, |
||||||
|
relays: { inbox: [], outbox: [] }, |
||||||
|
loginMethod: signedIn ? "extension" : null, |
||||||
|
ndkUser: null, |
||||||
|
signer: signedIn ? { sign: vi.fn() } as any : null, |
||||||
|
signedIn, |
||||||
|
}); |
||||||
|
return store; |
||||||
|
}; |
||||||
|
|
||||||
|
// Mock activeOutboxRelays
|
||||||
|
const mockActiveOutboxRelays = writable<string[]>(["wss://relay.example.com"]); |
||||||
|
|
||||||
|
// Mock NDK
|
||||||
|
const createMockNDK = () => ({ |
||||||
|
fetchEvent: vi.fn(), |
||||||
|
publish: vi.fn(), |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Address Parsing", () => { |
||||||
|
it("parses valid event address correctly", () => { |
||||||
|
const address = "30041:abc123def456:my-article"; |
||||||
|
const parts = address.split(":"); |
||||||
|
|
||||||
|
expect(parts).toHaveLength(3); |
||||||
|
|
||||||
|
const [kindStr, pubkey, dTag] = parts; |
||||||
|
const kind = parseInt(kindStr); |
||||||
|
|
||||||
|
expect(kind).toBe(30041); |
||||||
|
expect(pubkey).toBe("abc123def456"); |
||||||
|
expect(dTag).toBe("my-article"); |
||||||
|
expect(isNaN(kind)).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles dTag with colons correctly", () => { |
||||||
|
const address = "30041:abc123:article:with:colons"; |
||||||
|
const parts = address.split(":"); |
||||||
|
|
||||||
|
expect(parts.length).toBeGreaterThanOrEqual(3); |
||||||
|
|
||||||
|
const [kindStr, pubkey, ...dTagParts] = parts; |
||||||
|
const dTag = dTagParts.join(":"); |
||||||
|
|
||||||
|
expect(parseInt(kindStr)).toBe(30041); |
||||||
|
expect(pubkey).toBe("abc123"); |
||||||
|
expect(dTag).toBe("article:with:colons"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("returns null for invalid address format (too few parts)", () => { |
||||||
|
const address = "30041:abc123"; |
||||||
|
const parts = address.split(":"); |
||||||
|
|
||||||
|
if (parts.length !== 3) { |
||||||
|
expect(parts.length).toBeLessThan(3); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
it("returns null for invalid address format (invalid kind)", () => { |
||||||
|
const address = "invalid:abc123:dtag"; |
||||||
|
const parts = address.split(":"); |
||||||
|
const kind = parseInt(parts[0]); |
||||||
|
|
||||||
|
expect(isNaN(kind)).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("parses different publication kinds correctly", () => { |
||||||
|
const addresses = [ |
||||||
|
"30040:pubkey:section-id", // Zettel section
|
||||||
|
"30041:pubkey:article-id", // Long-form article
|
||||||
|
"30818:pubkey:wiki-id", // Wiki article
|
||||||
|
"30023:pubkey:blog-id", // Blog post
|
||||||
|
]; |
||||||
|
|
||||||
|
addresses.forEach((address) => { |
||||||
|
const parts = address.split(":"); |
||||||
|
const kind = parseInt(parts[0]); |
||||||
|
|
||||||
|
expect(isNaN(kind)).toBe(false); |
||||||
|
expect(kind).toBeGreaterThan(0); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - NIP-22 Event Creation", () => { |
||||||
|
let mockNDK: any; |
||||||
|
let mockUserStore: any; |
||||||
|
let mockActiveOutboxRelays: any; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
mockNDK = createMockNDK(); |
||||||
|
mockUserStore = createMockUserStore(true); |
||||||
|
mockActiveOutboxRelays = writable(["wss://relay.example.com"]); |
||||||
|
}); |
||||||
|
|
||||||
|
afterEach(() => { |
||||||
|
vi.clearAllMocks(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("creates kind 1111 comment event", async () => { |
||||||
|
const address = "30041:" + "a".repeat(64) + ":my-article"; |
||||||
|
const content = "This is my comment"; |
||||||
|
|
||||||
|
// Mock event creation
|
||||||
|
const commentEvent = new NDKEvent(mockNDK); |
||||||
|
commentEvent.kind = 1111; |
||||||
|
commentEvent.content = content; |
||||||
|
|
||||||
|
expect(commentEvent.kind).toBe(1111); |
||||||
|
expect(commentEvent.content).toBe(content); |
||||||
|
}); |
||||||
|
|
||||||
|
it("includes correct uppercase tags (A, K, P) for root", () => { |
||||||
|
const address = "30041:" + "b".repeat(64) + ":article-id"; |
||||||
|
const authorPubkey = "b".repeat(64); |
||||||
|
const kind = 30041; |
||||||
|
const relayHint = "wss://relay.example.com"; |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
["A", address, relayHint, authorPubkey], |
||||||
|
["K", kind.toString()], |
||||||
|
["P", authorPubkey, relayHint], |
||||||
|
]; |
||||||
|
|
||||||
|
// Verify uppercase root tags
|
||||||
|
expect(tags[0][0]).toBe("A"); |
||||||
|
expect(tags[0][1]).toBe(address); |
||||||
|
expect(tags[0][2]).toBe(relayHint); |
||||||
|
expect(tags[0][3]).toBe(authorPubkey); |
||||||
|
|
||||||
|
expect(tags[1][0]).toBe("K"); |
||||||
|
expect(tags[1][1]).toBe(kind.toString()); |
||||||
|
|
||||||
|
expect(tags[2][0]).toBe("P"); |
||||||
|
expect(tags[2][1]).toBe(authorPubkey); |
||||||
|
expect(tags[2][2]).toBe(relayHint); |
||||||
|
}); |
||||||
|
|
||||||
|
it("includes correct lowercase tags (a, k, p) for parent", () => { |
||||||
|
const address = "30041:" + "c".repeat(64) + ":article-id"; |
||||||
|
const authorPubkey = "c".repeat(64); |
||||||
|
const kind = 30041; |
||||||
|
const relayHint = "wss://relay.example.com"; |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
["a", address, relayHint], |
||||||
|
["k", kind.toString()], |
||||||
|
["p", authorPubkey, relayHint], |
||||||
|
]; |
||||||
|
|
||||||
|
// Verify lowercase parent tags
|
||||||
|
expect(tags[0][0]).toBe("a"); |
||||||
|
expect(tags[0][1]).toBe(address); |
||||||
|
expect(tags[0][2]).toBe(relayHint); |
||||||
|
|
||||||
|
expect(tags[1][0]).toBe("k"); |
||||||
|
expect(tags[1][1]).toBe(kind.toString()); |
||||||
|
|
||||||
|
expect(tags[2][0]).toBe("p"); |
||||||
|
expect(tags[2][1]).toBe(authorPubkey); |
||||||
|
expect(tags[2][2]).toBe(relayHint); |
||||||
|
}); |
||||||
|
|
||||||
|
it("includes e tag with event ID when available", () => { |
||||||
|
const eventId = "d".repeat(64); |
||||||
|
const relayHint = "wss://relay.example.com"; |
||||||
|
|
||||||
|
const eTag = ["e", eventId, relayHint]; |
||||||
|
|
||||||
|
expect(eTag[0]).toBe("e"); |
||||||
|
expect(eTag[1]).toBe(eventId); |
||||||
|
expect(eTag[2]).toBe(relayHint); |
||||||
|
expect(eTag[1]).toHaveLength(64); |
||||||
|
}); |
||||||
|
|
||||||
|
it("creates complete NIP-22 tag structure", () => { |
||||||
|
const address = "30041:" + "e".repeat(64) + ":test-article"; |
||||||
|
const authorPubkey = "e".repeat(64); |
||||||
|
const kind = 30041; |
||||||
|
const eventId = "f".repeat(64); |
||||||
|
const relayHint = "wss://relay.example.com"; |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
// Root scope - uppercase tags
|
||||||
|
["A", address, relayHint, authorPubkey], |
||||||
|
["K", kind.toString()], |
||||||
|
["P", authorPubkey, relayHint], |
||||||
|
|
||||||
|
// Parent scope - lowercase tags
|
||||||
|
["a", address, relayHint], |
||||||
|
["k", kind.toString()], |
||||||
|
["p", authorPubkey, relayHint], |
||||||
|
|
||||||
|
// Event ID
|
||||||
|
["e", eventId, relayHint], |
||||||
|
]; |
||||||
|
|
||||||
|
// Verify all tags are present
|
||||||
|
expect(tags).toHaveLength(7); |
||||||
|
|
||||||
|
// Verify root tags
|
||||||
|
expect(tags.filter((t) => t[0] === "A")).toHaveLength(1); |
||||||
|
expect(tags.filter((t) => t[0] === "K")).toHaveLength(1); |
||||||
|
expect(tags.filter((t) => t[0] === "P")).toHaveLength(1); |
||||||
|
|
||||||
|
// Verify parent tags
|
||||||
|
expect(tags.filter((t) => t[0] === "a")).toHaveLength(1); |
||||||
|
expect(tags.filter((t) => t[0] === "k")).toHaveLength(1); |
||||||
|
expect(tags.filter((t) => t[0] === "p")).toHaveLength(1); |
||||||
|
|
||||||
|
// Verify event tag
|
||||||
|
expect(tags.filter((t) => t[0] === "e")).toHaveLength(1); |
||||||
|
}); |
||||||
|
|
||||||
|
it("uses correct relay hints from activeOutboxRelays", () => { |
||||||
|
const relays: string[] = get(mockActiveOutboxRelays); |
||||||
|
const relayHint = relays[0]; |
||||||
|
|
||||||
|
expect(relayHint).toBe("wss://relay.example.com"); |
||||||
|
expect(relays).toHaveLength(1); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles multiple outbox relays correctly", () => { |
||||||
|
const multipleRelays = writable([ |
||||||
|
"wss://relay1.example.com", |
||||||
|
"wss://relay2.example.com", |
||||||
|
"wss://relay3.example.com", |
||||||
|
]); |
||||||
|
|
||||||
|
const relays = get(multipleRelays); |
||||||
|
const relayHint = relays[0]; |
||||||
|
|
||||||
|
expect(relayHint).toBe("wss://relay1.example.com"); |
||||||
|
expect(relays).toHaveLength(3); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles empty relay list gracefully", () => { |
||||||
|
const emptyRelays = writable<string[]>([]); |
||||||
|
const relays = get(emptyRelays); |
||||||
|
const relayHint = relays[0] || ""; |
||||||
|
|
||||||
|
expect(relayHint).toBe(""); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Event Signing and Publishing", () => { |
||||||
|
let mockNDK: any; |
||||||
|
let mockSigner: any; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
mockNDK = createMockNDK(); |
||||||
|
mockSigner = { |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
}; |
||||||
|
}); |
||||||
|
|
||||||
|
afterEach(() => { |
||||||
|
vi.clearAllMocks(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("signs event with user signer", async () => { |
||||||
|
const commentEvent = new NDKEvent(mockNDK); |
||||||
|
commentEvent.kind = 1111; |
||||||
|
commentEvent.content = "Test comment"; |
||||||
|
|
||||||
|
await mockSigner.sign(commentEvent); |
||||||
|
|
||||||
|
expect(mockSigner.sign).toHaveBeenCalledWith(commentEvent); |
||||||
|
expect(mockSigner.sign).toHaveBeenCalledTimes(1); |
||||||
|
}); |
||||||
|
|
||||||
|
it("publishes to outbox relays", async () => { |
||||||
|
const publishMock = vi.fn().mockResolvedValue( |
||||||
|
new Set(["wss://relay.example.com"]), |
||||||
|
); |
||||||
|
|
||||||
|
const commentEvent = new NDKEvent(mockNDK); |
||||||
|
commentEvent.publish = publishMock; |
||||||
|
|
||||||
|
const publishedRelays = await commentEvent.publish(); |
||||||
|
|
||||||
|
expect(publishMock).toHaveBeenCalled(); |
||||||
|
expect(publishedRelays.size).toBeGreaterThan(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles publishing errors gracefully", async () => { |
||||||
|
const publishMock = vi.fn().mockResolvedValue(new Set()); |
||||||
|
|
||||||
|
const commentEvent = new NDKEvent(mockNDK); |
||||||
|
commentEvent.publish = publishMock; |
||||||
|
|
||||||
|
const publishedRelays = await commentEvent.publish(); |
||||||
|
|
||||||
|
expect(publishedRelays.size).toBe(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("throws error when publishing fails", async () => { |
||||||
|
const publishMock = vi.fn().mockRejectedValue(new Error("Network error")); |
||||||
|
|
||||||
|
const commentEvent = new NDKEvent(mockNDK); |
||||||
|
commentEvent.publish = publishMock; |
||||||
|
|
||||||
|
await expect(commentEvent.publish()).rejects.toThrow("Network error"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - User Authentication", () => { |
||||||
|
it("requires user to be signed in", () => { |
||||||
|
const signedOutStore = createMockUserStore(false); |
||||||
|
const user = get(signedOutStore); |
||||||
|
|
||||||
|
expect(user.signedIn).toBe(false); |
||||||
|
expect(user.signer).toBeNull(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("shows error when user is not signed in", () => { |
||||||
|
const signedOutStore = createMockUserStore(false); |
||||||
|
const user = get(signedOutStore); |
||||||
|
|
||||||
|
if (!user.signedIn || !user.signer) { |
||||||
|
const error = "You must be signed in to comment"; |
||||||
|
expect(error).toBe("You must be signed in to comment"); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
it("allows commenting when user is signed in", () => { |
||||||
|
const signedInStore = createMockUserStore(true); |
||||||
|
const user = get(signedInStore); |
||||||
|
|
||||||
|
expect(user.signedIn).toBe(true); |
||||||
|
expect(user.signer).not.toBeNull(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("displays user profile information when signed in", () => { |
||||||
|
const signedInStore = createMockUserStore(true); |
||||||
|
const user = get(signedInStore); |
||||||
|
|
||||||
|
expect(user.profile).not.toBeNull(); |
||||||
|
expect(user.profile?.displayName).toBe("Test User"); |
||||||
|
expect(user.profile?.picture).toBe("https://example.com/avatar.jpg"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles missing user profile gracefully", () => { |
||||||
|
const storeWithoutProfile = writable<UserState>({ |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
npub: "npub1test", |
||||||
|
profile: null, |
||||||
|
relays: { inbox: [], outbox: [] }, |
||||||
|
loginMethod: "extension", |
||||||
|
ndkUser: null, |
||||||
|
signer: { sign: vi.fn() } as any, |
||||||
|
signedIn: true, |
||||||
|
}); |
||||||
|
|
||||||
|
const user = get(storeWithoutProfile); |
||||||
|
const displayName = user.profile?.displayName || user.profile?.name || |
||||||
|
"Anonymous"; |
||||||
|
|
||||||
|
expect(displayName).toBe("Anonymous"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - User Interactions", () => { |
||||||
|
it("prevents submission of empty comment", () => { |
||||||
|
const commentContent = ""; |
||||||
|
const isEmpty = !commentContent.trim(); |
||||||
|
|
||||||
|
expect(isEmpty).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("allows submission of non-empty comment", () => { |
||||||
|
const commentContent = "This is a valid comment"; |
||||||
|
const isEmpty = !commentContent.trim(); |
||||||
|
|
||||||
|
expect(isEmpty).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles whitespace-only comments as empty", () => { |
||||||
|
const commentContent = " \n\t "; |
||||||
|
const isEmpty = !commentContent.trim(); |
||||||
|
|
||||||
|
expect(isEmpty).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("clears input after successful comment", () => { |
||||||
|
let commentContent = "This is my comment"; |
||||||
|
|
||||||
|
// Simulate successful submission
|
||||||
|
commentContent = ""; |
||||||
|
|
||||||
|
expect(commentContent).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("closes comment UI after successful posting", () => { |
||||||
|
let showCommentUI = true; |
||||||
|
|
||||||
|
// Simulate successful post with delay
|
||||||
|
setTimeout(() => { |
||||||
|
showCommentUI = false; |
||||||
|
}, 0); |
||||||
|
|
||||||
|
// Initially still open
|
||||||
|
expect(showCommentUI).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("calls onCommentPosted callback when provided", () => { |
||||||
|
const onCommentPosted = vi.fn(); |
||||||
|
|
||||||
|
// Simulate successful comment post
|
||||||
|
onCommentPosted(); |
||||||
|
|
||||||
|
expect(onCommentPosted).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - UI State Management", () => { |
||||||
|
it("button is hidden by default", () => { |
||||||
|
const sectionHovered = false; |
||||||
|
const showCommentUI = false; |
||||||
|
const visible = sectionHovered || showCommentUI; |
||||||
|
|
||||||
|
expect(visible).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("button appears on section hover", () => { |
||||||
|
const sectionHovered = true; |
||||||
|
const showCommentUI = false; |
||||||
|
const visible = sectionHovered || showCommentUI; |
||||||
|
|
||||||
|
expect(visible).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("button remains visible when comment UI is shown", () => { |
||||||
|
const sectionHovered = false; |
||||||
|
const showCommentUI = true; |
||||||
|
const visible = sectionHovered || showCommentUI; |
||||||
|
|
||||||
|
expect(visible).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("toggles comment UI when button is clicked", () => { |
||||||
|
let showCommentUI = false; |
||||||
|
|
||||||
|
// Simulate button click
|
||||||
|
showCommentUI = !showCommentUI; |
||||||
|
expect(showCommentUI).toBe(true); |
||||||
|
|
||||||
|
// Click again
|
||||||
|
showCommentUI = !showCommentUI; |
||||||
|
expect(showCommentUI).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("resets error state when toggling UI", () => { |
||||||
|
let error: string | null = "Previous error"; |
||||||
|
let success = true; |
||||||
|
|
||||||
|
// Simulate UI toggle
|
||||||
|
error = null; |
||||||
|
success = false; |
||||||
|
|
||||||
|
expect(error).toBeNull(); |
||||||
|
expect(success).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("shows error message when present", () => { |
||||||
|
const error = "Failed to post comment"; |
||||||
|
|
||||||
|
expect(error).toBeDefined(); |
||||||
|
expect(error.length).toBeGreaterThan(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("shows success message after posting", () => { |
||||||
|
const success = true; |
||||||
|
const successMessage = "Comment posted successfully!"; |
||||||
|
|
||||||
|
if (success) { |
||||||
|
expect(successMessage).toBe("Comment posted successfully!"); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
it("disables submit button when submitting", () => { |
||||||
|
const isSubmitting = true; |
||||||
|
const disabled = isSubmitting; |
||||||
|
|
||||||
|
expect(disabled).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("disables submit button when comment is empty", () => { |
||||||
|
const commentContent = ""; |
||||||
|
const isSubmitting = false; |
||||||
|
const disabled = isSubmitting || !commentContent.trim(); |
||||||
|
|
||||||
|
expect(disabled).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("enables submit button when comment is valid", () => { |
||||||
|
const commentContent = "Valid comment"; |
||||||
|
const isSubmitting = false; |
||||||
|
const disabled = isSubmitting || !commentContent.trim(); |
||||||
|
|
||||||
|
expect(disabled).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Edge Cases", () => { |
||||||
|
it("handles invalid address format gracefully", () => { |
||||||
|
const invalidAddresses = [ |
||||||
|
"", |
||||||
|
"invalid", |
||||||
|
"30041:", |
||||||
|
":pubkey:dtag", |
||||||
|
"30041:pubkey", |
||||||
|
"not-a-number:pubkey:dtag", |
||||||
|
]; |
||||||
|
|
||||||
|
invalidAddresses.forEach((address) => { |
||||||
|
const parts = address.split(":"); |
||||||
|
const isValid = parts.length === 3 && !isNaN(parseInt(parts[0])); |
||||||
|
|
||||||
|
expect(isValid).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles network errors during event fetch", async () => { |
||||||
|
const mockNDK = { |
||||||
|
fetchEvent: vi.fn().mockRejectedValue(new Error("Network error")), |
||||||
|
}; |
||||||
|
|
||||||
|
let eventId = ""; |
||||||
|
try { |
||||||
|
await mockNDK.fetchEvent({}); |
||||||
|
} catch (err) { |
||||||
|
// Handle gracefully, continue without event ID
|
||||||
|
eventId = ""; |
||||||
|
} |
||||||
|
|
||||||
|
expect(eventId).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles missing relay information", () => { |
||||||
|
const emptyRelays: string[] = []; |
||||||
|
const relayHint = emptyRelays[0] || ""; |
||||||
|
|
||||||
|
expect(relayHint).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles very long comment text without truncation", () => { |
||||||
|
const longComment = "a".repeat(10000); |
||||||
|
const content = longComment; |
||||||
|
|
||||||
|
expect(content.length).toBe(10000); |
||||||
|
expect(content).toBe(longComment); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles special characters in comments", () => { |
||||||
|
const specialComments = [ |
||||||
|
'Comment with "quotes"', |
||||||
|
"Comment with emoji 😊", |
||||||
|
"Comment with\nnewlines", |
||||||
|
"Comment with\ttabs", |
||||||
|
"Comment with <html> tags", |
||||||
|
"Comment with & ampersands", |
||||||
|
]; |
||||||
|
|
||||||
|
specialComments.forEach((comment) => { |
||||||
|
expect(comment.length).toBeGreaterThan(0); |
||||||
|
expect(typeof comment).toBe("string"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles event creation failure", async () => { |
||||||
|
const address = "invalid:address"; |
||||||
|
const parts = address.split(":"); |
||||||
|
|
||||||
|
if (parts.length !== 3) { |
||||||
|
const error = "Invalid event address"; |
||||||
|
expect(error).toBe("Invalid event address"); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles signing errors", async () => { |
||||||
|
const mockSigner = { |
||||||
|
sign: vi.fn().mockRejectedValue(new Error("Signing failed")), |
||||||
|
}; |
||||||
|
|
||||||
|
const event = { kind: 1111, content: "test" }; |
||||||
|
|
||||||
|
await expect(mockSigner.sign(event)).rejects.toThrow("Signing failed"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles publish failure when no relays accept event", async () => { |
||||||
|
const publishMock = vi.fn().mockResolvedValue(new Set()); |
||||||
|
|
||||||
|
const relaySet = await publishMock(); |
||||||
|
|
||||||
|
if (relaySet.size === 0) { |
||||||
|
const error = "Failed to publish to any relays"; |
||||||
|
expect(error).toBe("Failed to publish to any relays"); |
||||||
|
} |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Cancel Functionality", () => { |
||||||
|
it("clears comment content when canceling", () => { |
||||||
|
let commentContent = "This comment will be canceled"; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
commentContent = ""; |
||||||
|
|
||||||
|
expect(commentContent).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("closes comment UI when canceling", () => { |
||||||
|
let showCommentUI = true; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
showCommentUI = false; |
||||||
|
|
||||||
|
expect(showCommentUI).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("clears error state when canceling", () => { |
||||||
|
let error: string | null = "Some error"; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
error = null; |
||||||
|
|
||||||
|
expect(error).toBeNull(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("clears success state when canceling", () => { |
||||||
|
let success = true; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
success = false; |
||||||
|
|
||||||
|
expect(success).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Event Fetching", () => { |
||||||
|
let mockNDK: any; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
mockNDK = createMockNDK(); |
||||||
|
}); |
||||||
|
|
||||||
|
afterEach(() => { |
||||||
|
vi.clearAllMocks(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("fetches target event to get event ID", async () => { |
||||||
|
const address = "30041:" + "a".repeat(64) + ":article"; |
||||||
|
const parts = address.split(":"); |
||||||
|
const [kindStr, authorPubkey, dTag] = parts; |
||||||
|
const kind = parseInt(kindStr); |
||||||
|
|
||||||
|
const mockEvent = { |
||||||
|
id: "b".repeat(64), |
||||||
|
kind, |
||||||
|
pubkey: authorPubkey, |
||||||
|
tags: [["d", dTag]], |
||||||
|
}; |
||||||
|
|
||||||
|
mockNDK.fetchEvent.mockResolvedValue(mockEvent); |
||||||
|
|
||||||
|
const targetEvent = await mockNDK.fetchEvent({ |
||||||
|
kinds: [kind], |
||||||
|
authors: [authorPubkey], |
||||||
|
"#d": [dTag], |
||||||
|
}); |
||||||
|
|
||||||
|
expect(mockNDK.fetchEvent).toHaveBeenCalled(); |
||||||
|
expect(targetEvent?.id).toBe("b".repeat(64)); |
||||||
|
}); |
||||||
|
|
||||||
|
it("continues without event ID when fetch fails", async () => { |
||||||
|
mockNDK.fetchEvent.mockRejectedValue(new Error("Fetch failed")); |
||||||
|
|
||||||
|
let eventId = ""; |
||||||
|
try { |
||||||
|
const targetEvent = await mockNDK.fetchEvent({}); |
||||||
|
if (targetEvent) { |
||||||
|
eventId = targetEvent.id; |
||||||
|
} |
||||||
|
} catch (err) { |
||||||
|
// Continue without event ID
|
||||||
|
eventId = ""; |
||||||
|
} |
||||||
|
|
||||||
|
expect(eventId).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles null event from fetch", async () => { |
||||||
|
mockNDK.fetchEvent.mockResolvedValue(null); |
||||||
|
|
||||||
|
const targetEvent = await mockNDK.fetchEvent({}); |
||||||
|
let eventId = ""; |
||||||
|
|
||||||
|
if (targetEvent) { |
||||||
|
eventId = targetEvent.id; |
||||||
|
} |
||||||
|
|
||||||
|
expect(eventId).toBe(""); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - CSS Classes and Styling", () => { |
||||||
|
it("applies visible class when section is hovered", () => { |
||||||
|
const sectionHovered = true; |
||||||
|
const showCommentUI = false; |
||||||
|
const hasVisibleClass = sectionHovered || showCommentUI; |
||||||
|
|
||||||
|
expect(hasVisibleClass).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("removes visible class when not hovered and UI closed", () => { |
||||||
|
const sectionHovered = false; |
||||||
|
const showCommentUI = false; |
||||||
|
const hasVisibleClass = sectionHovered || showCommentUI; |
||||||
|
|
||||||
|
expect(hasVisibleClass).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("button has correct aria-label", () => { |
||||||
|
const ariaLabel = "Add comment"; |
||||||
|
|
||||||
|
expect(ariaLabel).toBe("Add comment"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("button has correct title attribute", () => { |
||||||
|
const title = "Add comment"; |
||||||
|
|
||||||
|
expect(title).toBe("Add comment"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("submit button shows loading state when submitting", () => { |
||||||
|
const isSubmitting = true; |
||||||
|
const buttonText = isSubmitting ? "Posting..." : "Post Comment"; |
||||||
|
|
||||||
|
expect(buttonText).toBe("Posting..."); |
||||||
|
}); |
||||||
|
|
||||||
|
it("submit button shows normal state when not submitting", () => { |
||||||
|
const isSubmitting = false; |
||||||
|
const buttonText = isSubmitting ? "Posting..." : "Post Comment"; |
||||||
|
|
||||||
|
expect(buttonText).toBe("Post Comment"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - NIP-22 Compliance", () => { |
||||||
|
it("uses kind 1111 for comment events", () => { |
||||||
|
const kind = 1111; |
||||||
|
|
||||||
|
expect(kind).toBe(1111); |
||||||
|
}); |
||||||
|
|
||||||
|
it("includes all required NIP-22 tags for addressable events", () => { |
||||||
|
const requiredRootTags = ["A", "K", "P"]; |
||||||
|
const requiredParentTags = ["a", "k", "p"]; |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
["A", "address", "relay", "pubkey"], |
||||||
|
["K", "kind"], |
||||||
|
["P", "pubkey", "relay"], |
||||||
|
["a", "address", "relay"], |
||||||
|
["k", "kind"], |
||||||
|
["p", "pubkey", "relay"], |
||||||
|
]; |
||||||
|
|
||||||
|
requiredRootTags.forEach((tag) => { |
||||||
|
expect(tags.some((t) => t[0] === tag)).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
requiredParentTags.forEach((tag) => { |
||||||
|
expect(tags.some((t) => t[0] === tag)).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("A tag includes relay hint and author pubkey", () => { |
||||||
|
const aTag = ["A", "30041:pubkey:dtag", "wss://relay.com", "pubkey"]; |
||||||
|
|
||||||
|
expect(aTag).toHaveLength(4); |
||||||
|
expect(aTag[0]).toBe("A"); |
||||||
|
expect(aTag[2]).toMatch(/^wss:\/\//); |
||||||
|
expect(aTag[3]).toBeTruthy(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("P tag includes relay hint", () => { |
||||||
|
const pTag = ["P", "pubkey", "wss://relay.com"]; |
||||||
|
|
||||||
|
expect(pTag).toHaveLength(3); |
||||||
|
expect(pTag[0]).toBe("P"); |
||||||
|
expect(pTag[2]).toMatch(/^wss:\/\//); |
||||||
|
}); |
||||||
|
|
||||||
|
it("lowercase tags for parent scope match root tags", () => { |
||||||
|
const address = "30041:pubkey:dtag"; |
||||||
|
const kind = "30041"; |
||||||
|
const pubkey = "pubkey"; |
||||||
|
const relay = "wss://relay.com"; |
||||||
|
|
||||||
|
const rootTags = [ |
||||||
|
["A", address, relay, pubkey], |
||||||
|
["K", kind], |
||||||
|
["P", pubkey, relay], |
||||||
|
]; |
||||||
|
|
||||||
|
const parentTags = [ |
||||||
|
["a", address, relay], |
||||||
|
["k", kind], |
||||||
|
["p", pubkey, relay], |
||||||
|
]; |
||||||
|
|
||||||
|
// Verify parent tags match root tags (lowercase)
|
||||||
|
expect(parentTags[0][1]).toBe(rootTags[0][1]); // address
|
||||||
|
expect(parentTags[1][1]).toBe(rootTags[1][1]); // kind
|
||||||
|
expect(parentTags[2][1]).toBe(rootTags[2][1]); // pubkey
|
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("CommentButton - Integration Scenarios", () => { |
||||||
|
it("complete comment flow for signed-in user", () => { |
||||||
|
const userStore = createMockUserStore(true); |
||||||
|
const user = get(userStore); |
||||||
|
|
||||||
|
// User is signed in
|
||||||
|
expect(user.signedIn).toBe(true); |
||||||
|
|
||||||
|
// Comment content is valid
|
||||||
|
const content = "Great article!"; |
||||||
|
expect(content.trim().length).toBeGreaterThan(0); |
||||||
|
|
||||||
|
// Address is valid
|
||||||
|
const address = "30041:" + "a".repeat(64) + ":article"; |
||||||
|
const parts = address.split(":"); |
||||||
|
expect(parts.length).toBe(3); |
||||||
|
|
||||||
|
// Event would be created with kind 1111
|
||||||
|
const kind = 1111; |
||||||
|
expect(kind).toBe(1111); |
||||||
|
}); |
||||||
|
|
||||||
|
it("prevents comment flow for signed-out user", () => { |
||||||
|
const userStore = createMockUserStore(false); |
||||||
|
const user = get(userStore); |
||||||
|
|
||||||
|
expect(user.signedIn).toBe(false); |
||||||
|
|
||||||
|
if (!user.signedIn) { |
||||||
|
const error = "You must be signed in to comment"; |
||||||
|
expect(error).toBeTruthy(); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles comment with event ID lookup", async () => { |
||||||
|
const mockNDK = createMockNDK(); |
||||||
|
const eventId = "c".repeat(64); |
||||||
|
|
||||||
|
mockNDK.fetchEvent.mockResolvedValue({ id: eventId }); |
||||||
|
|
||||||
|
const targetEvent = await mockNDK.fetchEvent({}); |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
["e", targetEvent.id, "wss://relay.com"], |
||||||
|
]; |
||||||
|
|
||||||
|
expect(tags[0][1]).toBe(eventId); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles comment without event ID lookup", () => { |
||||||
|
const eventId = ""; |
||||||
|
|
||||||
|
const tags = [ |
||||||
|
["A", "address", "relay", "pubkey"], |
||||||
|
["K", "kind"], |
||||||
|
["P", "pubkey", "relay"], |
||||||
|
["a", "address", "relay"], |
||||||
|
["k", "kind"], |
||||||
|
["p", "pubkey", "relay"], |
||||||
|
]; |
||||||
|
|
||||||
|
// No e tag should be included
|
||||||
|
expect(tags.filter((t) => t[0] === "e")).toHaveLength(0); |
||||||
|
|
||||||
|
// But all other required tags should be present
|
||||||
|
expect(tags.length).toBe(6); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,136 @@ |
|||||||
|
import { beforeEach, describe, expect, it, vi } from "vitest"; |
||||||
|
import { |
||||||
|
canDeleteEvent, |
||||||
|
deleteEvent, |
||||||
|
} from "../../src/lib/services/deletion.ts"; |
||||||
|
import NDK, { NDKEvent, NDKRelaySet } from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
describe("Deletion Service", () => { |
||||||
|
let mockNdk: NDK; |
||||||
|
let mockEvent: NDKEvent; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
// Create mock NDK instance
|
||||||
|
mockNdk = { |
||||||
|
activeUser: { |
||||||
|
pubkey: "test-pubkey-123", |
||||||
|
}, |
||||||
|
pool: { |
||||||
|
relays: new Map([ |
||||||
|
["wss://relay1.example.com", { url: "wss://relay1.example.com" }], |
||||||
|
["wss://relay2.example.com", { url: "wss://relay2.example.com" }], |
||||||
|
]), |
||||||
|
}, |
||||||
|
} as unknown as NDK; |
||||||
|
|
||||||
|
// Create mock event
|
||||||
|
mockEvent = { |
||||||
|
id: "event-id-123", |
||||||
|
kind: 30041, |
||||||
|
pubkey: "test-pubkey-123", |
||||||
|
tagAddress: () => "30041:test-pubkey-123:test-identifier", |
||||||
|
} as unknown as NDKEvent; |
||||||
|
}); |
||||||
|
|
||||||
|
describe("canDeleteEvent", () => { |
||||||
|
it("should return true when user is the event author", () => { |
||||||
|
const result = canDeleteEvent(mockEvent, mockNdk); |
||||||
|
expect(result).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should return false when user is not the event author", () => { |
||||||
|
const differentUserEvent = { |
||||||
|
...mockEvent, |
||||||
|
pubkey: "different-pubkey-456", |
||||||
|
} as unknown as NDKEvent; |
||||||
|
|
||||||
|
const result = canDeleteEvent(differentUserEvent, mockNdk); |
||||||
|
expect(result).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should return false when event is null", () => { |
||||||
|
const result = canDeleteEvent(null, mockNdk); |
||||||
|
expect(result).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should return false when ndk has no active user", () => { |
||||||
|
const ndkWithoutUser = { |
||||||
|
...mockNdk, |
||||||
|
activeUser: undefined, |
||||||
|
} as unknown as NDK; |
||||||
|
|
||||||
|
const result = canDeleteEvent(mockEvent, ndkWithoutUser); |
||||||
|
expect(result).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("deleteEvent", () => { |
||||||
|
it("should return error when no eventId or eventAddress provided", async () => { |
||||||
|
const result = await deleteEvent({}, mockNdk); |
||||||
|
|
||||||
|
expect(result.success).toBe(false); |
||||||
|
expect(result.error).toBe( |
||||||
|
"Either eventId or eventAddress must be provided", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should return error when user is not logged in", async () => { |
||||||
|
const ndkWithoutUser = { |
||||||
|
...mockNdk, |
||||||
|
activeUser: undefined, |
||||||
|
} as unknown as NDK; |
||||||
|
|
||||||
|
const result = await deleteEvent( |
||||||
|
{ eventId: "test-id" }, |
||||||
|
ndkWithoutUser, |
||||||
|
); |
||||||
|
|
||||||
|
expect(result.success).toBe(false); |
||||||
|
expect(result.error).toBe("Please log in first"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should create deletion event with correct tags", async () => { |
||||||
|
const mockSign = vi.fn(); |
||||||
|
const mockPublish = vi.fn().mockResolvedValue( |
||||||
|
new Set(["wss://relay1.example.com"]), |
||||||
|
); |
||||||
|
|
||||||
|
// Mock NDKEvent constructor
|
||||||
|
const MockNDKEvent = vi.fn().mockImplementation(function (this: any) { |
||||||
|
this.kind = 0; |
||||||
|
this.created_at = 0; |
||||||
|
this.tags = []; |
||||||
|
this.content = ""; |
||||||
|
this.pubkey = ""; |
||||||
|
this.sign = mockSign; |
||||||
|
this.publish = mockPublish; |
||||||
|
return this; |
||||||
|
}); |
||||||
|
|
||||||
|
// Mock NDKRelaySet
|
||||||
|
const mockRelaySet = {} as NDKRelaySet; |
||||||
|
vi.spyOn(NDKRelaySet, "fromRelayUrls").mockReturnValue(mockRelaySet); |
||||||
|
|
||||||
|
// Replace global NDKEvent temporarily
|
||||||
|
const originalNDKEvent = (globalThis as any).NDKEvent; |
||||||
|
(global as any).NDKEvent = MockNDKEvent; |
||||||
|
|
||||||
|
const result = await deleteEvent( |
||||||
|
{ |
||||||
|
eventId: "event-123", |
||||||
|
eventAddress: "30041:pubkey:identifier", |
||||||
|
eventKind: 30041, |
||||||
|
reason: "Test deletion", |
||||||
|
}, |
||||||
|
mockNdk, |
||||||
|
); |
||||||
|
|
||||||
|
// Restore original
|
||||||
|
(global as any).NDKEvent = originalNDKEvent; |
||||||
|
|
||||||
|
expect(MockNDKEvent).toHaveBeenCalled(); |
||||||
|
expect(mockSign).toHaveBeenCalled(); |
||||||
|
expect(mockPublish).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,320 @@ |
|||||||
|
import { beforeEach, describe, expect, it, vi } from "vitest"; |
||||||
|
import type { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
import { fetchHighlightsForPublication } from "../../src/lib/utils/fetch_publication_highlights.ts"; |
||||||
|
|
||||||
|
// Mock NDKEvent class
|
||||||
|
class MockNDKEvent { |
||||||
|
kind: number; |
||||||
|
pubkey: string; |
||||||
|
content: string; |
||||||
|
tags: string[][]; |
||||||
|
created_at: number; |
||||||
|
id: string; |
||||||
|
sig: string; |
||||||
|
|
||||||
|
constructor(event: { |
||||||
|
kind: number; |
||||||
|
pubkey: string; |
||||||
|
content: string; |
||||||
|
tags: string[][]; |
||||||
|
created_at?: number; |
||||||
|
id?: string; |
||||||
|
sig?: string; |
||||||
|
}) { |
||||||
|
this.kind = event.kind; |
||||||
|
this.pubkey = event.pubkey; |
||||||
|
this.content = event.content; |
||||||
|
this.tags = event.tags; |
||||||
|
this.created_at = event.created_at || Date.now() / 1000; |
||||||
|
this.id = event.id || "mock-id"; |
||||||
|
this.sig = event.sig || "mock-sig"; |
||||||
|
} |
||||||
|
|
||||||
|
getMatchingTags(tagName: string): string[][] { |
||||||
|
return this.tags.filter((tag) => tag[0] === tagName); |
||||||
|
} |
||||||
|
|
||||||
|
tagValue(tagName: string): string | undefined { |
||||||
|
const tag = this.tags.find((tag) => tag[0] === tagName); |
||||||
|
return tag ? tag[1] : undefined; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
describe("fetchHighlightsForPublication", () => { |
||||||
|
let mockNDK: NDK; |
||||||
|
let publicationEvent: NDKEvent; |
||||||
|
let mockHighlights: MockNDKEvent[]; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
// Create the sample 30040 publication event from the user's example
|
||||||
|
publicationEvent = new MockNDKEvent({ |
||||||
|
kind: 30040, |
||||||
|
pubkey: |
||||||
|
"fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1", |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "document-test"], |
||||||
|
["title", "Document Test"], |
||||||
|
["author", "unknown"], |
||||||
|
["version", "1"], |
||||||
|
["m", "application/json"], |
||||||
|
["M", "meta-data/index/replaceable"], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:another-first-level-heading", |
||||||
|
], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:a-third-first-level-heading", |
||||||
|
], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:asciimath-test-document", |
||||||
|
], |
||||||
|
["t", "a-tags"], |
||||||
|
["t", "testfile"], |
||||||
|
["t", "asciimath"], |
||||||
|
["t", "latexmath"], |
||||||
|
["image", "https://i.nostr.build/5kWwbDR04joIASVx.png"], |
||||||
|
], |
||||||
|
created_at: 1744910311, |
||||||
|
id: "4585ed74a0be37655aa887340d239f0bbb9df5476165d912f098c55a71196fef", |
||||||
|
sig: |
||||||
|
"e6a832dcfc919c913acee62cb598211544bc8e03a3f61c016eb3bf6c8cb4fb333eff8fecc601517604c7a8029dfa73591f3218465071a532f4abfe8c0bf3662d", |
||||||
|
}) as unknown as NDKEvent; |
||||||
|
|
||||||
|
// Create mock highlight events for different sections
|
||||||
|
mockHighlights = [ |
||||||
|
new MockNDKEvent({ |
||||||
|
kind: 9802, |
||||||
|
pubkey: "user-pubkey-1", |
||||||
|
content: "This is an interesting point", |
||||||
|
tags: [ |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
], |
||||||
|
["context", "surrounding text here"], |
||||||
|
[ |
||||||
|
"p", |
||||||
|
"fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1", |
||||||
|
"", |
||||||
|
"author", |
||||||
|
], |
||||||
|
], |
||||||
|
id: "highlight-1", |
||||||
|
}), |
||||||
|
new MockNDKEvent({ |
||||||
|
kind: 9802, |
||||||
|
pubkey: "user-pubkey-2", |
||||||
|
content: "Another highlight on same section", |
||||||
|
tags: [ |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
], |
||||||
|
["context", "more surrounding text"], |
||||||
|
[ |
||||||
|
"p", |
||||||
|
"fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1", |
||||||
|
"", |
||||||
|
"author", |
||||||
|
], |
||||||
|
], |
||||||
|
id: "highlight-2", |
||||||
|
}), |
||||||
|
new MockNDKEvent({ |
||||||
|
kind: 9802, |
||||||
|
pubkey: "user-pubkey-3", |
||||||
|
content: "Highlight on different section", |
||||||
|
tags: [ |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:another-first-level-heading", |
||||||
|
], |
||||||
|
["context", "different section text"], |
||||||
|
[ |
||||||
|
"p", |
||||||
|
"fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1", |
||||||
|
"", |
||||||
|
"author", |
||||||
|
], |
||||||
|
], |
||||||
|
id: "highlight-3", |
||||||
|
}), |
||||||
|
]; |
||||||
|
|
||||||
|
// Mock NDK instance
|
||||||
|
mockNDK = { |
||||||
|
fetchEvents: vi.fn(async (filter) => { |
||||||
|
// Return highlights that match the filter
|
||||||
|
const aTagFilter = filter["#a"]; |
||||||
|
if (aTagFilter) { |
||||||
|
return new Set( |
||||||
|
mockHighlights.filter((highlight) => |
||||||
|
aTagFilter.includes(highlight.tagValue("a") || "") |
||||||
|
), |
||||||
|
); |
||||||
|
} |
||||||
|
return new Set(); |
||||||
|
}), |
||||||
|
} as unknown as NDK; |
||||||
|
}); |
||||||
|
|
||||||
|
it("should extract section references from 30040 publication event", async () => { |
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
publicationEvent, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
// Should have results for the sections that have highlights
|
||||||
|
expect(result.size).toBeGreaterThan(0); |
||||||
|
expect( |
||||||
|
result.has( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
), |
||||||
|
).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should fetch highlights for each section reference", async () => { |
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
publicationEvent, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
// First section should have 2 highlights
|
||||||
|
const firstSectionHighlights = result.get( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
); |
||||||
|
expect(firstSectionHighlights?.length).toBe(2); |
||||||
|
|
||||||
|
// Second section should have 1 highlight
|
||||||
|
const secondSectionHighlights = result.get( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:another-first-level-heading", |
||||||
|
); |
||||||
|
expect(secondSectionHighlights?.length).toBe(1); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should group highlights by section address", async () => { |
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
publicationEvent, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
const firstSectionHighlights = result.get( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
); |
||||||
|
|
||||||
|
// Verify the highlights are correctly grouped
|
||||||
|
expect(firstSectionHighlights?.[0].content).toBe( |
||||||
|
"This is an interesting point", |
||||||
|
); |
||||||
|
expect(firstSectionHighlights?.[1].content).toBe( |
||||||
|
"Another highlight on same section", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should not include sections without highlights", async () => { |
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
publicationEvent, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
// Sections without highlights should not be in the result
|
||||||
|
expect( |
||||||
|
result.has( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:a-third-first-level-heading", |
||||||
|
), |
||||||
|
).toBe(false); |
||||||
|
expect( |
||||||
|
result.has( |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:asciimath-test-document", |
||||||
|
), |
||||||
|
).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle publication with no section references", async () => { |
||||||
|
const emptyPublication = new MockNDKEvent({ |
||||||
|
kind: 30040, |
||||||
|
pubkey: "test-pubkey", |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "empty-doc"], |
||||||
|
["title", "Empty Document"], |
||||||
|
], |
||||||
|
}) as unknown as NDKEvent; |
||||||
|
|
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
emptyPublication, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
expect(result.size).toBe(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should only process 30041 kind references, ignoring other a-tags", async () => { |
||||||
|
const mixedPublication = new MockNDKEvent({ |
||||||
|
kind: 30040, |
||||||
|
pubkey: "test-pubkey", |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "mixed-doc"], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
], |
||||||
|
["a", "30023:some-pubkey:blog-post"], // Different kind, should be ignored
|
||||||
|
["a", "1:some-pubkey"], // Different kind, should be ignored
|
||||||
|
], |
||||||
|
}) as unknown as NDKEvent; |
||||||
|
|
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
mixedPublication, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
// Should call fetchEvents with only the 30041 reference
|
||||||
|
expect(mockNDK.fetchEvents).toHaveBeenCalledWith( |
||||||
|
expect.objectContaining({ |
||||||
|
kinds: [9802], |
||||||
|
"#a": [ |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:first-level-heading", |
||||||
|
], |
||||||
|
}), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle d-tags with colons correctly", async () => { |
||||||
|
const colonPublication = new MockNDKEvent({ |
||||||
|
kind: 30040, |
||||||
|
pubkey: "test-pubkey", |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "colon-doc"], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:section:with:colons", |
||||||
|
], |
||||||
|
], |
||||||
|
}) as unknown as NDKEvent; |
||||||
|
|
||||||
|
const result = await fetchHighlightsForPublication( |
||||||
|
colonPublication, |
||||||
|
mockNDK, |
||||||
|
); |
||||||
|
|
||||||
|
// Should correctly parse the section address with colons
|
||||||
|
expect(mockNDK.fetchEvents).toHaveBeenCalledWith( |
||||||
|
expect.objectContaining({ |
||||||
|
"#a": [ |
||||||
|
"30041:fd208ee8c8f283780a9552896e4823cc9dc6bfd442063889577106940fd927c1:section:with:colons", |
||||||
|
], |
||||||
|
}), |
||||||
|
); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,870 @@ |
|||||||
|
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; |
||||||
|
import { pubkeyToHue } from "../../src/lib/utils/nostrUtils"; |
||||||
|
import { nip19 } from "nostr-tools"; |
||||||
|
|
||||||
|
describe("pubkeyToHue", () => { |
||||||
|
describe("Consistency", () => { |
||||||
|
it("returns consistent hue for same pubkey", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const hue1 = pubkeyToHue(pubkey); |
||||||
|
const hue2 = pubkeyToHue(pubkey); |
||||||
|
|
||||||
|
expect(hue1).toBe(hue2); |
||||||
|
}); |
||||||
|
|
||||||
|
it("returns same hue for same pubkey called multiple times", () => { |
||||||
|
const pubkey = "abc123def456".repeat(5) + "abcd"; |
||||||
|
const hues = Array.from({ length: 10 }, () => pubkeyToHue(pubkey)); |
||||||
|
|
||||||
|
expect(new Set(hues).size).toBe(1); // All hues should be the same
|
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Range Validation", () => { |
||||||
|
it("returns hue in valid range (0-360)", () => { |
||||||
|
const pubkeys = [ |
||||||
|
"a".repeat(64), |
||||||
|
"f".repeat(64), |
||||||
|
"0".repeat(64), |
||||||
|
"9".repeat(64), |
||||||
|
"abc123def456".repeat(5) + "abcd", |
||||||
|
"123456789abc".repeat(5) + "def0", |
||||||
|
]; |
||||||
|
|
||||||
|
pubkeys.forEach((pubkey) => { |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("returns integer hue value", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
|
||||||
|
expect(Number.isInteger(hue)).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Format Handling", () => { |
||||||
|
it("handles hex format pubkeys", () => { |
||||||
|
const hexPubkey = "abcdef123456789".repeat(4) + "0123"; |
||||||
|
const hue = pubkeyToHue(hexPubkey); |
||||||
|
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles npub format pubkeys", () => { |
||||||
|
const hexPubkey = "a".repeat(64); |
||||||
|
const npub = nip19.npubEncode(hexPubkey); |
||||||
|
const hue = pubkeyToHue(npub); |
||||||
|
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
|
||||||
|
it("returns same hue for hex and npub format of same pubkey", () => { |
||||||
|
const hexPubkey = "abc123def456".repeat(5) + "abcd"; |
||||||
|
const npub = nip19.npubEncode(hexPubkey); |
||||||
|
|
||||||
|
const hueFromHex = pubkeyToHue(hexPubkey); |
||||||
|
const hueFromNpub = pubkeyToHue(npub); |
||||||
|
|
||||||
|
expect(hueFromHex).toBe(hueFromNpub); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Uniqueness", () => { |
||||||
|
it("different pubkeys generate different hues", () => { |
||||||
|
const pubkey1 = "a".repeat(64); |
||||||
|
const pubkey2 = "b".repeat(64); |
||||||
|
const pubkey3 = "c".repeat(64); |
||||||
|
|
||||||
|
const hue1 = pubkeyToHue(pubkey1); |
||||||
|
const hue2 = pubkeyToHue(pubkey2); |
||||||
|
const hue3 = pubkeyToHue(pubkey3); |
||||||
|
|
||||||
|
expect(hue1).not.toBe(hue2); |
||||||
|
expect(hue2).not.toBe(hue3); |
||||||
|
expect(hue1).not.toBe(hue3); |
||||||
|
}); |
||||||
|
|
||||||
|
it("generates diverse hues for multiple pubkeys", () => { |
||||||
|
const pubkeys = Array.from( |
||||||
|
{ length: 10 }, |
||||||
|
(_, i) => String.fromCharCode(97 + i).repeat(64), |
||||||
|
); |
||||||
|
|
||||||
|
const hues = pubkeys.map((pk) => pubkeyToHue(pk)); |
||||||
|
const uniqueHues = new Set(hues); |
||||||
|
|
||||||
|
// Most pubkeys should generate unique hues (allowing for some collisions)
|
||||||
|
expect(uniqueHues.size).toBeGreaterThan(7); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Edge Cases", () => { |
||||||
|
it("handles empty string input", () => { |
||||||
|
const hue = pubkeyToHue(""); |
||||||
|
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles invalid npub format gracefully", () => { |
||||||
|
const invalidNpub = "npub1invalid"; |
||||||
|
const hue = pubkeyToHue(invalidNpub); |
||||||
|
|
||||||
|
// Should still return a valid hue even if decode fails
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles short input strings", () => { |
||||||
|
const shortInput = "abc"; |
||||||
|
const hue = pubkeyToHue(shortInput); |
||||||
|
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles special characters", () => { |
||||||
|
const specialInput = "!@#$%^&*()"; |
||||||
|
const hue = pubkeyToHue(specialInput); |
||||||
|
|
||||||
|
expect(hue).toBeGreaterThanOrEqual(0); |
||||||
|
expect(hue).toBeLessThan(360); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Color Distribution", () => { |
||||||
|
it("distributes colors across the spectrum", () => { |
||||||
|
// Generate hues for many different pubkeys
|
||||||
|
const pubkeys = Array.from( |
||||||
|
{ length: 50 }, |
||||||
|
(_, i) => i.toString().repeat(16), |
||||||
|
); |
||||||
|
|
||||||
|
const hues = pubkeys.map((pk) => pubkeyToHue(pk)); |
||||||
|
|
||||||
|
// Check that we have hues in different ranges of the spectrum
|
||||||
|
const hasLowHues = hues.some((h) => h < 120); |
||||||
|
const hasMidHues = hues.some((h) => h >= 120 && h < 240); |
||||||
|
const hasHighHues = hues.some((h) => h >= 240); |
||||||
|
|
||||||
|
expect(hasLowHues).toBe(true); |
||||||
|
expect(hasMidHues).toBe(true); |
||||||
|
expect(hasHighHues).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("HighlightLayer Component", () => { |
||||||
|
let mockNdk: any; |
||||||
|
let mockSubscription: any; |
||||||
|
let eventHandlers: Map<string, Function>; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
eventHandlers = new Map(); |
||||||
|
|
||||||
|
// Mock NDK subscription
|
||||||
|
mockSubscription = { |
||||||
|
on: vi.fn((event: string, handler: Function) => { |
||||||
|
eventHandlers.set(event, handler); |
||||||
|
}), |
||||||
|
stop: vi.fn(), |
||||||
|
}; |
||||||
|
|
||||||
|
mockNdk = { |
||||||
|
subscribe: vi.fn(() => mockSubscription), |
||||||
|
}; |
||||||
|
|
||||||
|
// Mock DOM APIs
|
||||||
|
global.document = { |
||||||
|
createTreeWalker: vi.fn(() => ({ |
||||||
|
nextNode: vi.fn(() => null), |
||||||
|
})), |
||||||
|
createDocumentFragment: vi.fn(() => ({ |
||||||
|
appendChild: vi.fn(), |
||||||
|
})), |
||||||
|
createTextNode: vi.fn((text: string) => ({ |
||||||
|
textContent: text, |
||||||
|
})), |
||||||
|
createElement: vi.fn((tag: string) => ({ |
||||||
|
className: "", |
||||||
|
style: {}, |
||||||
|
textContent: "", |
||||||
|
})), |
||||||
|
} as any; |
||||||
|
}); |
||||||
|
|
||||||
|
afterEach(() => { |
||||||
|
vi.clearAllMocks(); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("NDK Subscription", () => { |
||||||
|
it("fetches kind 9802 events with correct filter when eventId provided", () => { |
||||||
|
const eventId = "a".repeat(64); |
||||||
|
|
||||||
|
// Simulate calling fetchHighlights
|
||||||
|
mockNdk.subscribe({ kinds: [9802], "#e": [eventId], limit: 100 }); |
||||||
|
|
||||||
|
expect(mockNdk.subscribe).toHaveBeenCalledWith( |
||||||
|
expect.objectContaining({ |
||||||
|
kinds: [9802], |
||||||
|
"#e": [eventId], |
||||||
|
limit: 100, |
||||||
|
}), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("fetches kind 9802 events with correct filter when eventAddress provided", () => { |
||||||
|
const eventAddress = "30040:" + "a".repeat(64) + ":chapter-1"; |
||||||
|
|
||||||
|
// Simulate calling fetchHighlights
|
||||||
|
mockNdk.subscribe({ kinds: [9802], "#a": [eventAddress], limit: 100 }); |
||||||
|
|
||||||
|
expect(mockNdk.subscribe).toHaveBeenCalledWith( |
||||||
|
expect.objectContaining({ |
||||||
|
kinds: [9802], |
||||||
|
"#a": [eventAddress], |
||||||
|
limit: 100, |
||||||
|
}), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("fetches with both eventId and eventAddress filters when both provided", () => { |
||||||
|
const eventId = "a".repeat(64); |
||||||
|
const eventAddress = "30040:" + "b".repeat(64) + ":chapter-1"; |
||||||
|
|
||||||
|
// Simulate calling fetchHighlights
|
||||||
|
mockNdk.subscribe({ |
||||||
|
kinds: [9802], |
||||||
|
"#e": [eventId], |
||||||
|
"#a": [eventAddress], |
||||||
|
limit: 100, |
||||||
|
}); |
||||||
|
|
||||||
|
expect(mockNdk.subscribe).toHaveBeenCalledWith( |
||||||
|
expect.objectContaining({ |
||||||
|
kinds: [9802], |
||||||
|
"#e": [eventId], |
||||||
|
"#a": [eventAddress], |
||||||
|
limit: 100, |
||||||
|
}), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("cleans up subscription on unmount", () => { |
||||||
|
mockNdk.subscribe({ kinds: [9802], limit: 100 }); |
||||||
|
|
||||||
|
// Simulate unmount by calling stop
|
||||||
|
mockSubscription.stop(); |
||||||
|
|
||||||
|
expect(mockSubscription.stop).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Color Mapping", () => { |
||||||
|
it("maps highlights to colors correctly", () => { |
||||||
|
const pubkey1 = "a".repeat(64); |
||||||
|
const pubkey2 = "b".repeat(64); |
||||||
|
|
||||||
|
const hue1 = pubkeyToHue(pubkey1); |
||||||
|
const hue2 = pubkeyToHue(pubkey2); |
||||||
|
|
||||||
|
const expectedColor1 = `hsla(${hue1}, 70%, 60%, 0.3)`; |
||||||
|
const expectedColor2 = `hsla(${hue2}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(expectedColor1).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/); |
||||||
|
expect(expectedColor2).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/); |
||||||
|
expect(expectedColor1).not.toBe(expectedColor2); |
||||||
|
}); |
||||||
|
|
||||||
|
it("uses consistent color for same pubkey", () => { |
||||||
|
const pubkey = "abc123def456".repeat(5) + "abcd"; |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
|
||||||
|
const color1 = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
const color2 = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(color1).toBe(color2); |
||||||
|
}); |
||||||
|
|
||||||
|
it("generates semi-transparent colors with 0.3 opacity", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(color).toContain("0.3"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("uses HSL color format with correct values", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
// Verify format: hsla(hue, 70%, 60%, 0.3)
|
||||||
|
expect(color).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Highlight Events", () => { |
||||||
|
it("handles no highlights gracefully", () => { |
||||||
|
const highlights: any[] = []; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(0); |
||||||
|
// Component should render without errors
|
||||||
|
}); |
||||||
|
|
||||||
|
it("handles single highlight from one user", () => { |
||||||
|
const mockHighlight = { |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: "highlighted text", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
const highlights = [mockHighlight]; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(1); |
||||||
|
expect(highlights[0].pubkey).toBe("a".repeat(64)); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles multiple highlights from same user", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const mockHighlights = [ |
||||||
|
{ |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: pubkey, |
||||||
|
content: "first highlight", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
{ |
||||||
|
id: "highlight2", |
||||||
|
kind: 9802, |
||||||
|
pubkey: pubkey, |
||||||
|
content: "second highlight", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
expect(mockHighlights.length).toBe(2); |
||||||
|
expect(mockHighlights[0].pubkey).toBe(mockHighlights[1].pubkey); |
||||||
|
|
||||||
|
// Should use same color for both
|
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(color).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles multiple highlights from different users", () => { |
||||||
|
const pubkey1 = "a".repeat(64); |
||||||
|
const pubkey2 = "b".repeat(64); |
||||||
|
const pubkey3 = "c".repeat(64); |
||||||
|
|
||||||
|
const mockHighlights = [ |
||||||
|
{ |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: pubkey1, |
||||||
|
content: "highlight from user 1", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
{ |
||||||
|
id: "highlight2", |
||||||
|
kind: 9802, |
||||||
|
pubkey: pubkey2, |
||||||
|
content: "highlight from user 2", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
{ |
||||||
|
id: "highlight3", |
||||||
|
kind: 9802, |
||||||
|
pubkey: pubkey3, |
||||||
|
content: "highlight from user 3", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
expect(mockHighlights.length).toBe(3); |
||||||
|
|
||||||
|
// Each should have different color
|
||||||
|
const hue1 = pubkeyToHue(pubkey1); |
||||||
|
const hue2 = pubkeyToHue(pubkey2); |
||||||
|
const hue3 = pubkeyToHue(pubkey3); |
||||||
|
|
||||||
|
expect(hue1).not.toBe(hue2); |
||||||
|
expect(hue2).not.toBe(hue3); |
||||||
|
expect(hue1).not.toBe(hue3); |
||||||
|
}); |
||||||
|
|
||||||
|
it("prevents duplicate highlights", () => { |
||||||
|
const mockHighlight = { |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: "highlighted text", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
const highlights = [mockHighlight]; |
||||||
|
|
||||||
|
// Try to add duplicate
|
||||||
|
const isDuplicate = highlights.some((h) => h.id === mockHighlight.id); |
||||||
|
|
||||||
|
expect(isDuplicate).toBe(true); |
||||||
|
// Should not add duplicate
|
||||||
|
}); |
||||||
|
|
||||||
|
it("handles empty content gracefully", () => { |
||||||
|
const mockHighlight = { |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: "", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
// Should not crash
|
||||||
|
expect(mockHighlight.content).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles whitespace-only content", () => { |
||||||
|
const mockHighlight = { |
||||||
|
id: "highlight1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: " \n\t ", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
const trimmed = mockHighlight.content.trim(); |
||||||
|
expect(trimmed.length).toBe(0); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Highlighter Legend", () => { |
||||||
|
it("displays legend with correct color for single highlighter", () => { |
||||||
|
const pubkey = "abc123def456".repeat(5) + "abcd"; |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
const legend = { |
||||||
|
pubkey: pubkey, |
||||||
|
color: color, |
||||||
|
shortPubkey: `${pubkey.slice(0, 8)}...`, |
||||||
|
}; |
||||||
|
|
||||||
|
expect(legend.color).toBe(color); |
||||||
|
expect(legend.shortPubkey).toBe(`${pubkey.slice(0, 8)}...`); |
||||||
|
}); |
||||||
|
|
||||||
|
it("displays legend with colors for multiple highlighters", () => { |
||||||
|
const pubkeys = [ |
||||||
|
"a".repeat(64), |
||||||
|
"b".repeat(64), |
||||||
|
"c".repeat(64), |
||||||
|
]; |
||||||
|
|
||||||
|
const legendEntries = pubkeys.map((pubkey) => ({ |
||||||
|
pubkey, |
||||||
|
color: `hsla(${pubkeyToHue(pubkey)}, 70%, 60%, 0.3)`, |
||||||
|
shortPubkey: `${pubkey.slice(0, 8)}...`, |
||||||
|
})); |
||||||
|
|
||||||
|
expect(legendEntries.length).toBe(3); |
||||||
|
|
||||||
|
// Each should have unique color
|
||||||
|
const colors = legendEntries.map((e) => e.color); |
||||||
|
const uniqueColors = new Set(colors); |
||||||
|
expect(uniqueColors.size).toBe(3); |
||||||
|
}); |
||||||
|
|
||||||
|
it("shows truncated pubkey in legend", () => { |
||||||
|
const pubkey = "abcdefghijklmnop".repeat(4); |
||||||
|
const shortPubkey = `${pubkey.slice(0, 8)}...`; |
||||||
|
|
||||||
|
expect(shortPubkey).toBe("abcdefgh..."); |
||||||
|
expect(shortPubkey.length).toBeLessThan(pubkey.length); |
||||||
|
}); |
||||||
|
|
||||||
|
it("displays highlight count", () => { |
||||||
|
const highlights = [ |
||||||
|
{ id: "1", pubkey: "a".repeat(64), content: "text1" }, |
||||||
|
{ id: "2", pubkey: "b".repeat(64), content: "text2" }, |
||||||
|
{ id: "3", pubkey: "a".repeat(64), content: "text3" }, |
||||||
|
]; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(3); |
||||||
|
|
||||||
|
// Count unique highlighters
|
||||||
|
const uniqueHighlighters = new Set(highlights.map((h) => h.pubkey)); |
||||||
|
expect(uniqueHighlighters.size).toBe(2); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Text Matching", () => { |
||||||
|
it("matches text case-insensitively", () => { |
||||||
|
const searchText = "Hello World"; |
||||||
|
const contentText = "hello world"; |
||||||
|
|
||||||
|
const index = contentText.toLowerCase().indexOf(searchText.toLowerCase()); |
||||||
|
|
||||||
|
expect(index).toBeGreaterThanOrEqual(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles special characters in search text", () => { |
||||||
|
const searchText = 'text with "quotes" and symbols!'; |
||||||
|
const contentText = 'This is text with "quotes" and symbols! in it.'; |
||||||
|
|
||||||
|
const index = contentText.toLowerCase().indexOf(searchText.toLowerCase()); |
||||||
|
|
||||||
|
expect(index).toBeGreaterThanOrEqual(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles Unicode characters", () => { |
||||||
|
const searchText = "café résumé"; |
||||||
|
const contentText = "The café résumé was excellent."; |
||||||
|
|
||||||
|
const index = contentText.toLowerCase().indexOf(searchText.toLowerCase()); |
||||||
|
|
||||||
|
expect(index).toBeGreaterThanOrEqual(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles multi-line text", () => { |
||||||
|
const searchText = "line one\nline two"; |
||||||
|
const contentText = "This is line one\nline two in the document."; |
||||||
|
|
||||||
|
const index = contentText.indexOf(searchText); |
||||||
|
|
||||||
|
expect(index).toBeGreaterThanOrEqual(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("does not match partial words when searching for whole words", () => { |
||||||
|
const searchText = "cat"; |
||||||
|
const contentText = "The category is important."; |
||||||
|
|
||||||
|
// Simple word boundary check
|
||||||
|
const wordBoundaryMatch = new RegExp(`\\b${searchText}\\b`, "i").test( |
||||||
|
contentText, |
||||||
|
); |
||||||
|
|
||||||
|
expect(wordBoundaryMatch).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Subscription Lifecycle", () => { |
||||||
|
it("registers EOSE event handler", () => { |
||||||
|
const subscription = mockNdk.subscribe({ kinds: [9802], limit: 100 }); |
||||||
|
|
||||||
|
// Verify that 'on' method is available for registering handlers
|
||||||
|
expect(subscription.on).toBeDefined(); |
||||||
|
|
||||||
|
// Register EOSE handler
|
||||||
|
subscription.on("eose", () => { |
||||||
|
subscription.stop(); |
||||||
|
}); |
||||||
|
|
||||||
|
// Verify on was called
|
||||||
|
expect(subscription.on).toHaveBeenCalledWith( |
||||||
|
"eose", |
||||||
|
expect.any(Function), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("registers error event handler", () => { |
||||||
|
const subscription = mockNdk.subscribe({ kinds: [9802], limit: 100 }); |
||||||
|
|
||||||
|
// Verify that 'on' method is available for registering handlers
|
||||||
|
expect(subscription.on).toBeDefined(); |
||||||
|
|
||||||
|
// Register error handler
|
||||||
|
subscription.on("error", () => { |
||||||
|
subscription.stop(); |
||||||
|
}); |
||||||
|
|
||||||
|
// Verify on was called
|
||||||
|
expect(subscription.on).toHaveBeenCalledWith( |
||||||
|
"error", |
||||||
|
expect.any(Function), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("stops subscription on timeout", async () => { |
||||||
|
vi.useFakeTimers(); |
||||||
|
|
||||||
|
mockNdk.subscribe({ kinds: [9802], limit: 100 }); |
||||||
|
|
||||||
|
// Fast-forward time by 10 seconds
|
||||||
|
vi.advanceTimersByTime(10000); |
||||||
|
|
||||||
|
// Subscription should be stopped after timeout
|
||||||
|
// Note: This would be tested in the actual component
|
||||||
|
|
||||||
|
vi.useRealTimers(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles multiple subscription cleanup calls safely", () => { |
||||||
|
mockNdk.subscribe({ kinds: [9802], limit: 100 }); |
||||||
|
|
||||||
|
// Call stop multiple times
|
||||||
|
mockSubscription.stop(); |
||||||
|
mockSubscription.stop(); |
||||||
|
mockSubscription.stop(); |
||||||
|
|
||||||
|
expect(mockSubscription.stop).toHaveBeenCalledTimes(3); |
||||||
|
// Should not throw errors
|
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Performance", () => { |
||||||
|
it("handles large number of highlights efficiently", () => { |
||||||
|
const startTime = Date.now(); |
||||||
|
|
||||||
|
const highlights = Array.from({ length: 1000 }, (_, i) => ({ |
||||||
|
id: `highlight${i}`, |
||||||
|
kind: 9802, |
||||||
|
pubkey: (i % 10).toString().repeat(64), |
||||||
|
content: `highlighted text ${i}`, |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
})); |
||||||
|
|
||||||
|
// Generate colors for all highlights
|
||||||
|
const colorMap = new Map<string, string>(); |
||||||
|
highlights.forEach((h) => { |
||||||
|
if (!colorMap.has(h.pubkey)) { |
||||||
|
const hue = pubkeyToHue(h.pubkey); |
||||||
|
colorMap.set(h.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
const endTime = Date.now(); |
||||||
|
const duration = endTime - startTime; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(1000); |
||||||
|
expect(colorMap.size).toBe(10); |
||||||
|
expect(duration).toBeLessThan(1000); // Should complete in less than 1 second
|
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Integration Tests", () => { |
||||||
|
describe("Toggle Functionality", () => { |
||||||
|
it("toggle button shows highlights when clicked", () => { |
||||||
|
let highlightsVisible = false; |
||||||
|
|
||||||
|
// Simulate toggle
|
||||||
|
highlightsVisible = !highlightsVisible; |
||||||
|
|
||||||
|
expect(highlightsVisible).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("toggle button hides highlights when clicked again", () => { |
||||||
|
let highlightsVisible = true; |
||||||
|
|
||||||
|
// Simulate toggle
|
||||||
|
highlightsVisible = !highlightsVisible; |
||||||
|
|
||||||
|
expect(highlightsVisible).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("toggle state persists between interactions", () => { |
||||||
|
let highlightsVisible = false; |
||||||
|
|
||||||
|
highlightsVisible = !highlightsVisible; |
||||||
|
expect(highlightsVisible).toBe(true); |
||||||
|
|
||||||
|
highlightsVisible = !highlightsVisible; |
||||||
|
expect(highlightsVisible).toBe(false); |
||||||
|
|
||||||
|
highlightsVisible = !highlightsVisible; |
||||||
|
expect(highlightsVisible).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Color Format Validation", () => { |
||||||
|
it("generates semi-transparent colors with 0.3 opacity", () => { |
||||||
|
const pubkeys = [ |
||||||
|
"a".repeat(64), |
||||||
|
"b".repeat(64), |
||||||
|
"c".repeat(64), |
||||||
|
]; |
||||||
|
|
||||||
|
pubkeys.forEach((pubkey) => { |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(color).toContain("0.3"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("uses HSL color format with correct saturation and lightness", () => { |
||||||
|
const pubkey = "a".repeat(64); |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
expect(color).toContain("70%"); |
||||||
|
expect(color).toContain("60%"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("generates valid CSS color strings", () => { |
||||||
|
const pubkeys = Array.from( |
||||||
|
{ length: 20 }, |
||||||
|
(_, i) => String.fromCharCode(97 + i).repeat(64), |
||||||
|
); |
||||||
|
|
||||||
|
pubkeys.forEach((pubkey) => { |
||||||
|
const hue = pubkeyToHue(pubkey); |
||||||
|
const color = `hsla(${hue}, 70%, 60%, 0.3)`; |
||||||
|
|
||||||
|
// Validate CSS color format
|
||||||
|
expect(color).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("End-to-End Flow", () => { |
||||||
|
it("complete highlight workflow", () => { |
||||||
|
// 1. Start with no highlights visible
|
||||||
|
let highlightsVisible = false; |
||||||
|
let highlights: any[] = []; |
||||||
|
|
||||||
|
expect(highlightsVisible).toBe(false); |
||||||
|
expect(highlights.length).toBe(0); |
||||||
|
|
||||||
|
// 2. Fetch highlights
|
||||||
|
const mockHighlights = [ |
||||||
|
{ |
||||||
|
id: "h1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: "first highlight", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
{ |
||||||
|
id: "h2", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "b".repeat(64), |
||||||
|
content: "second highlight", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
highlights = mockHighlights; |
||||||
|
expect(highlights.length).toBe(2); |
||||||
|
|
||||||
|
// 3. Generate color map
|
||||||
|
const colorMap = new Map<string, string>(); |
||||||
|
highlights.forEach((h) => { |
||||||
|
if (!colorMap.has(h.pubkey)) { |
||||||
|
const hue = pubkeyToHue(h.pubkey); |
||||||
|
colorMap.set(h.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`); |
||||||
|
} |
||||||
|
}); |
||||||
|
|
||||||
|
expect(colorMap.size).toBe(2); |
||||||
|
|
||||||
|
// 4. Toggle visibility
|
||||||
|
highlightsVisible = true; |
||||||
|
expect(highlightsVisible).toBe(true); |
||||||
|
|
||||||
|
// 5. Verify colors are different
|
||||||
|
const colors = Array.from(colorMap.values()); |
||||||
|
expect(colors[0]).not.toBe(colors[1]); |
||||||
|
|
||||||
|
// 6. Toggle off
|
||||||
|
highlightsVisible = false; |
||||||
|
expect(highlightsVisible).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles event updates correctly", () => { |
||||||
|
let eventId = "event1"; |
||||||
|
let highlights: any[] = []; |
||||||
|
|
||||||
|
// Initial load
|
||||||
|
highlights = [ |
||||||
|
{ |
||||||
|
id: "h1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "a".repeat(64), |
||||||
|
content: "highlight 1", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(1); |
||||||
|
|
||||||
|
// Event changes
|
||||||
|
eventId = "event2"; |
||||||
|
highlights = []; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(0); |
||||||
|
|
||||||
|
// New highlights loaded
|
||||||
|
highlights = [ |
||||||
|
{ |
||||||
|
id: "h2", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "b".repeat(64), |
||||||
|
content: "highlight 2", |
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
expect(highlights.length).toBe(1); |
||||||
|
expect(highlights[0].id).toBe("h2"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Error Handling", () => { |
||||||
|
it("handles missing event ID and address gracefully", () => { |
||||||
|
const eventId = undefined; |
||||||
|
const eventAddress = undefined; |
||||||
|
|
||||||
|
// Should not attempt to fetch
|
||||||
|
expect(eventId).toBeUndefined(); |
||||||
|
expect(eventAddress).toBeUndefined(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles subscription errors gracefully", () => { |
||||||
|
const error = new Error("Subscription failed"); |
||||||
|
|
||||||
|
// Should log error but not crash
|
||||||
|
expect(error.message).toBe("Subscription failed"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("handles malformed highlight events", () => { |
||||||
|
const malformedHighlight = { |
||||||
|
id: "h1", |
||||||
|
kind: 9802, |
||||||
|
pubkey: "", // Empty pubkey
|
||||||
|
content: undefined, // Missing content
|
||||||
|
created_at: Date.now(), |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
// Should handle gracefully
|
||||||
|
expect(malformedHighlight.pubkey).toBe(""); |
||||||
|
expect(malformedHighlight.content).toBeUndefined(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,875 @@ |
|||||||
|
import { afterEach, beforeEach, describe, expect, it, vi } from "vitest"; |
||||||
|
import { NDKEvent } from "@nostr-dev-kit/ndk"; |
||||||
|
import type NDK from "@nostr-dev-kit/ndk"; |
||||||
|
|
||||||
|
// Mock flowbite-svelte components
|
||||||
|
vi.mock("flowbite-svelte", () => ({ |
||||||
|
Button: vi.fn().mockImplementation((props) => ({ |
||||||
|
$$render: () => |
||||||
|
`<button data-testid="button">${props.children || ""}</button>`, |
||||||
|
})), |
||||||
|
Modal: vi.fn().mockImplementation(() => ({ |
||||||
|
$$render: () => `<div data-testid="modal"></div>`, |
||||||
|
})), |
||||||
|
Textarea: vi.fn().mockImplementation(() => ({ |
||||||
|
$$render: () => `<textarea data-testid="textarea"></textarea>`, |
||||||
|
})), |
||||||
|
P: vi.fn().mockImplementation(() => ({ |
||||||
|
$$render: () => `<p data-testid="p"></p>`, |
||||||
|
})), |
||||||
|
})); |
||||||
|
|
||||||
|
// Mock flowbite-svelte-icons
|
||||||
|
vi.mock("flowbite-svelte-icons", () => ({ |
||||||
|
FontHighlightOutline: vi.fn().mockImplementation(() => ({ |
||||||
|
$$render: () => `<svg data-testid="highlight-icon"></svg>`, |
||||||
|
})), |
||||||
|
})); |
||||||
|
|
||||||
|
describe("HighlightButton Component Logic", () => { |
||||||
|
let isActive: boolean; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
isActive = false; |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Initial State", () => { |
||||||
|
it("should initialize with inactive state", () => { |
||||||
|
expect(isActive).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should have correct inactive label", () => { |
||||||
|
const label = isActive ? "Exit Highlight Mode" : "Add Highlight"; |
||||||
|
expect(label).toBe("Add Highlight"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should have correct inactive title", () => { |
||||||
|
const title = isActive ? "Exit highlight mode" : "Enter highlight mode"; |
||||||
|
expect(title).toBe("Enter highlight mode"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should have correct inactive color", () => { |
||||||
|
const color = isActive ? "primary" : "light"; |
||||||
|
expect(color).toBe("light"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should not have ring styling when inactive", () => { |
||||||
|
const ringClass = isActive ? "ring-2 ring-primary-500" : ""; |
||||||
|
expect(ringClass).toBe(""); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Toggle Functionality", () => { |
||||||
|
it("should toggle to active state when clicked", () => { |
||||||
|
// Simulate toggle
|
||||||
|
isActive = !isActive; |
||||||
|
expect(isActive).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should toggle back to inactive state on second click", () => { |
||||||
|
// Simulate two toggles
|
||||||
|
isActive = !isActive; |
||||||
|
isActive = !isActive; |
||||||
|
expect(isActive).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show correct label when active", () => { |
||||||
|
isActive = true; |
||||||
|
const label = isActive ? "Exit Highlight Mode" : "Add Highlight"; |
||||||
|
expect(label).toBe("Exit Highlight Mode"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show correct title when active", () => { |
||||||
|
isActive = true; |
||||||
|
const title = isActive ? "Exit highlight mode" : "Enter highlight mode"; |
||||||
|
expect(title).toBe("Exit highlight mode"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Active State Styling", () => { |
||||||
|
it("should apply primary color when active", () => { |
||||||
|
isActive = true; |
||||||
|
const color = isActive ? "primary" : "light"; |
||||||
|
expect(color).toBe("primary"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should apply ring styling when active", () => { |
||||||
|
isActive = true; |
||||||
|
const ringClass = isActive ? "ring-2 ring-primary-500" : ""; |
||||||
|
expect(ringClass).toBe("ring-2 ring-primary-500"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("HighlightSelectionHandler Component Logic", () => { |
||||||
|
let mockNDK: NDKEvent; |
||||||
|
let mockUserStore: any; |
||||||
|
let mockSelection: Selection; |
||||||
|
let mockPublicationEvent: NDKEvent; |
||||||
|
let isActive: boolean; |
||||||
|
|
||||||
|
beforeEach(() => { |
||||||
|
// Reset mocks
|
||||||
|
vi.clearAllMocks(); |
||||||
|
isActive = false; |
||||||
|
|
||||||
|
// Mock document and DOM elements
|
||||||
|
const mockElement = { |
||||||
|
createElement: vi.fn((tag: string) => ({ |
||||||
|
tagName: tag.toUpperCase(), |
||||||
|
textContent: "", |
||||||
|
className: "", |
||||||
|
closest: vi.fn(), |
||||||
|
parentElement: null, |
||||||
|
})), |
||||||
|
addEventListener: vi.fn(), |
||||||
|
removeEventListener: vi.fn(), |
||||||
|
body: { |
||||||
|
classList: { |
||||||
|
add: vi.fn(), |
||||||
|
remove: vi.fn(), |
||||||
|
}, |
||||||
|
}, |
||||||
|
}; |
||||||
|
|
||||||
|
global.document = mockElement as any; |
||||||
|
|
||||||
|
// Mock NDK event
|
||||||
|
mockPublicationEvent = { |
||||||
|
id: "test-event-id", |
||||||
|
pubkey: "test-pubkey", |
||||||
|
kind: 30023, |
||||||
|
tagAddress: vi.fn().mockReturnValue("30023:test-pubkey:test-d-tag"), |
||||||
|
tags: [], |
||||||
|
content: "", |
||||||
|
} as unknown as NDKEvent; |
||||||
|
|
||||||
|
// Mock user store
|
||||||
|
mockUserStore = { |
||||||
|
signedIn: true, |
||||||
|
signer: { |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
}, |
||||||
|
}; |
||||||
|
|
||||||
|
// Mock window.getSelection
|
||||||
|
const mockParagraph = { |
||||||
|
textContent: "This is the full paragraph context", |
||||||
|
closest: vi.fn(), |
||||||
|
}; |
||||||
|
|
||||||
|
mockSelection = { |
||||||
|
toString: vi.fn().mockReturnValue("Selected text from publication"), |
||||||
|
isCollapsed: false, |
||||||
|
removeAllRanges: vi.fn(), |
||||||
|
anchorNode: { |
||||||
|
parentElement: mockParagraph, |
||||||
|
}, |
||||||
|
} as unknown as Selection; |
||||||
|
|
||||||
|
global.window = { |
||||||
|
getSelection: vi.fn().mockReturnValue(mockSelection), |
||||||
|
} as any; |
||||||
|
}); |
||||||
|
|
||||||
|
afterEach(() => { |
||||||
|
vi.clearAllMocks(); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Selection Detection", () => { |
||||||
|
it("should ignore mouseup events when isActive is false", () => { |
||||||
|
isActive = false; |
||||||
|
const shouldProcess = isActive; |
||||||
|
expect(shouldProcess).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should process mouseup events when isActive is true", () => { |
||||||
|
isActive = true; |
||||||
|
const shouldProcess = isActive; |
||||||
|
expect(shouldProcess).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should ignore collapsed selections", () => { |
||||||
|
const selection = { isCollapsed: true } as Selection; |
||||||
|
const shouldIgnore = selection.isCollapsed; |
||||||
|
expect(shouldIgnore).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should process non-collapsed selections", () => { |
||||||
|
const selection = { isCollapsed: false } as Selection; |
||||||
|
const shouldIgnore = selection.isCollapsed; |
||||||
|
expect(shouldIgnore).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should ignore selections with less than 3 characters", () => { |
||||||
|
const text = "ab"; |
||||||
|
const isValid = text.length >= 3; |
||||||
|
expect(isValid).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should accept selections with 3 or more characters", () => { |
||||||
|
const text = "abc"; |
||||||
|
const isValid = text.length >= 3; |
||||||
|
expect(isValid).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should ignore empty selections after trim", () => { |
||||||
|
const text = " "; |
||||||
|
const trimmed = text.trim(); |
||||||
|
const isValid = trimmed.length >= 3; |
||||||
|
expect(isValid).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("User Authentication", () => { |
||||||
|
it("should reject selection when user not signed in", () => { |
||||||
|
const userStore = { signedIn: false }; |
||||||
|
expect(userStore.signedIn).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should process selection when user signed in", () => { |
||||||
|
const userStore = { signedIn: true }; |
||||||
|
expect(userStore.signedIn).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should check for signer before creating highlight", () => { |
||||||
|
const userStore = { |
||||||
|
signedIn: true, |
||||||
|
signer: { sign: vi.fn() }, |
||||||
|
}; |
||||||
|
expect(userStore.signer).toBeDefined(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reject creation without signer", () => { |
||||||
|
const userStore = { |
||||||
|
signedIn: true, |
||||||
|
signer: null, |
||||||
|
}; |
||||||
|
expect(userStore.signer).toBeNull(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Publication Context Detection", () => { |
||||||
|
it("should detect selection within publication-leather class", () => { |
||||||
|
const mockElement = { |
||||||
|
className: "publication-leather", |
||||||
|
closest: vi.fn((selector: string) => { |
||||||
|
return selector === ".publication-leather" ? mockElement : null; |
||||||
|
}), |
||||||
|
}; |
||||||
|
const target = mockElement; |
||||||
|
const publicationSection = target.closest(".publication-leather"); |
||||||
|
expect(publicationSection).toBeTruthy(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reject selection outside publication-leather class", () => { |
||||||
|
const mockElement = { |
||||||
|
className: "other-section", |
||||||
|
closest: vi.fn((selector: string) => { |
||||||
|
return selector === ".publication-leather" ? null : mockElement; |
||||||
|
}), |
||||||
|
}; |
||||||
|
const target = mockElement; |
||||||
|
const publicationSection = target.closest(".publication-leather"); |
||||||
|
expect(publicationSection).toBeNull(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Context Extraction", () => { |
||||||
|
it("should extract context from parent paragraph", () => { |
||||||
|
const paragraph = { |
||||||
|
textContent: |
||||||
|
"This is the full paragraph context with selected text inside.", |
||||||
|
}; |
||||||
|
|
||||||
|
const context = paragraph.textContent?.trim() || ""; |
||||||
|
expect(context).toBe( |
||||||
|
"This is the full paragraph context with selected text inside.", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should extract context from parent section", () => { |
||||||
|
const section = { |
||||||
|
textContent: "Full section context including selected text.", |
||||||
|
}; |
||||||
|
|
||||||
|
const context = section.textContent?.trim() || ""; |
||||||
|
expect(context).toBe("Full section context including selected text."); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should extract context from parent div", () => { |
||||||
|
const div = { |
||||||
|
textContent: "Full div context including selected text.", |
||||||
|
}; |
||||||
|
|
||||||
|
const context = div.textContent?.trim() || ""; |
||||||
|
expect(context).toBe("Full div context including selected text."); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle missing context gracefully", () => { |
||||||
|
const context = ""; |
||||||
|
expect(context).toBe(""); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("NIP-84 Event Creation - Addressable Events", () => { |
||||||
|
it("should use 'a' tag for addressable events", () => { |
||||||
|
const eventAddress = "30023:pubkey:d-tag"; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (eventAddress) { |
||||||
|
tags.push(["a", eventAddress, ""]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["a", eventAddress, ""]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should create event with correct kind 9802", () => { |
||||||
|
const event = { |
||||||
|
kind: 9802, |
||||||
|
content: "", |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
expect(event.kind).toBe(9802); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should include selected text as content", () => { |
||||||
|
const selectedText = "This is the selected highlight text"; |
||||||
|
const event = { |
||||||
|
kind: 9802, |
||||||
|
content: selectedText, |
||||||
|
tags: [], |
||||||
|
}; |
||||||
|
|
||||||
|
expect(event.content).toBe(selectedText); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should include context tag", () => { |
||||||
|
const context = "This is the surrounding context"; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (context) { |
||||||
|
tags.push(["context", context]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["context", context]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should include author p-tag with role", () => { |
||||||
|
const pubkey = "author-pubkey-hex"; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (pubkey) { |
||||||
|
tags.push(["p", pubkey, "", "author"]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["p", pubkey, "", "author"]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should include comment tag when comment provided", () => { |
||||||
|
const comment = "This is my insightful comment"; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment.trim()]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["comment", comment]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should not include comment tag when comment is empty", () => { |
||||||
|
const comment = ""; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment.trim()]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).not.toContainEqual(["comment", ""]); |
||||||
|
expect(tags.length).toBe(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should not include comment tag when comment is only whitespace", () => { |
||||||
|
const comment = " "; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment.trim()]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags.length).toBe(0); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("NIP-84 Event Creation - Regular Events", () => { |
||||||
|
it("should use 'e' tag for regular events", () => { |
||||||
|
const eventId = "regular-event-id"; |
||||||
|
const eventAddress = null; // No address means regular event
|
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (eventAddress) { |
||||||
|
tags.push(["a", eventAddress, ""]); |
||||||
|
} else { |
||||||
|
tags.push(["e", eventId, ""]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["e", eventId, ""]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should prefer addressable event over regular event", () => { |
||||||
|
const eventId = "regular-event-id"; |
||||||
|
const eventAddress = "30023:pubkey:d-tag"; |
||||||
|
const tags: string[][] = []; |
||||||
|
|
||||||
|
if (eventAddress) { |
||||||
|
tags.push(["a", eventAddress, ""]); |
||||||
|
} else { |
||||||
|
tags.push(["e", eventId, ""]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(tags).toContainEqual(["a", eventAddress, ""]); |
||||||
|
expect(tags).not.toContainEqual(["e", eventId, ""]); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Complete Event Structure", () => { |
||||||
|
it("should create complete highlight event with all required tags", () => { |
||||||
|
const selectedText = "Highlighted text"; |
||||||
|
const context = "Full context paragraph"; |
||||||
|
const pubkey = "author-pubkey"; |
||||||
|
const eventAddress = "30023:pubkey:d-tag"; |
||||||
|
|
||||||
|
const event = { |
||||||
|
kind: 9802, |
||||||
|
content: selectedText, |
||||||
|
tags: [ |
||||||
|
["a", eventAddress, ""], |
||||||
|
["context", context], |
||||||
|
["p", pubkey, "", "author"], |
||||||
|
], |
||||||
|
}; |
||||||
|
|
||||||
|
expect(event.kind).toBe(9802); |
||||||
|
expect(event.content).toBe(selectedText); |
||||||
|
expect(event.tags).toHaveLength(3); |
||||||
|
expect(event.tags[0]).toEqual(["a", eventAddress, ""]); |
||||||
|
expect(event.tags[1]).toEqual(["context", context]); |
||||||
|
expect(event.tags[2]).toEqual(["p", pubkey, "", "author"]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should create complete quote highlight with comment", () => { |
||||||
|
const selectedText = "Highlighted text"; |
||||||
|
const context = "Full context paragraph"; |
||||||
|
const pubkey = "author-pubkey"; |
||||||
|
const eventAddress = "30023:pubkey:d-tag"; |
||||||
|
const comment = "My thoughtful comment"; |
||||||
|
|
||||||
|
const event = { |
||||||
|
kind: 9802, |
||||||
|
content: selectedText, |
||||||
|
tags: [ |
||||||
|
["a", eventAddress, ""], |
||||||
|
["context", context], |
||||||
|
["p", pubkey, "", "author"], |
||||||
|
["comment", comment], |
||||||
|
], |
||||||
|
}; |
||||||
|
|
||||||
|
expect(event.kind).toBe(9802); |
||||||
|
expect(event.content).toBe(selectedText); |
||||||
|
expect(event.tags).toHaveLength(4); |
||||||
|
expect(event.tags[3]).toEqual(["comment", comment]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle event without context", () => { |
||||||
|
const selectedText = "Highlighted text"; |
||||||
|
const context = ""; |
||||||
|
const pubkey = "author-pubkey"; |
||||||
|
const eventId = "event-id"; |
||||||
|
|
||||||
|
const tags: string[][] = []; |
||||||
|
tags.push(["e", eventId, ""]); |
||||||
|
if (context) { |
||||||
|
tags.push(["context", context]); |
||||||
|
} |
||||||
|
tags.push(["p", pubkey, "", "author"]); |
||||||
|
|
||||||
|
expect(tags).toHaveLength(2); |
||||||
|
expect(tags).not.toContainEqual(["context", ""]); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Event Signing and Publishing", () => { |
||||||
|
it("should sign event before publishing", async () => { |
||||||
|
const mockSigner = { |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
}; |
||||||
|
|
||||||
|
const mockEvent = { |
||||||
|
kind: 9802, |
||||||
|
content: "test", |
||||||
|
tags: [], |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
publish: vi.fn().mockResolvedValue(undefined), |
||||||
|
}; |
||||||
|
|
||||||
|
await mockEvent.sign(mockSigner); |
||||||
|
expect(mockEvent.sign).toHaveBeenCalledWith(mockSigner); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should publish event after signing", async () => { |
||||||
|
const mockEvent = { |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
publish: vi.fn().mockResolvedValue(undefined), |
||||||
|
}; |
||||||
|
|
||||||
|
await mockEvent.sign({}); |
||||||
|
await mockEvent.publish(); |
||||||
|
|
||||||
|
expect(mockEvent.publish).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle signing errors", async () => { |
||||||
|
const mockEvent = { |
||||||
|
sign: vi.fn().mockRejectedValue(new Error("Signing failed")), |
||||||
|
}; |
||||||
|
|
||||||
|
await expect(mockEvent.sign({})).rejects.toThrow("Signing failed"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle publishing errors", async () => { |
||||||
|
const mockEvent = { |
||||||
|
sign: vi.fn().mockResolvedValue(undefined), |
||||||
|
publish: vi.fn().mockRejectedValue(new Error("Publishing failed")), |
||||||
|
}; |
||||||
|
|
||||||
|
await mockEvent.sign({}); |
||||||
|
await expect(mockEvent.publish()).rejects.toThrow("Publishing failed"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Selection Cleanup", () => { |
||||||
|
it("should clear selection after successful highlight creation", () => { |
||||||
|
const mockSelection = { |
||||||
|
removeAllRanges: vi.fn(), |
||||||
|
}; |
||||||
|
|
||||||
|
mockSelection.removeAllRanges(); |
||||||
|
expect(mockSelection.removeAllRanges).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reset selectedText after creation", () => { |
||||||
|
let selectedText = "Some text"; |
||||||
|
selectedText = ""; |
||||||
|
expect(selectedText).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reset comment after creation", () => { |
||||||
|
let comment = "Some comment"; |
||||||
|
comment = ""; |
||||||
|
expect(comment).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reset context after creation", () => { |
||||||
|
let context = "Some context"; |
||||||
|
context = ""; |
||||||
|
expect(context).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should close modal after creation", () => { |
||||||
|
let showModal = true; |
||||||
|
showModal = false; |
||||||
|
expect(showModal).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Cancel Functionality", () => { |
||||||
|
it("should clear selection when cancelled", () => { |
||||||
|
const mockSelection = { |
||||||
|
removeAllRanges: vi.fn(), |
||||||
|
}; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
mockSelection.removeAllRanges(); |
||||||
|
expect(mockSelection.removeAllRanges).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reset all state when cancelled", () => { |
||||||
|
let selectedText = "text"; |
||||||
|
let comment = "comment"; |
||||||
|
let context = "context"; |
||||||
|
let showModal = true; |
||||||
|
|
||||||
|
// Simulate cancel
|
||||||
|
selectedText = ""; |
||||||
|
comment = ""; |
||||||
|
context = ""; |
||||||
|
showModal = false; |
||||||
|
|
||||||
|
expect(selectedText).toBe(""); |
||||||
|
expect(comment).toBe(""); |
||||||
|
expect(context).toBe(""); |
||||||
|
expect(showModal).toBe(false); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Feedback Messages", () => { |
||||||
|
it("should show success message after creation", () => { |
||||||
|
const message = "Highlight created successfully!"; |
||||||
|
const type = "success"; |
||||||
|
|
||||||
|
expect(message).toBe("Highlight created successfully!"); |
||||||
|
expect(type).toBe("success"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show error message on failure", () => { |
||||||
|
const message = "Failed to create highlight. Please try again."; |
||||||
|
const type = "error"; |
||||||
|
|
||||||
|
expect(message).toBe("Failed to create highlight. Please try again."); |
||||||
|
expect(type).toBe("error"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show error when not signed in", () => { |
||||||
|
const message = "Please sign in to create highlights"; |
||||||
|
const type = "error"; |
||||||
|
|
||||||
|
expect(message).toBe("Please sign in to create highlights"); |
||||||
|
expect(type).toBe("error"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should auto-hide feedback after delay", () => { |
||||||
|
let showFeedback = true; |
||||||
|
|
||||||
|
// Simulate timeout
|
||||||
|
setTimeout(() => { |
||||||
|
showFeedback = false; |
||||||
|
}, 3000); |
||||||
|
|
||||||
|
// Initially shown
|
||||||
|
expect(showFeedback).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Event Listeners", () => { |
||||||
|
it("should add mouseup listener on mount", () => { |
||||||
|
const mockAddEventListener = vi.fn(); |
||||||
|
document.addEventListener = mockAddEventListener; |
||||||
|
|
||||||
|
document.addEventListener("mouseup", () => {}); |
||||||
|
expect(mockAddEventListener).toHaveBeenCalledWith( |
||||||
|
"mouseup", |
||||||
|
expect.any(Function), |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should remove mouseup listener on unmount", () => { |
||||||
|
const mockRemoveEventListener = vi.fn(); |
||||||
|
document.removeEventListener = mockRemoveEventListener; |
||||||
|
|
||||||
|
const handler = () => {}; |
||||||
|
document.removeEventListener("mouseup", handler); |
||||||
|
expect(mockRemoveEventListener).toHaveBeenCalledWith("mouseup", handler); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Highlight Mode Body Class", () => { |
||||||
|
it("should add highlight-mode-active class when active", () => { |
||||||
|
const mockClassList = { |
||||||
|
add: vi.fn(), |
||||||
|
remove: vi.fn(), |
||||||
|
}; |
||||||
|
document.body.classList = mockClassList as any; |
||||||
|
|
||||||
|
// Simulate active mode
|
||||||
|
document.body.classList.add("highlight-mode-active"); |
||||||
|
expect(mockClassList.add).toHaveBeenCalledWith("highlight-mode-active"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should remove highlight-mode-active class when inactive", () => { |
||||||
|
const mockClassList = { |
||||||
|
add: vi.fn(), |
||||||
|
remove: vi.fn(), |
||||||
|
}; |
||||||
|
document.body.classList = mockClassList as any; |
||||||
|
|
||||||
|
// Simulate inactive mode
|
||||||
|
document.body.classList.remove("highlight-mode-active"); |
||||||
|
expect(mockClassList.remove).toHaveBeenCalledWith( |
||||||
|
"highlight-mode-active", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should clean up class on unmount", () => { |
||||||
|
const mockClassList = { |
||||||
|
add: vi.fn(), |
||||||
|
remove: vi.fn(), |
||||||
|
}; |
||||||
|
document.body.classList = mockClassList as any; |
||||||
|
|
||||||
|
// Simulate cleanup
|
||||||
|
document.body.classList.remove("highlight-mode-active"); |
||||||
|
expect(mockClassList.remove).toHaveBeenCalledWith( |
||||||
|
"highlight-mode-active", |
||||||
|
); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Modal Display", () => { |
||||||
|
it("should show modal when text is selected", () => { |
||||||
|
let showModal = false; |
||||||
|
|
||||||
|
// Simulate successful selection
|
||||||
|
showModal = true; |
||||||
|
expect(showModal).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should display selected text in modal", () => { |
||||||
|
const selectedText = "This is the selected text"; |
||||||
|
const displayText = `"${selectedText}"`; |
||||||
|
|
||||||
|
expect(displayText).toBe('"This is the selected text"'); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should provide textarea for optional comment", () => { |
||||||
|
let comment = ""; |
||||||
|
const placeholder = "Share your thoughts about this highlight..."; |
||||||
|
|
||||||
|
expect(placeholder).toBe("Share your thoughts about this highlight..."); |
||||||
|
expect(comment).toBe(""); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should disable buttons while submitting", () => { |
||||||
|
const isSubmitting = true; |
||||||
|
const disabled = isSubmitting; |
||||||
|
|
||||||
|
expect(disabled).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show 'Creating...' text while submitting", () => { |
||||||
|
const isSubmitting = true; |
||||||
|
const buttonText = isSubmitting ? "Creating..." : "Create Highlight"; |
||||||
|
|
||||||
|
expect(buttonText).toBe("Creating..."); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should show normal text when not submitting", () => { |
||||||
|
const isSubmitting = false; |
||||||
|
const buttonText = isSubmitting ? "Creating..." : "Create Highlight"; |
||||||
|
|
||||||
|
expect(buttonText).toBe("Create Highlight"); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Callback Execution", () => { |
||||||
|
it("should call onHighlightCreated callback after creation", () => { |
||||||
|
const mockCallback = vi.fn(); |
||||||
|
|
||||||
|
// Simulate successful creation
|
||||||
|
mockCallback(); |
||||||
|
|
||||||
|
expect(mockCallback).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should not call callback if creation fails", () => { |
||||||
|
const mockCallback = vi.fn(); |
||||||
|
|
||||||
|
// Simulate failed creation - callback not called
|
||||||
|
expect(mockCallback).not.toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
describe("Integration Scenarios", () => { |
||||||
|
it("should handle complete highlight workflow", () => { |
||||||
|
// Setup
|
||||||
|
let isActive = true; |
||||||
|
let showModal = false; |
||||||
|
let selectedText = ""; |
||||||
|
const userSignedIn = true; |
||||||
|
const selection = { |
||||||
|
toString: () => "Selected text for highlighting", |
||||||
|
isCollapsed: false, |
||||||
|
}; |
||||||
|
|
||||||
|
// User selects text
|
||||||
|
if (isActive && userSignedIn && !selection.isCollapsed) { |
||||||
|
selectedText = selection.toString(); |
||||||
|
showModal = true; |
||||||
|
} |
||||||
|
|
||||||
|
expect(selectedText).toBe("Selected text for highlighting"); |
||||||
|
expect(showModal).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle complete quote highlight workflow with comment", () => { |
||||||
|
// Setup
|
||||||
|
let isActive = true; |
||||||
|
let showModal = false; |
||||||
|
let selectedText = ""; |
||||||
|
let comment = ""; |
||||||
|
const userSignedIn = true; |
||||||
|
const selection = { |
||||||
|
toString: () => "Selected text", |
||||||
|
isCollapsed: false, |
||||||
|
}; |
||||||
|
|
||||||
|
// User selects text
|
||||||
|
if (isActive && userSignedIn && !selection.isCollapsed) { |
||||||
|
selectedText = selection.toString(); |
||||||
|
showModal = true; |
||||||
|
} |
||||||
|
|
||||||
|
// User adds comment
|
||||||
|
comment = "This is insightful"; |
||||||
|
|
||||||
|
// Create event with comment
|
||||||
|
const tags: string[][] = []; |
||||||
|
if (comment.trim()) { |
||||||
|
tags.push(["comment", comment]); |
||||||
|
} |
||||||
|
|
||||||
|
expect(selectedText).toBe("Selected text"); |
||||||
|
expect(comment).toBe("This is insightful"); |
||||||
|
expect(tags).toContainEqual(["comment", "This is insightful"]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should reject workflow when user not signed in", () => { |
||||||
|
let isActive = true; |
||||||
|
let showModal = false; |
||||||
|
const userSignedIn = false; |
||||||
|
const selection = { |
||||||
|
toString: () => "Selected text", |
||||||
|
isCollapsed: false, |
||||||
|
}; |
||||||
|
|
||||||
|
// User tries to select text
|
||||||
|
if (isActive && userSignedIn && !selection.isCollapsed) { |
||||||
|
showModal = true; |
||||||
|
} |
||||||
|
|
||||||
|
expect(showModal).toBe(false); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle workflow cancellation", () => { |
||||||
|
// Setup initial state
|
||||||
|
let showModal = true; |
||||||
|
let selectedText = "Some text"; |
||||||
|
let comment = "Some comment"; |
||||||
|
const mockSelection = { |
||||||
|
removeAllRanges: vi.fn(), |
||||||
|
}; |
||||||
|
|
||||||
|
// User cancels
|
||||||
|
showModal = false; |
||||||
|
selectedText = ""; |
||||||
|
comment = ""; |
||||||
|
mockSelection.removeAllRanges(); |
||||||
|
|
||||||
|
expect(showModal).toBe(false); |
||||||
|
expect(selectedText).toBe(""); |
||||||
|
expect(comment).toBe(""); |
||||||
|
expect(mockSelection.removeAllRanges).toHaveBeenCalled(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,289 @@ |
|||||||
|
/** |
||||||
|
* TDD Tests for NKBIP-01 Publication Tree Processor |
||||||
|
* |
||||||
|
* Tests the iterative parsing function at different hierarchy levels |
||||||
|
* using deep_hierarchy_test.adoc to verify NKBIP-01 compliance. |
||||||
|
*/ |
||||||
|
|
||||||
|
import { beforeAll, describe, expect, it } from "vitest"; |
||||||
|
import { readFileSync } from "fs"; |
||||||
|
import { |
||||||
|
getSupportedParseLevels, |
||||||
|
parseAsciiDocWithTree, |
||||||
|
validateParseLevel, |
||||||
|
} from "../../src/lib/utils/asciidoc_publication_parser.js"; |
||||||
|
|
||||||
|
// Mock NDK for testing
|
||||||
|
const mockNDK = { |
||||||
|
activeUser: { |
||||||
|
pubkey: "test-pubkey-12345", |
||||||
|
}, |
||||||
|
} as any; |
||||||
|
|
||||||
|
// Read the test document
|
||||||
|
const testDocumentPath = "./test_data/AsciidocFiles/deep_hierarchy_test.adoc"; |
||||||
|
let testContent: string; |
||||||
|
|
||||||
|
try { |
||||||
|
testContent = readFileSync(testDocumentPath, "utf-8"); |
||||||
|
} catch (error) { |
||||||
|
console.error("Failed to read test document:", error); |
||||||
|
testContent = `= Deep Hierarchical Document Test
|
||||||
|
:tags: testing, hierarchy, structure |
||||||
|
:author: Test Author |
||||||
|
:type: technical |
||||||
|
|
||||||
|
This document tests all 6 levels of AsciiDoc hierarchy to validate our parse level system. |
||||||
|
|
||||||
|
== Level 2: Main Sections |
||||||
|
:tags: level2, main |
||||||
|
|
||||||
|
This is a level 2 section that should appear in all parse levels. |
||||||
|
|
||||||
|
=== Level 3: Subsections
|
||||||
|
:tags: level3, subsection |
||||||
|
|
||||||
|
This is a level 3 section that should appear in parse levels 3-6. |
||||||
|
|
||||||
|
==== Level 4: Sub-subsections |
||||||
|
:tags: level4, detailed |
||||||
|
|
||||||
|
This is a level 4 section that should appear in parse levels 4-6. |
||||||
|
|
||||||
|
===== Level 5: Deep Subsections |
||||||
|
:tags: level5, deep |
||||||
|
|
||||||
|
This is a level 5 section that should only appear in parse levels 5-6. |
||||||
|
|
||||||
|
====== Level 6: Deepest Level |
||||||
|
:tags: level6, deepest |
||||||
|
|
||||||
|
This is a level 6 section that should only appear in parse level 6. |
||||||
|
|
||||||
|
Content at the deepest level of our hierarchy. |
||||||
|
|
||||||
|
== Level 2: Second Main Section |
||||||
|
:tags: level2, main, second |
||||||
|
|
||||||
|
A second main section to ensure we have balanced content at the top level.`;
|
||||||
|
} |
||||||
|
|
||||||
|
describe("NKBIP-01 Publication Tree Processor", () => { |
||||||
|
it("should validate parse levels correctly", () => { |
||||||
|
// Test valid parse levels
|
||||||
|
expect(validateParseLevel(2)).toBe(true); |
||||||
|
expect(validateParseLevel(3)).toBe(true); |
||||||
|
expect(validateParseLevel(5)).toBe(true); |
||||||
|
|
||||||
|
// Test invalid parse levels
|
||||||
|
expect(validateParseLevel(1)).toBe(false); |
||||||
|
expect(validateParseLevel(6)).toBe(false); |
||||||
|
expect(validateParseLevel(7)).toBe(false); |
||||||
|
expect(validateParseLevel(2.5)).toBe(false); |
||||||
|
expect(validateParseLevel(-1)).toBe(false); |
||||||
|
|
||||||
|
// Test supported levels array
|
||||||
|
const supportedLevels = getSupportedParseLevels(); |
||||||
|
expect(supportedLevels).toEqual([2, 3, 4, 5]); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should parse Level 2 with NKBIP-01 minimal structure", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 2); |
||||||
|
|
||||||
|
// Should be detected as article (has title and sections)
|
||||||
|
expect(result.metadata.contentType).toBe("article"); |
||||||
|
expect(result.metadata.parseLevel).toBe(2); |
||||||
|
expect(result.metadata.title).toBe("Deep Hierarchical Document Test"); |
||||||
|
|
||||||
|
// Should have 1 index event (30040) + 2 content events (30041) for level 2 sections
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
expect(result.indexEvent?.kind).toBe(30040); |
||||||
|
expect(result.contentEvents.length).toBe(2); |
||||||
|
|
||||||
|
// All content events should be kind 30041
|
||||||
|
result.contentEvents.forEach((event) => { |
||||||
|
expect(event.kind).toBe(30041); |
||||||
|
}); |
||||||
|
|
||||||
|
// Check titles of level 2 sections
|
||||||
|
const contentTitles = result.contentEvents.map((e) => |
||||||
|
e.tags.find((t: string[]) => t[0] === "title")?.[1] |
||||||
|
); |
||||||
|
expect(contentTitles).toContain("Level 2: Main Sections"); |
||||||
|
expect(contentTitles).toContain("Level 2: Second Main Section"); |
||||||
|
|
||||||
|
// Content should include all nested subsections as AsciiDoc
|
||||||
|
const firstSectionContent = result.contentEvents[0].content; |
||||||
|
expect(firstSectionContent).toBeDefined(); |
||||||
|
// Should contain level 3, 4, 5 content as nested AsciiDoc markup
|
||||||
|
expect(firstSectionContent.includes("=== Level 3: Subsections")).toBe(true); |
||||||
|
expect(firstSectionContent.includes("==== Level 4: Sub-subsections")).toBe( |
||||||
|
true, |
||||||
|
); |
||||||
|
expect(firstSectionContent.includes("===== Level 5: Deep Subsections")) |
||||||
|
.toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should parse Level 3 with NKBIP-01 intermediate structure", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 3); |
||||||
|
|
||||||
|
expect(result.metadata.contentType).toBe("article"); |
||||||
|
expect(result.metadata.parseLevel).toBe(3); |
||||||
|
|
||||||
|
// Should have hierarchical structure
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
expect(result.indexEvent?.kind).toBe(30040); |
||||||
|
|
||||||
|
// Should have mix of 30040 (for level 2 sections with children) and 30041 (for content)
|
||||||
|
const kinds = result.contentEvents.map((e) => e.kind); |
||||||
|
expect(kinds).toContain(30040); // Level 2 sections with children
|
||||||
|
expect(kinds).toContain(30041); // Level 3 content sections
|
||||||
|
|
||||||
|
// Level 2 sections with children should be 30040 index events
|
||||||
|
const level2WithChildrenEvents = result.contentEvents.filter((e) => |
||||||
|
e.kind === 30040 && |
||||||
|
e.tags.find((t: string[]) => t[0] === "title")?.[1]?.includes("Level 2:") |
||||||
|
); |
||||||
|
expect(level2WithChildrenEvents.length).toBe(2); // Both level 2 sections have children
|
||||||
|
|
||||||
|
// Should have 30041 events for level 3 content
|
||||||
|
const level3ContentEvents = result.contentEvents.filter((e) => |
||||||
|
e.kind === 30041 && |
||||||
|
e.tags.find((t: string[]) => t[0] === "title")?.[1]?.includes("Level 3:") |
||||||
|
); |
||||||
|
expect(level3ContentEvents.length).toBeGreaterThan(0); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should parse Level 4 with NKBIP-01 detailed structure", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 4); |
||||||
|
|
||||||
|
expect(result.metadata.contentType).toBe("article"); |
||||||
|
expect(result.metadata.parseLevel).toBe(4); |
||||||
|
|
||||||
|
// Should have hierarchical structure with mix of 30040 and 30041 events
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
expect(result.indexEvent?.kind).toBe(30040); |
||||||
|
|
||||||
|
const kinds = result.contentEvents.map((e) => e.kind); |
||||||
|
expect(kinds).toContain(30040); // Level 2 sections with children
|
||||||
|
expect(kinds).toContain(30041); // Content sections
|
||||||
|
|
||||||
|
// Check that we have level 4 content sections
|
||||||
|
const contentTitles = result.contentEvents.map((e) => |
||||||
|
e.tags.find((t: string[]) => t[0] === "title")?.[1] |
||||||
|
); |
||||||
|
expect(contentTitles).toContain("Level 4: Sub-subsections"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should parse Level 5 with NKBIP-01 maximum depth", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 5); |
||||||
|
|
||||||
|
expect(result.metadata.contentType).toBe("article"); |
||||||
|
expect(result.metadata.parseLevel).toBe(5); |
||||||
|
|
||||||
|
// Should have hierarchical structure
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
expect(result.indexEvent?.kind).toBe(30040); |
||||||
|
|
||||||
|
// Should include level 5 sections as content events
|
||||||
|
const contentTitles = result.contentEvents.map((e) => |
||||||
|
e.tags.find((t: string[]) => t[0] === "title")?.[1] |
||||||
|
); |
||||||
|
expect(contentTitles).toContain("Level 5: Deep Subsections"); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should validate event structure correctly", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 3); |
||||||
|
|
||||||
|
// Test index event structure
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
expect(result.indexEvent?.kind).toBe(30040); |
||||||
|
expect(result.indexEvent?.tags).toBeDefined(); |
||||||
|
|
||||||
|
// Check required tags
|
||||||
|
const indexTags = result.indexEvent!.tags; |
||||||
|
const dTag = indexTags.find((t: string[]) => t[0] === "d"); |
||||||
|
const titleTag = indexTags.find((t: string[]) => t[0] === "title"); |
||||||
|
|
||||||
|
expect(dTag).toBeDefined(); |
||||||
|
expect(titleTag).toBeDefined(); |
||||||
|
expect(titleTag![1]).toBe("Deep Hierarchical Document Test"); |
||||||
|
|
||||||
|
// Test content events structure - mix of 30040 and 30041
|
||||||
|
result.contentEvents.forEach((event) => { |
||||||
|
expect([30040, 30041]).toContain(event.kind); |
||||||
|
expect(event.tags).toBeDefined(); |
||||||
|
expect(event.content).toBeDefined(); |
||||||
|
|
||||||
|
const eventTitleTag = event.tags.find((t: string[]) => t[0] === "title"); |
||||||
|
expect(eventTitleTag).toBeDefined(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should preserve content as AsciiDoc", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 2); |
||||||
|
|
||||||
|
// Content should be preserved as original AsciiDoc, not converted to HTML
|
||||||
|
const firstEvent = result.contentEvents[0]; |
||||||
|
expect(firstEvent.content).toBeDefined(); |
||||||
|
|
||||||
|
// Should contain AsciiDoc markup, not HTML
|
||||||
|
expect(firstEvent.content.includes("<")).toBe(false); |
||||||
|
expect(firstEvent.content.includes("===")).toBe(true); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle attributes correctly", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 2); |
||||||
|
|
||||||
|
// Document-level attributes should be in index event
|
||||||
|
expect(result.indexEvent).toBeDefined(); |
||||||
|
const indexTags = result.indexEvent!.tags; |
||||||
|
|
||||||
|
// Check for document attributes
|
||||||
|
const authorTag = indexTags.find((t: string[]) => t[0] === "author"); |
||||||
|
const typeTag = indexTags.find((t: string[]) => t[0] === "type"); |
||||||
|
const tagsTag = indexTags.find((t: string[]) => t[0] === "t"); |
||||||
|
|
||||||
|
expect(authorTag?.[1]).toBe("Test Author"); |
||||||
|
expect(typeTag?.[1]).toBe("technical"); |
||||||
|
expect(tagsTag).toBeDefined(); // Should have at least one t-tag
|
||||||
|
}); |
||||||
|
|
||||||
|
it("should handle scattered notes mode", async () => { |
||||||
|
// Test with content that has no document title (scattered notes)
|
||||||
|
const scatteredContent = `== First Note
|
||||||
|
:tags: note1 |
||||||
|
|
||||||
|
Content of first note. |
||||||
|
|
||||||
|
== Second Note
|
||||||
|
:tags: note2 |
||||||
|
|
||||||
|
Content of second note.`;
|
||||||
|
|
||||||
|
const result = await parseAsciiDocWithTree(scatteredContent, mockNDK, 2); |
||||||
|
|
||||||
|
expect(result.metadata.contentType).toBe("scattered-notes"); |
||||||
|
expect(result.indexEvent).toBeNull(); // No index event for scattered notes
|
||||||
|
expect(result.contentEvents.length).toBe(2); |
||||||
|
|
||||||
|
// All events should be 30041 content events
|
||||||
|
result.contentEvents.forEach((event) => { |
||||||
|
expect(event.kind).toBe(30041); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
it("should integrate with PublicationTree structure", async () => { |
||||||
|
const result = await parseAsciiDocWithTree(testContent, mockNDK, 2); |
||||||
|
|
||||||
|
// Should have a PublicationTree instance
|
||||||
|
expect(result.tree).toBeDefined(); |
||||||
|
|
||||||
|
// Tree should have methods for event management
|
||||||
|
expect(typeof result.tree.addEvent).toBe("function"); |
||||||
|
|
||||||
|
// Event structure should be populated
|
||||||
|
expect(result.metadata.eventStructure).toBeDefined(); |
||||||
|
expect(Array.isArray(result.metadata.eventStructure)).toBe(true); |
||||||
|
}); |
||||||
|
}); |
||||||
@ -0,0 +1,660 @@ |
|||||||
|
#!/usr/bin/env node |
||||||
|
|
||||||
|
/** |
||||||
|
* Test-Driven Development for ZettelPublisher Enhancement |
||||||
|
* Based on understanding_knowledge.adoc, desire.adoc, and docreference.md |
||||||
|
* |
||||||
|
* Key Requirements Discovered: |
||||||
|
* 1. ITERATIVE parsing (not recursive): sections at target level become events |
||||||
|
* 2. Level 2: == sections become 30041 events containing ALL subsections (===, ====, etc.) |
||||||
|
* 3. Level 3: == sections become 30040 indices, === sections become 30041 events |
||||||
|
* 4. 30040 metadata: from document level (= title with :attributes:) |
||||||
|
* 5. 30041 metadata: from section level attributes |
||||||
|
* 6. Smart publishing: articles (=) vs scattered notes (==) |
||||||
|
* 7. Custom attributes: all :key: value pairs preserved as event tags |
||||||
|
*/ |
||||||
|
|
||||||
|
import fs from "fs"; |
||||||
|
import path from "path"; |
||||||
|
|
||||||
|
// Test framework
|
||||||
|
interface TestCase { |
||||||
|
name: string; |
||||||
|
fn: () => void | Promise<void>; |
||||||
|
} |
||||||
|
|
||||||
|
class TestFramework { |
||||||
|
private tests: TestCase[] = []; |
||||||
|
private passed: number = 0; |
||||||
|
private failed: number = 0; |
||||||
|
|
||||||
|
test(name: string, fn: () => void | Promise<void>): void { |
||||||
|
this.tests.push({ name, fn }); |
||||||
|
} |
||||||
|
|
||||||
|
expect(actual: any) { |
||||||
|
return { |
||||||
|
toBe: (expected: any) => { |
||||||
|
if (actual === expected) return true; |
||||||
|
throw new Error(`Expected ${expected}, got ${actual}`); |
||||||
|
}, |
||||||
|
toEqual: (expected: any) => { |
||||||
|
if (JSON.stringify(actual) === JSON.stringify(expected)) return true; |
||||||
|
throw new Error( |
||||||
|
`Expected ${JSON.stringify(expected)}, got ${JSON.stringify(actual)}`, |
||||||
|
); |
||||||
|
}, |
||||||
|
toContain: (expected: any) => { |
||||||
|
if (actual && actual.includes && actual.includes(expected)) return true; |
||||||
|
throw new Error(`Expected "${actual}" to contain "${expected}"`); |
||||||
|
}, |
||||||
|
not: { |
||||||
|
toContain: (expected: any) => { |
||||||
|
if (actual && actual.includes && !actual.includes(expected)) { |
||||||
|
return true; |
||||||
|
} |
||||||
|
throw new Error(`Expected "${actual}" NOT to contain "${expected}"`); |
||||||
|
}, |
||||||
|
}, |
||||||
|
toBeTruthy: () => { |
||||||
|
if (actual) return true; |
||||||
|
throw new Error(`Expected truthy value, got ${actual}`); |
||||||
|
}, |
||||||
|
toHaveLength: (expected: number) => { |
||||||
|
if (actual && actual.length === expected) return true; |
||||||
|
throw new Error( |
||||||
|
`Expected length ${expected}, got ${ |
||||||
|
actual ? actual.length : "undefined" |
||||||
|
}`,
|
||||||
|
); |
||||||
|
}, |
||||||
|
}; |
||||||
|
} |
||||||
|
|
||||||
|
async run() { |
||||||
|
console.log(`🧪 Running ${this.tests.length} tests...\n`); |
||||||
|
|
||||||
|
for (const { name, fn } of this.tests) { |
||||||
|
try { |
||||||
|
await fn(); |
||||||
|
console.log(`✅ ${name}`); |
||||||
|
this.passed++; |
||||||
|
} catch (error: unknown) { |
||||||
|
console.log(`❌ ${name}`); |
||||||
|
const message = error instanceof Error ? error.message : String(error); |
||||||
|
console.log(` ${message}\n`); |
||||||
|
this.failed++; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
console.log(`\n📊 Results: ${this.passed} passed, ${this.failed} failed`); |
||||||
|
return this.failed === 0; |
||||||
|
} |
||||||
|
} |
||||||
|
|
||||||
|
const test = new TestFramework(); |
||||||
|
|
||||||
|
// Load test data files
|
||||||
|
const testDataPath = path.join(process.cwd(), "test_data", "AsciidocFiles"); |
||||||
|
const understandingKnowledge = fs.readFileSync( |
||||||
|
path.join(testDataPath, "understanding_knowledge.adoc"), |
||||||
|
"utf-8", |
||||||
|
); |
||||||
|
const desire = fs.readFileSync(path.join(testDataPath, "desire.adoc"), "utf-8"); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// PHASE 1: Core Data Structure Tests (Based on Real Test Data)
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
test.test("Understanding Knowledge: Document metadata should be extracted from = level", () => { |
||||||
|
// Expected 30040 metadata from understanding_knowledge.adoc
|
||||||
|
const expectedDocMetadata = { |
||||||
|
title: "Understanding Knowledge", |
||||||
|
image: "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg", |
||||||
|
published: "2025-04-21", |
||||||
|
language: "en, ISO-639-1", |
||||||
|
tags: ["knowledge", "philosophy", "education"], |
||||||
|
type: "text", |
||||||
|
}; |
||||||
|
|
||||||
|
// Test will pass when document parsing extracts these correctly
|
||||||
|
test.expect(expectedDocMetadata.title).toBe("Understanding Knowledge"); |
||||||
|
test.expect(expectedDocMetadata.tags).toHaveLength(3); |
||||||
|
test.expect(expectedDocMetadata.type).toBe("text"); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Desire: Document metadata should include all custom attributes", () => { |
||||||
|
// Expected 30040 metadata from desire.adoc
|
||||||
|
const expectedDocMetadata = { |
||||||
|
title: "Desire Part 1: Mimesis", |
||||||
|
image: "https://i.nostr.build/hGzyi4c3YhTwoCCe.png", |
||||||
|
published: "2025-07-02", |
||||||
|
language: "en, ISO-639-1", |
||||||
|
tags: ["memetics", "philosophy", "desire"], |
||||||
|
type: "podcastArticle", |
||||||
|
}; |
||||||
|
|
||||||
|
test.expect(expectedDocMetadata.type).toBe("podcastArticle"); |
||||||
|
test.expect(expectedDocMetadata.tags).toContain("memetics"); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Iterative ParsedAsciiDoc interface should support level-based parsing", () => { |
||||||
|
// Test the ITERATIVE interface structure (not recursive)
|
||||||
|
// Based on docreference.md - Level 2 parsing example
|
||||||
|
const mockLevel2Structure = { |
||||||
|
metadata: { |
||||||
|
title: "Programming Fundamentals Guide", |
||||||
|
tags: ["programming", "fundamentals"], |
||||||
|
}, |
||||||
|
content: "This is the main introduction to the programming guide.", |
||||||
|
title: "Programming Fundamentals Guide", |
||||||
|
sections: [ |
||||||
|
{ |
||||||
|
metadata: { |
||||||
|
title: "Data Structures", |
||||||
|
tags: ["arrays", "lists", "trees"], |
||||||
|
difficulty: "intermediate", |
||||||
|
}, |
||||||
|
content: |
||||||
|
`Understanding fundamental data structures is crucial for effective programming.
|
||||||
|
|
||||||
|
=== Arrays and Lists |
||||||
|
|
||||||
|
Arrays are contiguous memory blocks that store elements of the same type. |
||||||
|
Lists provide dynamic sizing capabilities. |
||||||
|
|
||||||
|
==== Dynamic Arrays |
||||||
|
|
||||||
|
Dynamic arrays automatically resize when capacity is exceeded. |
||||||
|
|
||||||
|
==== Linked Lists |
||||||
|
|
||||||
|
Linked lists use pointers to connect elements. |
||||||
|
|
||||||
|
=== Trees and Graphs |
||||||
|
|
||||||
|
Tree and graph structures enable hierarchical and networked data representation.`,
|
||||||
|
title: "Data Structures", |
||||||
|
}, |
||||||
|
{ |
||||||
|
metadata: { |
||||||
|
title: "Algorithms", |
||||||
|
tags: ["sorting", "searching", "optimization"], |
||||||
|
difficulty: "advanced", |
||||||
|
}, |
||||||
|
content: |
||||||
|
`Algorithmic thinking forms the foundation of efficient problem-solving.
|
||||||
|
|
||||||
|
=== Sorting Algorithms |
||||||
|
|
||||||
|
Different sorting algorithms offer various trade-offs between time and space complexity. |
||||||
|
|
||||||
|
==== Bubble Sort |
||||||
|
|
||||||
|
Bubble sort repeatedly steps through the list, compares adjacent elements. |
||||||
|
|
||||||
|
==== Quick Sort |
||||||
|
|
||||||
|
Quick sort uses divide-and-conquer approach with pivot selection.`,
|
||||||
|
title: "Algorithms", |
||||||
|
}, |
||||||
|
], |
||||||
|
}; |
||||||
|
|
||||||
|
// Verify ITERATIVE structure: only level 2 sections, containing ALL subsections
|
||||||
|
test.expect(mockLevel2Structure.sections).toHaveLength(2); |
||||||
|
test.expect(mockLevel2Structure.sections[0].title).toBe("Data Structures"); |
||||||
|
test.expect(mockLevel2Structure.sections[0].content).toContain( |
||||||
|
"=== Arrays and Lists", |
||||||
|
); |
||||||
|
test.expect(mockLevel2Structure.sections[0].content).toContain( |
||||||
|
"==== Dynamic Arrays", |
||||||
|
); |
||||||
|
test.expect(mockLevel2Structure.sections[1].content).toContain( |
||||||
|
"==== Quick Sort", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// PHASE 2: Content Processing Tests (Header Separation)
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
test.test("Section content should NOT contain its own header", () => { |
||||||
|
// From understanding_knowledge.adoc: "== Preface" section
|
||||||
|
const expectedPrefaceContent = `[NOTE]
|
||||||
|
This essay was written to outline and elaborate on the purpose of the Nostr client Alexandria. No formal academic citations are included as this serves primarily as a conceptual foundation, inviting readers to experience related ideas connecting and forming as more content becomes uploaded. Traces of AI edits and guidance are left, but the essay style is still my own. Over time this essay may change its wording, structure and content. |
||||||
|
-- liminal`;
|
||||||
|
|
||||||
|
// Should NOT contain "== Preface"
|
||||||
|
test.expect(expectedPrefaceContent).not.toContain("== Preface"); |
||||||
|
test.expect(expectedPrefaceContent).toContain("[NOTE]"); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Introduction section should separate from its subsections", () => { |
||||||
|
// From understanding_knowledge.adoc
|
||||||
|
const expectedIntroContent = |
||||||
|
`image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]`; |
||||||
|
|
||||||
|
// Should NOT contain subsection content or headers
|
||||||
|
test.expect(expectedIntroContent).not.toContain("=== Why Investigate"); |
||||||
|
test.expect(expectedIntroContent).not.toContain( |
||||||
|
"Understanding the nature of knowledge", |
||||||
|
); |
||||||
|
test.expect(expectedIntroContent).toContain("image:https://i.nostr.build"); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Subsection content should be cleanly separated", () => { |
||||||
|
// "=== Why Investigate the Nature of Knowledge?" subsection
|
||||||
|
const expectedSubsectionContent = |
||||||
|
`Understanding the nature of knowledge itself is fundamental, distinct from simply studying how we learn or communicate. Knowledge exests first as representations within individuals, separate from how we interact with it...`; |
||||||
|
|
||||||
|
// Should NOT contain its own header
|
||||||
|
test.expect(expectedSubsectionContent).not.toContain("=== Why Investigate"); |
||||||
|
test.expect(expectedSubsectionContent).toContain("Understanding the nature"); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Deep headers (====) should have proper newlines", () => { |
||||||
|
// From "=== The Four Perspectives" section with ==== subsections
|
||||||
|
const expectedFormatted = ` |
||||||
|
==== 1. The Building Blocks (Material Cause) |
||||||
|
|
||||||
|
Just as living organisms are made up of cells, knowledge systems are built from fundamental units of understanding. |
||||||
|
|
||||||
|
==== 2. The Pattern of Organization (Formal Cause) |
||||||
|
|
||||||
|
If you've ever seen how mushrooms connect through underground networks...`;
|
||||||
|
|
||||||
|
test.expect(expectedFormatted).toContain( |
||||||
|
"\n==== 1. The Building Blocks (Material Cause)\n", |
||||||
|
); |
||||||
|
test.expect(expectedFormatted).toContain( |
||||||
|
"\n==== 2. The Pattern of Organization (Formal Cause)\n", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// PHASE 3: Publishing Logic Tests (30040/30041 Structure)
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
test.test("Understanding Knowledge should create proper 30040 index event", () => { |
||||||
|
// Expected 30040 index event structure
|
||||||
|
const expectedIndexEvent = { |
||||||
|
kind: 30040, |
||||||
|
content: "", // Index events have empty content
|
||||||
|
tags: [ |
||||||
|
["d", "understanding-knowledge"], |
||||||
|
["title", "Understanding Knowledge"], |
||||||
|
["image", "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg"], |
||||||
|
["published", "2025-04-21"], |
||||||
|
["language", "en, ISO-639-1"], |
||||||
|
["t", "knowledge"], |
||||||
|
["t", "philosophy"], |
||||||
|
["t", "education"], |
||||||
|
["type", "text"], |
||||||
|
// a-tags referencing sections
|
||||||
|
["a", "30041:pubkey:understanding-knowledge-preface"], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:pubkey:understanding-knowledge-introduction-knowledge-as-a-living-ecosystem", |
||||||
|
], |
||||||
|
[ |
||||||
|
"a", |
||||||
|
"30041:pubkey:understanding-knowledge-i-material-cause-the-substance-of-knowledge", |
||||||
|
], |
||||||
|
// ... more a-tags for each section
|
||||||
|
], |
||||||
|
}; |
||||||
|
|
||||||
|
test.expect(expectedIndexEvent.kind).toBe(30040); |
||||||
|
test.expect(expectedIndexEvent.content).toBe(""); |
||||||
|
test.expect(expectedIndexEvent.tags.filter(([k]) => k === "t")).toHaveLength( |
||||||
|
3, |
||||||
|
); |
||||||
|
test.expect( |
||||||
|
expectedIndexEvent.tags.find(([k, v]) => k === "type" && v === "text"), |
||||||
|
).toBeTruthy(); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Understanding Knowledge sections should create proper 30041 events", () => { |
||||||
|
// Expected 30041 events for main sections
|
||||||
|
const expectedSectionEvents = [ |
||||||
|
{ |
||||||
|
kind: 30041, |
||||||
|
content: |
||||||
|
`[NOTE]\nThis essay was written to outline and elaborate on the purpose of the Nostr client Alexandria...`, |
||||||
|
tags: [ |
||||||
|
["d", "understanding-knowledge-preface"], |
||||||
|
["title", "Preface"], |
||||||
|
], |
||||||
|
}, |
||||||
|
{ |
||||||
|
kind: 30041, |
||||||
|
content: `image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]`, |
||||||
|
tags: [ |
||||||
|
[ |
||||||
|
"d", |
||||||
|
"understanding-knowledge-introduction-knowledge-as-a-living-ecosystem", |
||||||
|
], |
||||||
|
["title", "Introduction: Knowledge as a Living Ecosystem"], |
||||||
|
], |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
expectedSectionEvents.forEach((event) => { |
||||||
|
test.expect(event.kind).toBe(30041); |
||||||
|
test.expect(event.content).toBeTruthy(); |
||||||
|
test.expect(event.tags.find(([k]) => k === "d")).toBeTruthy(); |
||||||
|
test.expect(event.tags.find(([k]) => k === "title")).toBeTruthy(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Level-based parsing should create correct 30040/30041 structure", () => { |
||||||
|
// Based on docreference.md examples
|
||||||
|
|
||||||
|
// Level 2 parsing: only == sections become events, containing all subsections
|
||||||
|
const expectedLevel2Events = { |
||||||
|
mainIndex: { |
||||||
|
kind: 30040, |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "programming-fundamentals-guide"], |
||||||
|
["title", "Programming Fundamentals Guide"], |
||||||
|
["a", "30041:author_pubkey:data-structures"], |
||||||
|
["a", "30041:author_pubkey:algorithms"], |
||||||
|
], |
||||||
|
}, |
||||||
|
dataStructuresSection: { |
||||||
|
kind: 30041, |
||||||
|
content: |
||||||
|
"Understanding fundamental data structures...\n\n=== Arrays and Lists\n\n...==== Dynamic Arrays\n\n...==== Linked Lists\n\n...", |
||||||
|
tags: [ |
||||||
|
["d", "data-structures"], |
||||||
|
["title", "Data Structures"], |
||||||
|
["difficulty", "intermediate"], |
||||||
|
], |
||||||
|
}, |
||||||
|
}; |
||||||
|
|
||||||
|
// Level 3 parsing: == sections become 30040 indices, === sections become 30041 events
|
||||||
|
const expectedLevel3Events = { |
||||||
|
mainIndex: { |
||||||
|
kind: 30040, |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "programming-fundamentals-guide"], |
||||||
|
["title", "Programming Fundamentals Guide"], |
||||||
|
["a", "30040:author_pubkey:data-structures"], // Now references sub-index
|
||||||
|
["a", "30040:author_pubkey:algorithms"], |
||||||
|
], |
||||||
|
}, |
||||||
|
dataStructuresIndex: { |
||||||
|
kind: 30040, |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "data-structures"], |
||||||
|
["title", "Data Structures"], |
||||||
|
["a", "30041:author_pubkey:data-structures-content"], |
||||||
|
["a", "30041:author_pubkey:arrays-and-lists"], |
||||||
|
["a", "30041:author_pubkey:trees-and-graphs"], |
||||||
|
], |
||||||
|
}, |
||||||
|
arraysAndListsSection: { |
||||||
|
kind: 30041, |
||||||
|
content: |
||||||
|
"Arrays are contiguous...\n\n==== Dynamic Arrays\n\n...==== Linked Lists\n\n...", |
||||||
|
tags: [ |
||||||
|
["d", "arrays-and-lists"], |
||||||
|
["title", "Arrays and Lists"], |
||||||
|
], |
||||||
|
}, |
||||||
|
}; |
||||||
|
|
||||||
|
test.expect(expectedLevel2Events.mainIndex.kind).toBe(30040); |
||||||
|
test.expect(expectedLevel2Events.dataStructuresSection.kind).toBe(30041); |
||||||
|
test.expect(expectedLevel2Events.dataStructuresSection.content).toContain( |
||||||
|
"=== Arrays and Lists", |
||||||
|
); |
||||||
|
|
||||||
|
test.expect(expectedLevel3Events.dataStructuresIndex.kind).toBe(30040); |
||||||
|
test.expect(expectedLevel3Events.arraysAndListsSection.content).toContain( |
||||||
|
"==== Dynamic Arrays", |
||||||
|
); |
||||||
|
}); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// PHASE 4: Smart Publishing System Tests
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
test.test("Content type detection should work for both test files", () => { |
||||||
|
const testCases = [ |
||||||
|
{ |
||||||
|
name: "Understanding Knowledge (article)", |
||||||
|
content: understandingKnowledge, |
||||||
|
expected: "article", |
||||||
|
}, |
||||||
|
{ |
||||||
|
name: "Desire (article)", |
||||||
|
content: desire, |
||||||
|
expected: "article", |
||||||
|
}, |
||||||
|
{ |
||||||
|
name: "Scattered notes format", |
||||||
|
content: "== Note 1\nContent\n\n== Note 2\nMore content", |
||||||
|
expected: "scattered-notes", |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
testCases.forEach(({ name, content, expected }) => { |
||||||
|
const hasDocTitle = content.trim().startsWith("=") && |
||||||
|
!content.trim().startsWith("=="); |
||||||
|
const hasSections = content.includes("=="); |
||||||
|
|
||||||
|
let detected; |
||||||
|
if (hasDocTitle) { |
||||||
|
detected = "article"; |
||||||
|
} else if (hasSections) { |
||||||
|
detected = "scattered-notes"; |
||||||
|
} else { |
||||||
|
detected = "none"; |
||||||
|
} |
||||||
|
|
||||||
|
console.log(` ${name}: detected ${detected}`); |
||||||
|
test.expect(detected).toBe(expected); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Parse level should affect event structure correctly", () => { |
||||||
|
// Understanding Knowledge has structure: = > == (6 sections) > === (many subsections) > ====
|
||||||
|
// Based on actual content analysis
|
||||||
|
const levelEventCounts = [ |
||||||
|
{ level: 1, description: "Only document index", events: 1 }, |
||||||
|
{ |
||||||
|
level: 2, |
||||||
|
description: "Document index + level 2 sections (==)", |
||||||
|
events: 7, |
||||||
|
}, // 1 index + 6 sections
|
||||||
|
{ |
||||||
|
level: 3, |
||||||
|
description: |
||||||
|
"Document index + section indices + level 3 subsections (===)", |
||||||
|
events: 20, |
||||||
|
}, // More complex
|
||||||
|
{ |
||||||
|
level: 4, |
||||||
|
description: "Full hierarchy including level 4 (====)", |
||||||
|
events: 35, |
||||||
|
}, |
||||||
|
]; |
||||||
|
|
||||||
|
levelEventCounts.forEach(({ level, description, events }) => { |
||||||
|
console.log(` Level ${level}: ${description} (${events} events)`); |
||||||
|
test.expect(events).toBeTruthy(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// PHASE 5: Integration Tests (End-to-End Workflow)
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
test.test("Full Understanding Knowledge publishing workflow (Level 2)", async () => { |
||||||
|
// Mock the complete ITERATIVE workflow
|
||||||
|
const mockWorkflow = { |
||||||
|
parseLevel2: (content: string) => ({ |
||||||
|
metadata: { |
||||||
|
title: "Understanding Knowledge", |
||||||
|
image: "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg", |
||||||
|
published: "2025-04-21", |
||||||
|
tags: ["knowledge", "philosophy", "education"], |
||||||
|
type: "text", |
||||||
|
}, |
||||||
|
title: "Understanding Knowledge", |
||||||
|
content: "Introduction content before any sections", |
||||||
|
sections: [ |
||||||
|
{ |
||||||
|
title: "Preface", |
||||||
|
content: "[NOTE]\nThis essay was written to outline...", |
||||||
|
metadata: { title: "Preface" }, |
||||||
|
}, |
||||||
|
{ |
||||||
|
title: "Introduction: Knowledge as a Living Ecosystem", |
||||||
|
// Contains ALL subsections (===, ====) in content
|
||||||
|
content: `image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]
|
||||||
|
|
||||||
|
=== Why Investigate the Nature of Knowledge? |
||||||
|
|
||||||
|
Understanding the nature of knowledge itself is fundamental... |
||||||
|
|
||||||
|
=== Challenging the Static Perception of Knowledge |
||||||
|
|
||||||
|
Traditionally, knowledge has been perceived as a static repository... |
||||||
|
|
||||||
|
==== The Four Perspectives |
||||||
|
|
||||||
|
===== 1. The Building Blocks (Material Cause) |
||||||
|
|
||||||
|
Just as living organisms are made up of cells...`,
|
||||||
|
metadata: { title: "Introduction: Knowledge as a Living Ecosystem" }, |
||||||
|
}, |
||||||
|
// ... 4 more sections (Material Cause, Formal Cause, Efficient Cause, Final Cause)
|
||||||
|
], |
||||||
|
}), |
||||||
|
|
||||||
|
buildLevel2Events: (parsed: any) => ({ |
||||||
|
indexEvent: { |
||||||
|
kind: 30040, |
||||||
|
content: "", |
||||||
|
tags: [ |
||||||
|
["d", "understanding-knowledge"], |
||||||
|
["title", parsed.title], |
||||||
|
["image", parsed.metadata.image], |
||||||
|
["t", "knowledge"], |
||||||
|
["t", "philosophy"], |
||||||
|
["t", "education"], |
||||||
|
["type", "text"], |
||||||
|
["a", "30041:pubkey:preface"], |
||||||
|
["a", "30041:pubkey:introduction-knowledge-as-a-living-ecosystem"], |
||||||
|
], |
||||||
|
}, |
||||||
|
sectionEvents: parsed.sections.map((s: any) => ({ |
||||||
|
kind: 30041, |
||||||
|
content: s.content, |
||||||
|
tags: [ |
||||||
|
["d", s.title.toLowerCase().replace(/[^a-z0-9]+/g, "-")], |
||||||
|
["title", s.title], |
||||||
|
], |
||||||
|
})), |
||||||
|
}), |
||||||
|
|
||||||
|
publish: (events: any) => ({ |
||||||
|
success: true, |
||||||
|
published: events.sectionEvents.length + 1, |
||||||
|
eventIds: [ |
||||||
|
"main-index", |
||||||
|
...events.sectionEvents.map((_: any, i: number) => `section-${i}`), |
||||||
|
], |
||||||
|
}), |
||||||
|
}; |
||||||
|
|
||||||
|
// Test the full Level 2 workflow
|
||||||
|
const parsed = mockWorkflow.parseLevel2(understandingKnowledge); |
||||||
|
const events = mockWorkflow.buildLevel2Events(parsed); |
||||||
|
const result = mockWorkflow.publish(events); |
||||||
|
|
||||||
|
test.expect(parsed.metadata.title).toBe("Understanding Knowledge"); |
||||||
|
test.expect(parsed.sections).toHaveLength(2); |
||||||
|
test.expect(events.indexEvent.kind).toBe(30040); |
||||||
|
test.expect(events.sectionEvents).toHaveLength(2); |
||||||
|
test.expect(events.sectionEvents[1].content).toContain("=== Why Investigate"); // Contains subsections
|
||||||
|
test.expect(events.sectionEvents[1].content).toContain( |
||||||
|
"===== 1. The Building Blocks", |
||||||
|
); // Contains deeper levels
|
||||||
|
test.expect(result.success).toBeTruthy(); |
||||||
|
test.expect(result.published).toBe(3); // 1 index + 2 sections
|
||||||
|
}); |
||||||
|
|
||||||
|
test.test("Error handling for malformed content", () => { |
||||||
|
const invalidCases = [ |
||||||
|
{ |
||||||
|
content: "== Section\n=== Subsection\n==== Missing content", |
||||||
|
error: "Empty content sections", |
||||||
|
}, |
||||||
|
{ |
||||||
|
content: "= Title\n\n== Section\n==== Skipped level", |
||||||
|
error: "Invalid header nesting", |
||||||
|
}, |
||||||
|
{ content: "", error: "Empty document" }, |
||||||
|
]; |
||||||
|
|
||||||
|
invalidCases.forEach(({ content, error }) => { |
||||||
|
// Mock error detection
|
||||||
|
const hasEmptySections = content.includes("Missing content"); |
||||||
|
const hasSkippedLevels = content.includes("====") && |
||||||
|
!content.includes("==="); |
||||||
|
const isEmpty = content.trim() === ""; |
||||||
|
|
||||||
|
const shouldError = hasEmptySections || hasSkippedLevels || isEmpty; |
||||||
|
test.expect(shouldError).toBeTruthy(); |
||||||
|
}); |
||||||
|
}); |
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// Test Execution
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
console.log("🎯 ZettelPublisher Test-Driven Development (ITERATIVE)\n"); |
||||||
|
console.log("📋 Test Data Analysis:"); |
||||||
|
console.log( |
||||||
|
`- Understanding Knowledge: ${ |
||||||
|
understandingKnowledge.split("\n").length |
||||||
|
} lines`,
|
||||||
|
); |
||||||
|
console.log(`- Desire: ${desire.split("\n").length} lines`); |
||||||
|
console.log( |
||||||
|
"- Both files use = document title with metadata directly underneath", |
||||||
|
); |
||||||
|
console.log("- Sections use == with deep nesting (===, ====, =====)"); |
||||||
|
console.log("- Custom attributes like :type: podcastArticle need preservation"); |
||||||
|
console.log( |
||||||
|
"- CRITICAL: Structure is ITERATIVE not recursive (per docreference.md)\n", |
||||||
|
); |
||||||
|
|
||||||
|
test.run().then((success) => { |
||||||
|
if (success) { |
||||||
|
console.log("\n🎉 All tests defined! Ready for ITERATIVE implementation."); |
||||||
|
console.log("\n📋 Implementation Plan:"); |
||||||
|
console.log("1. ✅ Update ParsedAsciiDoc interface for ITERATIVE parsing"); |
||||||
|
console.log( |
||||||
|
"2. ✅ Fix content processing (header separation, custom attributes)", |
||||||
|
); |
||||||
|
console.log( |
||||||
|
"3. ✅ Implement level-based publishing logic (30040/30041 structure)", |
||||||
|
); |
||||||
|
console.log("4. ✅ Add parse-level controlled event generation"); |
||||||
|
console.log("5. ✅ Create context-aware UI with level selector"); |
||||||
|
console.log("\n🔄 Each level can be developed and tested independently!"); |
||||||
|
} else { |
||||||
|
console.log( |
||||||
|
"\n❌ Tests ready - implement ITERATIVE features to make them pass!", |
||||||
|
); |
||||||
|
} |
||||||
|
}).catch(console.error); |
||||||