Run `deno fmt`

buttercat1791 committed 3 months ago to master (commit e87aa77deb)
Changed files:

1. CLAUDE.md (23)
2. TECHNIQUE-create-test-highlights.md (146)
3. TEST_SUMMARY.md (26)
4. WIKI_TAG_SPEC.md (20)
5. check-publication-structure.js (45)
6. create-test-comments.js (117)
7. create-test-highlights.js (100)
8. doc/compose_tree.md (54)
9. nips/09.md (54)
10. src/lib/services/deletion.ts (14)
11. src/lib/services/publisher.ts (29)
12. src/lib/utils/asciidoc_ast_parser.ts (106)
13. src/lib/utils/asciidoc_parser.ts (13)
14. src/lib/utils/asciidoc_publication_parser.ts (48)
15. src/lib/utils/event_input_utils.ts (4)
16. src/lib/utils/fetch_publication_highlights.ts (2)
17. src/lib/utils/highlightPositioning.ts (61)
18. src/lib/utils/highlightUtils.ts (21)
19. src/lib/utils/mockCommentData.ts (18)
20. src/lib/utils/mockHighlightData.ts (113)
21. src/lib/utils/publication_tree_factory.ts (13)
22. src/lib/utils/publication_tree_processor.ts (46)
23. src/lib/utils/wiki_links.ts (41)
24. tests/unit/highlightLayer.test.ts (445)
25. tests/unit/publication_tree_processor.test.ts (45)
26. tests/zettel-publisher-tdd.test.ts (500)

CLAUDE.md (23)

@@ -1,14 +1,19 @@
# Alexandria Codebase - Local Instructions

This document provides project-specific instructions for working with the
Alexandria codebase, based on existing Cursor rules and project conventions.

## Developer Context

You are working with a senior developer who has 20 years of web development
experience, 8 years with Svelte, and 4 years developing production Nostr
applications. Assume high technical proficiency.

## Project Overview

Alexandria is a Nostr-based web application for reading, commenting on, and
publishing long-form content (books, blogs, etc.) stored on Nostr relays. Built
with:

- **Svelte 5** and **SvelteKit 2** (latest versions)
- **TypeScript** (exclusively, no plain JavaScript)

@@ -22,19 +27,22 @@ The project follows a Model-View-Controller (MVC) pattern:
- **Model**: Nostr relays (via WebSocket APIs) and browser storage
- **View**: Reactive UI with SvelteKit pages and Svelte components
- **Controller**: TypeScript modules with utilities, services, and data
  preparation

## Critical Development Guidelines

### Prime Directive

**NEVER assume developer intent.** If unsure, ALWAYS ask for clarification
before proceeding.

### AI Anchor Comments System

Before any work, search for `AI-` anchor comments in relevant directories:

- `AI-NOTE:`, `AI-TODO:`, `AI-QUESTION:` - Context sharing between AI and
  developers
- `AI-<MM/DD/YYYY>:` - Developer-recorded context (read but don't write)
- **Always update relevant anchor comments when modifying code**
- Add new anchors for complex, critical, or confusing code

@@ -101,7 +109,8 @@ Before any work, search for `AI-` anchor comments in relevant directories:
### Core Classes to Use

- `WebSocketPool` (`src/lib/data_structures/websocket_pool.ts`) - For WebSocket
  management
- `PublicationTree` - For hierarchical publication structure
- `ZettelParser` - For AsciiDoc parsing

TECHNIQUE-create-test-highlights.md (146)

@@ -2,7 +2,10 @@
## Overview

This technique allows you to create test highlight events (kind 9802) for
testing the highlight rendering system in Alexandria. Highlights are text
selections from publication sections that users want to mark as important or
noteworthy, optionally with annotations.

## When to Use This

@@ -19,75 +22,77 @@ This technique allows you to create test highlight events (kind 9802) for testin
npm install nostr-tools ws
```

2. **Valid publication structure**: You need the actual publication address
   (naddr) and its internal structure (section addresses, pubkeys)

## Step 1: Decode the Publication Address

If you have an `naddr` (Nostr address), decode it to find the publication
structure:

**Script**: `check-publication-structure.js`

```javascript
import { nip19 } from "nostr-tools";
import WebSocket from "ws";

const naddr = "naddr1qvzqqqr4t..."; // Your publication naddr

console.log("Decoding naddr...\n");
const decoded = nip19.decode(naddr);
console.log("Decoded:", JSON.stringify(decoded, null, 2));

const { data } = decoded;
const rootAddress = `${data.kind}:${data.pubkey}:${data.identifier}`;
console.log("\nRoot Address:", rootAddress);

// Fetch the index event to see what sections it references
const relay = "wss://relay.nostr.band";

async function fetchPublication() {
  return new Promise((resolve, reject) => {
    const ws = new WebSocket(relay);
    const events = [];

    ws.on("open", () => {
      console.log(`\nConnected to ${relay}`);
      console.log("Fetching index event...\n");
      const filter = {
        kinds: [data.kind],
        authors: [data.pubkey],
        "#d": [data.identifier],
      };
      const subscriptionId = `sub-${Date.now()}`;
      ws.send(JSON.stringify(["REQ", subscriptionId, filter]));
    });

    ws.on("message", (message) => {
      const [type, subId, event] = JSON.parse(message.toString());
      if (type === "EVENT") {
        events.push(event);
        console.log("Found index event:", event.id);
        console.log("\nTags:");
        event.tags.forEach((tag) => {
          if (tag[0] === "a") {
            console.log(`  Section address: ${tag[1]}`);
          }
          if (tag[0] === "d") {
            console.log(`  D-tag: ${tag[1]}`);
          }
          if (tag[0] === "title") {
            console.log(`  Title: ${tag[1]}`);
          }
        });
      } else if (type === "EOSE") {
        ws.close();
        resolve(events);
      }
    });

    ws.on("error", reject);

    setTimeout(() => {
      ws.close();
@@ -97,13 +102,14 @@ async function fetchPublication() {
}

fetchPublication()
  .then(() => console.log("\nDone!"))
  .catch(console.error);
```

**Run it**: `node check-publication-structure.js`

**Expected output**: Section addresses like
`30041:dc4cd086...:the-art-of-thinking-without-permission`

## Step 2: Understand Kind 9802 Event Structure

@@ -129,7 +135,7 @@ A highlight event (kind 9802) has this structure:
### Critical Differences from Comments (kind 1111):

| Aspect              | Comments (1111)              | Highlights (9802)                            |
| ------------------- | ---------------------------- | -------------------------------------------- |
| **Content field**   | User's comment text          | The highlighted text itself                  |
| **User annotation** | N/A (content is the comment) | Optional `["comment", ...]` tag              |
| **Context**         | Not used                     | `["context", ...]` provides surrounding text |
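As a quick sanity check of the distinction in this table, the two event shapes might look as follows. This is a minimal sketch; the addresses, relay hints, and strings are placeholders, not values from this repository:

```javascript
// Sketch of the two event shapes compared in the table above.
// All string values are illustrative placeholders.
const highlight = {
  kind: 9802,
  content: "The highlighted text itself", // content IS the highlighted text
  tags: [
    ["a", "30041:authorpubkey:section-d-tag"], // target section
    ["context", "...text surrounding the highlight..."],
    ["comment", "Optional user annotation"], // present only when annotated
  ],
};

const comment = {
  kind: 1111,
  content: "The user's comment text", // content IS the comment itself
  tags: [
    ["a", "30041:authorpubkey:section-d-tag"],
  ],
};
```

The key asymmetry: a highlight quotes the publication in `content` and carries any user remark in a tag, while a comment puts the user's words directly in `content`.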
@@ -141,18 +147,20 @@ A highlight event (kind 9802) has this structure:
**Script**: `create-test-highlights.js`

```javascript
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools";
import WebSocket from "ws";

// Test user keys (generate fresh ones)
const testUserKey = generateSecretKey();
const testUserPubkey = getPublicKey(testUserKey);
console.log("Test User pubkey:", testUserPubkey);

// The publication details (from Step 1)
const publicationPubkey =
  "dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06";
const rootAddress =
  `30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`;

// Section addresses (from Step 1 output)
const sections = [
@@ -163,24 +171,28 @@ const sections = [
// Relays to publish to (matching HighlightLayer's relay list)
const relays = [
  "wss://relay.damus.io",
  "wss://relay.nostr.band",
  "wss://nostr.wine",
];

// Test highlights to create
const testHighlights = [
  {
    highlightedText:
      "Knowledge that tries to stay put inevitably becomes ossified",
    context:
      "This is the fundamental paradox... Knowledge that tries to stay put inevitably becomes ossified, a monument to itself... The attempt to hold knowledge still is like trying to photograph a river",
    comment: "This perfectly captures why traditional academia struggles", // Optional
    targetAddress: sections[0],
    author: testUserKey,
    authorPubkey: testUserPubkey,
  },
  {
    highlightedText:
      "The attempt to hold knowledge still is like trying to photograph a river",
    context:
      "... a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.",
    comment: null, // No annotation, just highlight
    targetAddress: sections[0],
    author: testUserKey,
@@ -193,14 +205,14 @@ async function publishEvent(event, relayUrl) {
    const ws = new WebSocket(relayUrl);
    let published = false;

    ws.on("open", () => {
      console.log(`Connected to ${relayUrl}`);
      ws.send(JSON.stringify(["EVENT", event]));
    });

    ws.on("message", (data) => {
      const message = JSON.parse(data.toString());
      if (message[0] === "OK" && message[1] === event.id) {
        if (message[2]) {
          console.log(`✓ Published ${event.id.substring(0, 8)}`);
          published = true;
@@ -214,22 +226,22 @@ async function publishEvent(event, relayUrl) {
      }
    });

    ws.on("error", reject);
    ws.on("close", () => {
      if (!published) reject(new Error("Connection closed"));
    });

    setTimeout(() => {
      if (!published) {
        ws.close();
        reject(new Error("Timeout"));
      }
    }, 10000);
  });
}

async function createAndPublishHighlights() {
  console.log("\n=== Creating Test Highlights ===\n");

  for (const highlight of testHighlights) {
    try {
@@ -238,9 +250,9 @@ async function createAndPublishHighlights() {
        kind: 9802,
        created_at: Math.floor(Date.now() / 1000),
        tags: [
          ["a", highlight.targetAddress, relays[0]],
          ["context", highlight.context],
          ["p", publicationPubkey, relays[0], "author"],
        ],
        content: highlight.highlightedText, // The highlighted text
        pubkey: highlight.authorPubkey,
@@ -248,13 +260,15 @@ async function createAndPublishHighlights() {
      // Add optional comment/annotation
      if (highlight.comment) {
        unsignedEvent.tags.push(["comment", highlight.comment]);
      }

      // Sign the event
      const signedEvent = finalizeEvent(unsignedEvent, highlight.author);

      console.log(
        `\nHighlight: "${highlight.highlightedText.substring(0, 60)}..."`,
      );
      console.log(`Target: ${highlight.targetAddress}`);
      console.log(`Event ID: ${signedEvent.id}`);
@@ -262,14 +276,13 @@ async function createAndPublishHighlights() {
      await publishEvent(signedEvent, relays[0]);

      // Delay to avoid rate limiting
      await new Promise((resolve) => setTimeout(resolve, 1500));
    } catch (error) {
      console.error(`Failed: ${error.message}`);
    }
  }

  console.log("\n=== Done! ===");
  console.log('\nRefresh the page and toggle "Show Highlights" to view them.');
}

@@ -313,24 +326,27 @@ createAndPublishHighlights().catch(console.error);
**Cause**: Publishing too many events too quickly

**Solution**: Increase delay between publishes

```javascript
await new Promise((resolve) => setTimeout(resolve, 2000)); // 2 seconds
```

### Issue: Highlights don't appear after publishing

**Possible causes**:

1. Wrong section address - verify with `check-publication-structure.js`
2. HighlightLayer not fetching from the relay you published to
3. Browser cache - hard refresh (Ctrl+Shift+R)

**Debug steps**:

```javascript
// In browser console, check what highlights are being fetched:
console.log("All highlights:", allHighlights);

// Check if your event ID is present
allHighlights.find((h) => h.id === "your-event-id");
```

### Issue: Context not matching actual publication text

@@ -338,6 +354,7 @@ allHighlights.find(h => h.id === 'your-event-id')
**Cause**: The publication content changed, or you're using sample text

**Solution**: Copy actual text from the publication:

1. Open the publication in browser
2. Select the text you want to highlight
3. Copy a larger surrounding context (2-3 sentences)

@@ -368,6 +385,9 @@ To use this technique on a different publication:
## Further Reading

- NIP-84 (Highlights): https://github.com/nostr-protocol/nips/blob/master/84.md
- `src/lib/components/publications/HighlightLayer.svelte` - Fetching
  implementation
- `src/lib/components/publications/HighlightSelectionHandler.svelte` - Event
  creation
- NIP-19 (Address encoding):
  https://github.com/nostr-protocol/nips/blob/master/19.md

TEST_SUMMARY.md (26)

@@ -1,15 +1,19 @@
# Comment Button TDD Tests - Summary

## Overview

Comprehensive test suite for CommentButton component and NIP-22 comment
functionality.

**Test File:**
`/home/user/gc-alexandria-comments/tests/unit/commentButton.test.ts`

**Status:** ✅ All 69 tests passing

## Test Coverage

### 1. Address Parsing (5 tests)

- ✅ Parses valid event address correctly (kind:pubkey:dtag)
- ✅ Handles dTag with colons correctly
- ✅ Validates invalid address format (too few parts)
@@ -17,6 +21,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Parses different publication kinds (30040, 30041, 30818, 30023)
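The parsing behavior these tests describe (the first two colon-separated parts are the kind and pubkey; the remainder is the dTag, which may itself contain colons) can be sketched as follows. `parseAddress` is a hypothetical helper name, not necessarily the component's actual function:

```javascript
// Hypothetical sketch of kind:pubkey:dtag parsing that tolerates colons
// inside the dTag and rejects addresses with too few parts.
function parseAddress(address) {
  const parts = address.split(":");
  if (parts.length < 3) return null; // too few parts
  const [kind, pubkey, ...rest] = parts;
  // Re-join the tail so a dTag like "my:d:tag" survives intact
  return { kind: Number(kind), pubkey, dTag: rest.join(":") };
}

console.log(parseAddress("30041:abc123:my-section"));
console.log(parseAddress("30041:abc123:a:b:c").dTag); // prints a:b:c
```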
### 2. NIP-22 Event Creation (8 tests)

- ✅ Creates kind 1111 comment event
- ✅ Includes correct uppercase tags (A, K, P) for root scope
- ✅ Includes correct lowercase tags (a, k, p) for parent scope
@@ -27,12 +32,14 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Handles empty relay list gracefully

### 3. Event Signing and Publishing (4 tests)

- ✅ Signs event with user's signer
- ✅ Publishes to outbox relays
- ✅ Handles publishing errors gracefully
- ✅ Throws error when publishing fails

### 4. User Authentication (5 tests)

- ✅ Requires user to be signed in
- ✅ Shows error when user is not signed in
- ✅ Allows commenting when user is signed in
@@ -40,6 +47,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Handles missing user profile gracefully

### 5. User Interactions (7 tests)

- ✅ Prevents submission of empty comment
- ✅ Allows submission of non-empty comment
- ✅ Handles whitespace-only comments as empty
@@ -49,6 +57,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Does not error when onCommentPosted is not provided

### 6. UI State Management (10 tests)

- ✅ Button is hidden by default
- ✅ Button appears on section hover
- ✅ Button remains visible when comment UI is shown
@@ -61,6 +70,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Enables submit button when comment is valid

### 7. Edge Cases (8 tests)

- ✅ Handles invalid address format gracefully
- ✅ Handles network errors during event fetch
- ✅ Handles missing relay information
@@ -71,17 +81,20 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Handles publish failure when no relays accept event

### 8. Cancel Functionality (4 tests)

- ✅ Clears comment content when canceling
- ✅ Closes comment UI when canceling
- ✅ Clears error state when canceling
- ✅ Clears success state when canceling

### 9. Event Fetching (3 tests)

- ✅ Fetches target event to get event ID
- ✅ Continues without event ID when fetch fails
- ✅ Handles null event from fetch

### 10. CSS Classes and Styling (6 tests)

- ✅ Applies visible class when section is hovered
- ✅ Removes visible class when not hovered and UI closed
- ✅ Button has correct aria-label
@@ -90,6 +103,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Submit button shows normal state when not submitting

### 11. NIP-22 Compliance (5 tests)

- ✅ Uses kind 1111 for comment events
- ✅ Includes all required NIP-22 tags for addressable events
- ✅ A tag includes relay hint and author pubkey
@@ -97,6 +111,7 @@ Comprehensive test suite for CommentButton component and NIP-22 comment function
- ✅ Lowercase tags for parent scope match root tags

### 12. Integration Scenarios (4 tests)

- ✅ Complete comment flow for signed-in user
- ✅ Prevents comment flow for signed-out user
- ✅ Handles comment with event ID lookup

@@ -128,13 +143,18 @@ The tests verify the correct NIP-22 tag structure for addressable events:
```
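The tag layout the compliance tests above describe (uppercase A/K/P for the root scope, with matching lowercase a/k/p for the parent scope on a top-level comment, and a relay hint plus author pubkey on the A tag) might look like this. The address, relay URL, and pubkey are placeholders:

```javascript
// Sketch of a NIP-22 kind 1111 comment on an addressable event.
// All values below are illustrative placeholders.
const address = "30041:authorpubkey:section-d-tag";
const relayHint = "wss://relay.example.com";

const commentEvent = {
  kind: 1111,
  content: "A reader's comment",
  tags: [
    // Root scope (uppercase): the addressable event being commented on
    ["A", address, relayHint, "authorpubkey"],
    ["K", "30041"],
    ["P", "authorpubkey"],
    // Parent scope (lowercase): mirrors the root for a top-level comment
    ["a", address, relayHint, "authorpubkey"],
    ["k", "30041"],
    ["p", "authorpubkey"],
  ],
};
```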
## Files Changed

- `tests/unit/commentButton.test.ts` - 911 lines (new file)
- `package-lock.json` - Updated dependencies

## Current Status

All tests are passing and changes are staged for commit. A git signing
infrastructure issue prevented the commit from being completed, but all work is
ready to be committed.

## To Commit and Push

```bash
cd /home/user/gc-alexandria-comments
git commit -m "Add TDD tests for comment functionality"

WIKI_TAG_SPEC.md (20)

@ -25,6 +25,7 @@ This syntax automatically generates a 'w' tag during conversion:
``` ```
**Semantics**: **Semantics**:
- The d-tag **IS** the subject/identity of the event - The d-tag **IS** the subject/identity of the event
- Represents an **explicit definition** or primary topic - Represents an **explicit definition** or primary topic
- Forward declaration: "This event defines/is about knowledge-graphs" - Forward declaration: "This event defines/is about knowledge-graphs"
@ -42,10 +43,12 @@ This syntax automatically generates a 'w' tag during conversion:
``` ```
**Semantics**: **Semantics**:
- The w-tag **REFERENCES** a concept within the content - The w-tag **REFERENCES** a concept within the content
- Represents an **implicit mention** or contextual usage - Represents an **implicit mention** or contextual usage
- Backward reference: "This event mentions/relates to knowledge-graphs" - Backward reference: "This event mentions/relates to knowledge-graphs"
- Search query: "Show me ALL events that discuss 'knowledge-graphs' in their text" - Search query: "Show me ALL events that discuss 'knowledge-graphs' in their
text"
- Expectation: Multiple content events that reference the term - Expectation: Multiple content events that reference the term
**Use Case**: Discovering all content that relates to or discusses a concept **Use Case**: Discovering all content that relates to or discusses a concept
@ -53,6 +56,7 @@ This syntax automatically generates a 'w' tag during conversion:
## Structural Opacity Comparison ## Structural Opacity Comparison
### D-Tags: Transparent Structure
```
Event with d-tag "knowledge-graphs"
└── Title: "Knowledge Graphs"
@@ -61,6 +65,7 @@ Event with d-tag "knowledge-graphs"
```
### W-Tags: Opaque Structure
```
Event mentioning "knowledge-graphs"
├── Title: "Semantic Web Technologies"
@@ -69,6 +74,7 @@ Event mentioning "knowledge-graphs"
```
**Opacity**: You retrieve content events that regard the topic without knowing:
- Whether they define it
- How central it is to the event
- What relationship context it appears in
@@ -76,28 +82,34 @@ Event mentioning "knowledge-graphs"
## Query Pattern Examples
### Finding Definitions (D-Tag Query)
```bash
# Find THE definition event for "knowledge-graphs"
nak req -k 30041 --tag d=knowledge-graphs
```
**Result**: The specific event with d="knowledge-graphs" (if it exists)
### Finding References (W-Tag Query)
```bash
# Find ALL events that mention "knowledge-graphs"
nak req -k 30041 --tag w=knowledge-graphs
```
**Result**: Any content event containing `[[Knowledge Graphs]]` wikilinks
## Analogy
**D-Tag**: Like a book's ISBN - uniquely identifies and locates a specific work
**W-Tag**: Like a book's index entries - shows where a term appears across many
works
## Implementation Notes
From your codebase (`nkbip_converter.py:327-329`):
```python
# Extract wiki links and create 'w' tags
wiki_links = extract_wiki_links(content)
@@ -105,4 +117,6 @@ for wiki_term in wiki_links:
tags.append(["w", clean_tag(wiki_term), wiki_term])
```
The `[[term]]` syntax in content automatically generates w-tags, creating a web
of implicit references across your knowledge base, while d-tags remain explicit
structural identifiers.
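The extraction step referenced above can be sketched as follows. This is a minimal illustration only: `extract_wiki_links` and `clean_tag` are assumed to behave roughly as their names suggest, and the bodies below are hypothetical, not the actual `nkbip_converter.py` implementations.

```python
import re

def extract_wiki_links(content: str) -> list[str]:
    # Capture the display text inside [[...]] wikilinks, e.g. [[Knowledge Graphs]];
    # an optional |alias part is ignored. (Illustrative pattern, an assumption.)
    return re.findall(r"\[\[([^\]|]+)(?:\|[^\]]*)?\]\]", content)

def clean_tag(term: str) -> str:
    # Normalize a display term into a d-tag-style identifier. (Assumed behavior.)
    return re.sub(r"[^a-z0-9]+", "-", term.lower()).strip("-")

content = "See [[Knowledge Graphs]] and [[Semantic Web]] for background."
tags = []
for wiki_term in extract_wiki_links(content):
    tags.append(["w", clean_tag(wiki_term), wiki_term])

print(tags)
# → [['w', 'knowledge-graphs', 'Knowledge Graphs'], ['w', 'semantic-web', 'Semantic Web']]
```

Note the w-tag keeps both forms: the normalized identifier for querying and the original display text for rendering.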
45 check-publication-structure.js

@@ -1,63 +1,64 @@
import { nip19 } from "nostr-tools";
import WebSocket from "ws";
const naddr =
  "naddr1qvzqqqr4tqpzphzv6zrv6l89kxpj4h60m5fpz2ycsrfv0c54hjcwdpxqrt8wwlqxqyd8wumn8ghj7argv4nx7un9wd6zumn0wd68yvfwvdhk6qgmwaehxw309a6xsetrd96xzer9dshxummnw3erztnrdakszyrhwden5te0dehhxarj9ekxzmnyqyg8wumn8ghj7mn0wd68ytnhd9hx2qghwaehxw309ahx7um5wgh8xmmkvf5hgtngdaehgqg3waehxw309ahx7um5wgerztnrdakszxthwden5te0wpex7enfd3jhxtnwdaehgu339e3k7mgpz4mhxue69uhkzem8wghxummnw3ezumrpdejqzxrhwden5te0wfjkccte9ehx7umhdpjhyefwvdhk6qg5waehxw309aex2mrp0yhxgctdw4eju6t0qyt8wumn8ghj7un9d3shjtnwdaehgu3wvfskueqpr9mhxue69uhkvun9v4kxz7fwwdhhvcnfwshxsmmnwsqrcctwv9exx6rfwd6xjcedddhx7amvv4jxwefdw35x2ttpwf6z6mmx946xs6twdd5kueedwa5hg6r0w46z6ur9wfkkjumnd9hkuwdu5na";
console.log("Decoding naddr...\n");
const decoded = nip19.decode(naddr);
console.log("Decoded:", JSON.stringify(decoded, null, 2));
const { data } = decoded;
const rootAddress = `${data.kind}:${data.pubkey}:${data.identifier}`;
console.log("\nRoot Address:", rootAddress);
// Fetch the index event to see what sections it references
const relay = "wss://relay.nostr.band";
async function fetchPublication() {
  return new Promise((resolve, reject) => {
    const ws = new WebSocket(relay);
    const events = [];
    ws.on("open", () => {
      console.log(`\nConnected to ${relay}`);
      console.log("Fetching index event...\n");
      const filter = {
        kinds: [data.kind],
        authors: [data.pubkey],
        "#d": [data.identifier],
      };
      const subscriptionId = `sub-${Date.now()}`;
      ws.send(JSON.stringify(["REQ", subscriptionId, filter]));
    });
    ws.on("message", (message) => {
      const [type, subId, event] = JSON.parse(message.toString());
      if (type === "EVENT") {
        events.push(event);
        console.log("Found index event:", event.id);
        console.log("\nTags:");
        event.tags.forEach((tag) => {
          if (tag[0] === "a") {
            console.log(` Section address: ${tag[1]}`);
          }
          if (tag[0] === "d") {
            console.log(` D-tag: ${tag[1]}`);
          }
          if (tag[0] === "title") {
            console.log(` Title: ${tag[1]}`);
          }
        });
      } else if (type === "EOSE") {
        ws.close();
        resolve(events);
      }
    });
    ws.on("error", reject);
    setTimeout(() => {
      ws.close();
@@ -67,5 +68,5 @@ async function fetchPublication() {
}
fetchPublication()
  .then(() => console.log("\nDone!"))
  .catch(console.error);
117 create-test-comments.js

@@ -1,5 +1,5 @@
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools";
import WebSocket from "ws";
// Test user keys (generate fresh ones)
const testUserKey = generateSecretKey();
@@ -8,12 +8,14 @@ const testUserPubkey = getPublicKey(testUserKey);
const testUser2Key = generateSecretKey();
const testUser2Pubkey = getPublicKey(testUser2Key);
console.log("Test User 1 pubkey:", testUserPubkey);
console.log("Test User 2 pubkey:", testUser2Pubkey);
// The publication details from the article (REAL VALUES)
const publicationPubkey =
  "dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06";
const rootAddress =
  `30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`;
// Section addresses (from the actual publication structure)
const sections = [
@@ -25,15 +27,16 @@ const sections = [
// Relays to publish to (matching CommentLayer's relay list)
const relays = [
  "wss://relay.damus.io",
  "wss://relay.nostr.band",
  "wss://nostr.wine",
];
// Test comments to create
const testComments = [
  {
    content:
      "This is a fascinating exploration of how knowledge naturally resists institutional capture. The analogy to flowing water is particularly apt.",
    targetAddress: sections[0],
    targetKind: 30041,
    author: testUserKey,
@@ -41,7 +44,8 @@ const testComments = [
    isReply: false,
  },
  {
    content:
      "I love this concept! It reminds me of how open source projects naturally organize without top-down control.",
    targetAddress: sections[0],
    targetKind: 30041,
    author: testUser2Key,
@@ -49,7 +53,8 @@ const testComments = [
    isReply: false,
  },
  {
    content:
      "The section on institutional capture really resonates with my experience in academia.",
    targetAddress: sections[1],
    targetKind: 30041,
    author: testUserKey,
@@ -57,7 +62,8 @@ const testComments = [
    isReply: false,
  },
  {
    content:
      "Excellent point about underground networks of understanding. This is exactly how most practical knowledge develops.",
    targetAddress: sections[2],
    targetKind: 30041,
    author: testUser2Key,
@@ -65,7 +71,8 @@ const testComments = [
    isReply: false,
  },
  {
    content:
      "This is a brilliant piece of work! Really captures the tension between institutional knowledge and living understanding.",
    targetAddress: rootAddress,
    targetKind: 30040,
    author: testUserKey,
@@ -79,16 +86,18 @@ async function publishEvent(event, relayUrl) {
    const ws = new WebSocket(relayUrl);
    let published = false;
    ws.on("open", () => {
      console.log(`Connected to ${relayUrl}`);
      ws.send(JSON.stringify(["EVENT", event]));
    });
    ws.on("message", (data) => {
      const message = JSON.parse(data.toString());
      if (message[0] === "OK" && message[1] === event.id) {
        if (message[2]) {
          console.log(
            `✓ Published event ${event.id.substring(0, 8)} to ${relayUrl}`,
          );
          published = true;
          ws.close();
          resolve();
@@ -100,14 +109,14 @@ async function publishEvent(event, relayUrl) {
      }
    });
    ws.on("error", (error) => {
      console.error(`WebSocket error: ${error.message}`);
      reject(error);
    });
    ws.on("close", () => {
      if (!published) {
        reject(new Error("Connection closed before OK received"));
      }
    });
@@ -115,14 +124,14 @@ async function publishEvent(event, relayUrl) {
    setTimeout(() => {
      if (!published) {
        ws.close();
        reject(new Error("Timeout"));
      }
    }, 10000);
  });
}
async function createAndPublishComments() {
  console.log("\n=== Creating Test Comments ===\n");
  const publishedEvents = [];
@@ -134,14 +143,14 @@ async function createAndPublishComments() {
      created_at: Math.floor(Date.now() / 1000),
      tags: [
        // Root scope - uppercase tags
        ["A", comment.targetAddress, relays[0], publicationPubkey],
        ["K", comment.targetKind.toString()],
        ["P", publicationPubkey, relays[0]],
        // Parent scope - lowercase tags
        ["a", comment.targetAddress, relays[0]],
        ["k", comment.targetKind.toString()],
        ["p", publicationPubkey, relays[0]],
      ],
      content: comment.content,
      pubkey: comment.authorPubkey,
@@ -149,14 +158,18 @@ async function createAndPublishComments() {
    // If this is a reply, add reply tags
    if (comment.isReply && comment.replyToId) {
      unsignedEvent.tags.push(["e", comment.replyToId, relay, "reply"]);
      unsignedEvent.tags.push(["p", comment.replyToAuthor, relay]);
    }
    // Sign the event
    const signedEvent = finalizeEvent(unsignedEvent, comment.author);
    console.log(
      `\nCreating comment on ${
        comment.targetKind === 30040 ? "collection" : "section"
      }:`,
    );
    console.log(` Content: "${comment.content.substring(0, 60)}..."`);
    console.log(` Target: ${comment.targetAddress}`);
    console.log(` Event ID: ${signedEvent.id}`);
@@ -169,19 +182,19 @@ async function createAndPublishComments() {
      comment.eventId = signedEvent.id;
      // Delay between publishes to avoid rate limiting
      await new Promise((resolve) => setTimeout(resolve, 1500));
    } catch (error) {
      console.error(`Failed to publish comment: ${error.message}`);
    }
  }
  // Now create some threaded replies
  console.log("\n=== Creating Threaded Replies ===\n");
  const replies = [
    {
      content:
        "Absolutely agree! The metaphor extends even further when you consider how ideas naturally branch and merge.",
      targetAddress: sections[0],
      targetKind: 30041,
      author: testUser2Key,
@@ -191,7 +204,8 @@ async function createAndPublishComments() {
      replyToAuthor: testComments[0].authorPubkey,
    },
    {
      content:
        "Great connection! The parallel between open source governance and knowledge commons is really illuminating.",
      targetAddress: sections[0],
      targetKind: 30041,
      author: testUserKey,
@@ -209,17 +223,17 @@ async function createAndPublishComments() {
      created_at: Math.floor(Date.now() / 1000),
      tags: [
        // Root scope
        ["A", reply.targetAddress, relays[0], publicationPubkey],
        ["K", reply.targetKind.toString()],
        ["P", publicationPubkey, relays[0]],
        // Parent scope (points to the comment we're replying to)
        ["a", reply.targetAddress, relays[0]],
        ["k", reply.targetKind.toString()],
        ["p", reply.replyToAuthor, relays[0]],
        // Reply markers
        ["e", reply.replyToId, relays[0], "reply"],
      ],
      content: reply.content,
      pubkey: reply.authorPubkey,
@@ -233,16 +247,19 @@ async function createAndPublishComments() {
      console.log(` Event ID: ${signedEvent.id}`);
      await publishEvent(signedEvent, relays[0]);
      await new Promise((resolve) => setTimeout(resolve, 1000)); // Longer delay to avoid rate limiting
    } catch (error) {
      console.error(`Failed to publish reply: ${error.message}`);
    }
  }
  console.log("\n=== Done! ===");
  console.log(
    `\nPublished ${
      publishedEvents.length + replies.length
    } total comments/replies`,
  );
  console.log("\nRefresh the page to see the comments in the Comment Panel.");
}
// Run it
100 create-test-highlights.js

@@ -1,5 +1,5 @@
import { finalizeEvent, generateSecretKey, getPublicKey } from "nostr-tools";
import WebSocket from "ws";
// Test user keys (generate fresh ones)
const testUserKey = generateSecretKey();
@@ -8,12 +8,14 @@ const testUserPubkey = getPublicKey(testUserKey);
const testUser2Key = generateSecretKey();
const testUser2Pubkey = getPublicKey(testUser2Key);
console.log("Test User 1 pubkey:", testUserPubkey);
console.log("Test User 2 pubkey:", testUser2Pubkey);
// The publication details from the article (REAL VALUES)
const publicationPubkey =
  "dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06";
const rootAddress =
  `30040:${publicationPubkey}:anarchistic-knowledge-the-art-of-thinking-without-permission`;
// Section addresses (from the actual publication structure)
const sections = [
@@ -25,9 +27,9 @@ const sections = [
// Relays to publish to (matching HighlightLayer's relay list)
const relays = [
  "wss://relay.damus.io",
  "wss://relay.nostr.band",
  "wss://nostr.wine",
];
// Test highlights to create
@@ -35,40 +37,53 @@ const relays = [
// and optionally a user comment/annotation in the ["comment", ...] tag
const testHighlights = [
  {
    highlightedText:
      "Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice.",
    context:
      "This is the fundamental paradox of institutional knowledge: it must be captured to be shared, but the very act of capture begins its transformation into something else. Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.",
    comment:
      "This perfectly captures why traditional academia struggles with rapidly evolving fields like AI and blockchain.",
    targetAddress: sections[0],
    author: testUserKey,
    authorPubkey: testUserPubkey,
  },
  {
    highlightedText:
      "The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.",
    context:
      "Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice. The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow.",
    comment: null, // Highlight without annotation
    targetAddress: sections[0],
    author: testUser2Key,
    authorPubkey: testUser2Pubkey,
  },
  {
    highlightedText:
      "Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas.",
    context:
      "The natural state of knowledge is not purity but promiscuity. Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas. It crosses boundaries not despite them but because of them. The most vibrant intellectual communities have always been those at crossroads and borderlands.",
    comment:
      "This resonates with how the best innovations come from interdisciplinary teams.",
    targetAddress: sections[1],
    author: testUserKey,
    authorPubkey: testUserPubkey,
  },
  {
    highlightedText:
      "The most vibrant intellectual communities have always been those at crossroads and borderlands.",
    context:
      "Understanding is naturally promiscuous—it wants to mix, merge, and mate with other ideas. It crosses boundaries not despite them but because of them. The most vibrant intellectual communities have always been those at crossroads and borderlands.",
    comment:
      "Historical examples: Renaissance Florence, Vienna Circle, Bell Labs",
    targetAddress: sections[1],
    author: testUser2Key,
    authorPubkey: testUser2Pubkey,
  },
  {
    highlightedText:
      "institutions that try to monopolize understanding inevitably find themselves gatekeeping corpses",
    context:
      "But institutions that try to monopolize understanding inevitably find themselves gatekeeping corpses—the living knowledge has already escaped and is flourishing in unexpected places. By the time the gatekeepers notice, the game has moved.",
    comment: null,
    targetAddress: sections[2],
    author: testUserKey,
@@ -81,16 +96,18 @@ async function publishEvent(event, relayUrl) {
    const ws = new WebSocket(relayUrl);
    let published = false;
    ws.on("open", () => {
      console.log(`Connected to ${relayUrl}`);
      ws.send(JSON.stringify(["EVENT", event]));
    });
    ws.on("message", (data) => {
      const message = JSON.parse(data.toString());
      if (message[0] === "OK" && message[1] === event.id) {
        if (message[2]) {
          console.log(
            `✓ Published event ${event.id.substring(0, 8)} to ${relayUrl}`,
          );
          published = true;
          ws.close();
          resolve();
@@ -102,14 +119,14 @@ async function publishEvent(event, relayUrl) {
      }
    });
    ws.on("error", (error) => {
      console.error(`WebSocket error: ${error.message}`);
      reject(error);
    });
    ws.on("close", () => {
      if (!published) {
        reject(new Error("Connection closed before OK received"));
      }
    });
@@ -117,14 +134,14 @@ async function publishEvent(event, relayUrl) {
    setTimeout(() => {
      if (!published) {
        ws.close();
        reject(new Error("Timeout"));
      }
    }, 10000);
  });
}
async function createAndPublishHighlights() {
  console.log("\n=== Creating Test Highlights ===\n");
  const publishedEvents = [];
@@ -138,13 +155,13 @@ async function createAndPublishHighlights() {
      created_at: Math.floor(Date.now() / 1000),
      tags: [
        // Target section
        ["a", highlight.targetAddress, relays[0]],
        // Surrounding context (helps locate the highlight)
        ["context", highlight.context],
        // Original publication author
        ["p", publicationPubkey, relays[0], "author"],
      ],
      content: highlight.highlightedText, // The actual highlighted text
      pubkey: highlight.authorPubkey,
@@ -152,14 +169,16 @@ async function createAndPublishHighlights() {
    // Add optional comment/annotation if present
    if (highlight.comment) {
      unsignedEvent.tags.push(["comment", highlight.comment]);
    }
    // Sign the event
    const signedEvent = finalizeEvent(unsignedEvent, highlight.author);
    console.log(`\nCreating highlight on section:`);
    console.log(
      ` Highlighted: "${highlight.highlightedText.substring(0, 60)}..."`,
    );
    if (highlight.comment) {
      console.log(` Comment: "${highlight.comment.substring(0, 60)}..."`);
    }
@@ -171,16 +190,15 @@ async function createAndPublishHighlights() {
      publishedEvents.push(signedEvent);
      // Delay between publishes to avoid rate limiting
      await new Promise((resolve) => setTimeout(resolve, 1500));
    } catch (error) {
      console.error(`Failed to publish highlight: ${error.message}`);
    }
  }
  console.log("\n=== Done! ===");
  console.log(`\nPublished ${publishedEvents.length} total highlights`);
  console.log("\nRefresh the page to see the highlights.");
  console.log('Toggle "Show Highlights" to view them inline.');
}
54 doc/compose_tree.md

@@ -2,33 +2,42 @@
## Overview
This document outlines the complete restart plan for implementing NKBIP-01
compliant hierarchical AsciiDoc parsing using proper Asciidoctor tree processor
extensions.
## Current State Analysis
### Problems Identified
1. **Dual Architecture Conflict**: Two competing parsing implementations exist:
   - `publication_tree_factory.ts` - AST-first approach (currently used)
   - `publication_tree_extension.ts` - Extension approach (incomplete)
2. **Missing Proper Extension Registration**: Current code doesn't follow the
   official Asciidoctor extension pattern you provided
3. **Incomplete NKBIP-01 Compliance**: Testing with `deep_hierarchy_test.adoc`
   may not produce the exact structures shown in `docreference.md`
## NKBIP-01 Specification Summary
From `test_data/AsciidocFiles/docreference.md`:
### Event Types
- **30040**: Index events (collections/hierarchical containers)
- **30041**: Content events (actual article sections)
### Parse Level Behaviors
- **Level 2**: Only `==` sections → 30041 events (subsections included in
  content)
- **Level 3**: `==` → 30040 indices, `===` → 30041 content events
- **Level 4+**: Full hierarchy with each level becoming separate events
### Key Rules
1. If a section has subsections at target level → becomes 30040 index
2. If no subsections at target level → becomes 30041 content event
3. Content inclusion: 30041 events include all content below parse level
@@ -44,13 +53,13 @@ Following the pattern you provided:
// Extension registration pattern // Extension registration pattern
module.exports = function (registry) { module.exports = function (registry) {
registry.treeProcessor(function () { registry.treeProcessor(function () {
var self = this var self = this;
self.process(function (doc) { self.process(function (doc) {
// Process document and build PublicationTree // Process document and build PublicationTree
return doc return doc;
}) });
}) });
} };
``` ```
### Implementation Components ### Implementation Components
@@ -80,11 +89,12 @@ export function registerPublicationTreeProcessor(
registry: Registry, registry: Registry,
ndk: NDK, ndk: NDK,
parseLevel: number, parseLevel: number,
options?: ProcessorOptions options?: ProcessorOptions,
): { getResult: () => ProcessorResult | null } ): { getResult: () => ProcessorResult | null };
``` ```
**Key Features:** **Key Features:**
- Follows Asciidoctor extension pattern exactly - Follows Asciidoctor extension pattern exactly
- Builds events during AST traversal (not after) - Builds events during AST traversal (not after)
- Preserves original AsciiDoc content in events - Preserves original AsciiDoc content in events
@@ -97,11 +107,12 @@ export async function parseAsciiDocWithTree(
export async function parseAsciiDocWithTree( export async function parseAsciiDocWithTree(
content: string, content: string,
ndk: NDK, ndk: NDK,
parseLevel: number = 2 parseLevel: number = 2,
): Promise<PublicationTreeResult> ): Promise<PublicationTreeResult>;
``` ```
**Responsibilities:** **Responsibilities:**
- Create Asciidoctor instance - Create Asciidoctor instance
- Register tree processor extension - Register tree processor extension
- Execute parsing with extension - Execute parsing with extension
@@ -111,6 +122,7 @@ export async function parseAsciiDocWithTree(
### Phase 3: ZettelEditor Integration ### Phase 3: ZettelEditor Integration
**Changes to `ZettelEditor.svelte`:** **Changes to `ZettelEditor.svelte`:**
- Replace `createPublicationTreeFromContent()` calls - Replace `createPublicationTreeFromContent()` calls
- Use new `parseAsciiDocWithTree()` function - Use new `parseAsciiDocWithTree()` function
- Maintain existing preview/publishing interface - Maintain existing preview/publishing interface
@@ -119,6 +131,7 @@ export async function parseAsciiDocWithTree(
### Phase 4: Validation Testing ### Phase 4: Validation Testing
**Test Suite:** **Test Suite:**
1. Parse `deep_hierarchy_test.adoc` at levels 2-7 1. Parse `deep_hierarchy_test.adoc` at levels 2-7
2. Verify event structures match `docreference.md` examples 2. Verify event structures match `docreference.md` examples
3. Validate content preservation and tag inheritance 3. Validate content preservation and tag inheritance
@@ -127,23 +140,29 @@ export async function parseAsciiDocWithTree(
## File Organization ## File Organization
### Files to Create ### Files to Create
1. `src/lib/utils/publication_tree_processor.ts` - Core tree processor extension 1. `src/lib/utils/publication_tree_processor.ts` - Core tree processor extension
2. `src/lib/utils/asciidoc_publication_parser.ts` - Unified parser interface 2. `src/lib/utils/asciidoc_publication_parser.ts` - Unified parser interface
3. `tests/unit/publication_tree_processor.test.ts` - Comprehensive test suite 3. `tests/unit/publication_tree_processor.test.ts` - Comprehensive test suite
### Files to Modify ### Files to Modify
1. `src/lib/components/ZettelEditor.svelte` - Update parsing calls 1. `src/lib/components/ZettelEditor.svelte` - Update parsing calls
2. `src/routes/new/compose/+page.svelte` - Verify integration works 2. `src/routes/new/compose/+page.svelte` - Verify integration works
### Files to Remove (After Validation) ### Files to Remove (After Validation)
1. `src/lib/utils/publication_tree_factory.ts` - Replace with processor 1. `src/lib/utils/publication_tree_factory.ts` - Replace with processor
2. `src/lib/utils/publication_tree_extension.ts` - Merge concepts into processor 2. `src/lib/utils/publication_tree_extension.ts` - Merge concepts into processor
## Success Criteria ## Success Criteria
1. **NKBIP-01 Compliance**: All parse levels produce structures exactly matching `docreference.md` 1. **NKBIP-01 Compliance**: All parse levels produce structures exactly matching
2. **Content Preservation**: Original AsciiDoc content preserved in events (not converted to HTML) `docreference.md`
3. **Proper Extension Pattern**: Uses official Asciidoctor tree processor registration 2. **Content Preservation**: Original AsciiDoc content preserved in events (not
converted to HTML)
3. **Proper Extension Pattern**: Uses official Asciidoctor tree processor
registration
4. **Zero Regression**: Current ZettelEditor functionality unchanged 4. **Zero Regression**: Current ZettelEditor functionality unchanged
5. **Performance**: No degradation in parsing or preview speed 5. **Performance**: No degradation in parsing or preview speed
6. **Test Coverage**: Comprehensive validation with `deep_hierarchy_test.adoc` 6. **Test Coverage**: Comprehensive validation with `deep_hierarchy_test.adoc`
@@ -169,5 +188,6 @@ export async function parseAsciiDocWithTree(
- NKBIP-01 Specification: `test_data/AsciidocFiles/docreference.md` - NKBIP-01 Specification: `test_data/AsciidocFiles/docreference.md`
- Test Document: `test_data/AsciidocFiles/deep_hierarchy_test.adoc` - Test Document: `test_data/AsciidocFiles/deep_hierarchy_test.adoc`
- Asciidoctor Extensions: [Official Documentation](https://docs.asciidoctor.org/asciidoctor.js/latest/extend/extensions/) - Asciidoctor Extensions:
[Official Documentation](https://docs.asciidoctor.org/asciidoctor.js/latest/extend/extensions/)
- Current Implementation: `src/lib/components/ZettelEditor.svelte:64` - Current Implementation: `src/lib/components/ZettelEditor.svelte:64`
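The parse-level rules this plan describes (sections with subsections at the target level become 30040 indices, the rest become 30041 content events) can be condensed into a small decision function. A minimal sketch, assuming a simplified `Section` shape and the hypothetical name `decideEventKind` — neither exists in the codebase:

```typescript
// Sketch of the NKBIP-01 parse-level rules: a section becomes a 30040 index
// when it sits above the parse level and still has subsections at or below it;
// otherwise it becomes a 30041 content event with subsections folded in.

interface Section {
  level: number; // app-level: `==` is 2, `===` is 3, ...
  children: Section[];
}

function decideEventKind(section: Section, parseLevel: number): 30040 | 30041 {
  const hasChildrenWithinLevel = section.children.some(
    (child) => child.level <= parseLevel,
  );
  return section.level < parseLevel && hasChildrenWithinLevel ? 30040 : 30041;
}

// A `==` chapter with one `===` subsection:
const chapter: Section = {
  level: 2,
  children: [{ level: 3, children: [] }],
};
console.log(decideEventKind(chapter, 3)); // 30040 (splits into an index)
console.log(decideEventKind(chapter, 2)); // 30041 (subsection stays inline)
```

This mirrors the `shouldBeIndex` condition visible in the `asciidoc_parser.ts` hunks of this same commit.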

54
nips/09.md

@@ -1,14 +1,16 @@
NIP-09 # NIP-09
======
Event Deletion Request ## Event Deletion Request
----------------------
`draft` `optional` `draft` `optional`
A special event with kind `5`, meaning "deletion request" is defined as having a list of one or more `e` or `a` tags, each referencing an event the author is requesting to be deleted. Deletion requests SHOULD include a `k` tag for the kind of each event being requested for deletion. A special event with kind `5`, meaning "deletion request" is defined as having a
list of one or more `e` or `a` tags, each referencing an event the author is
requesting to be deleted. Deletion requests SHOULD include a `k` tag for the
kind of each event being requested for deletion.
The event's `content` field MAY contain a text note describing the reason for the deletion request. The event's `content` field MAY contain a text note describing the reason for
the deletion request.
For example: For example:
@@ -28,26 +30,48 @@ For example:
} }
``` ```
Relays SHOULD delete or stop publishing any referenced events that have an identical `pubkey` as the deletion request. Clients SHOULD hide or otherwise indicate a deletion request status for referenced events. Relays SHOULD delete or stop publishing any referenced events that have an
identical `pubkey` as the deletion request. Clients SHOULD hide or otherwise
indicate a deletion request status for referenced events.
Relays SHOULD continue to publish/share the deletion request events indefinitely, as clients may already have the event that's intended to be deleted. Additionally, clients SHOULD broadcast deletion request events to other relays which don't have it. Relays SHOULD continue to publish/share the deletion request events
indefinitely, as clients may already have the event that's intended to be
deleted. Additionally, clients SHOULD broadcast deletion request events to other
relays which don't have it.
When an `a` tag is used, relays SHOULD delete all versions of the replaceable event up to the `created_at` timestamp of the deletion request event. When an `a` tag is used, relays SHOULD delete all versions of the replaceable
event up to the `created_at` timestamp of the deletion request event.
## Client Usage ## Client Usage
Clients MAY choose to fully hide any events that are referenced by valid deletion request events. This includes text notes, direct messages, or other yet-to-be defined event kinds. Alternatively, they MAY show the event along with an icon or other indication that the author has "disowned" the event. The `content` field MAY also be used to replace the deleted events' own content, although a user interface should clearly indicate that this is a deletion request reason, not the original content. Clients MAY choose to fully hide any events that are referenced by valid
deletion request events. This includes text notes, direct messages, or other
yet-to-be defined event kinds. Alternatively, they MAY show the event along with
an icon or other indication that the author has "disowned" the event. The
`content` field MAY also be used to replace the deleted events' own content,
although a user interface should clearly indicate that this is a deletion
request reason, not the original content.
A client MUST validate that each event `pubkey` referenced in the `e` tag of the deletion request is identical to the deletion request `pubkey`, before hiding or deleting any event. Relays can not, in general, perform this validation and should not be treated as authoritative. A client MUST validate that each event `pubkey` referenced in the `e` tag of the
deletion request is identical to the deletion request `pubkey`, before hiding or
deleting any event. Relays can not, in general, perform this validation and
should not be treated as authoritative.
Clients display the deletion request event itself in any way they choose, e.g., not at all, or with a prominent notice. Clients display the deletion request event itself in any way they choose, e.g.,
not at all, or with a prominent notice.
Clients MAY choose to inform the user that their request for deletion does not guarantee deletion because it is impossible to delete events from all relays and clients. Clients MAY choose to inform the user that their request for deletion does not
guarantee deletion because it is impossible to delete events from all relays and
clients.
## Relay Usage ## Relay Usage
Relays MAY validate that a deletion request event only references events that have the same `pubkey` as the deletion request itself, however this is not required since relays may not have knowledge of all referenced events. Relays MAY validate that a deletion request event only references events that
have the same `pubkey` as the deletion request itself, however this is not
required since relays may not have knowledge of all referenced events.
## Deletion Request of a Deletion Request ## Deletion Request of a Deletion Request
Publishing a deletion request event against a deletion request has no effect. Clients and relays are not obliged to support "unrequest deletion" functionality. Publishing a deletion request event against a deletion request has no effect.
Clients and relays are not obliged to support "unrequest deletion"
functionality.
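The MUST-level client rule in this NIP (verify that each referenced event's `pubkey` matches the deletion request's `pubkey` before hiding anything) reduces to a pure check. A sketch with hypothetical names `SimpleEvent` and `mayHideEvent`, the event shape trimmed to just the fields the rule needs:

```typescript
// NIP-09 client-side validation sketch: relays cannot be trusted to enforce
// this, so the client itself must verify authorship before honoring a kind-5
// deletion request.

interface SimpleEvent {
  id: string;
  pubkey: string;
  kind: number;
  tags: string[][];
}

function mayHideEvent(deletionRequest: SimpleEvent, target: SimpleEvent): boolean {
  if (deletionRequest.kind !== 5) return false;
  // The request must actually reference the target via an `e` tag...
  const referencesTarget = deletionRequest.tags.some(
    (tag) => tag[0] === "e" && tag[1] === target.id,
  );
  // ...and only the author may request deletion of their own event.
  return referencesTarget && deletionRequest.pubkey === target.pubkey;
}
```

A request whose `pubkey` differs from the referenced event's `pubkey` is simply ignored, no matter what a relay claims.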

14
src/lib/services/deletion.ts

@@ -25,7 +25,8 @@ export async function deleteEvent(
options: DeletionOptions, options: DeletionOptions,
ndk: NDK, ndk: NDK,
): Promise<DeletionResult> { ): Promise<DeletionResult> {
const { eventId, eventAddress, eventKind, reason = "", onSuccess, onError } = options; const { eventId, eventAddress, eventKind, reason = "", onSuccess, onError } =
options;
if (!eventId && !eventAddress) { if (!eventId && !eventAddress) {
const error = "Either eventId or eventAddress must be provided"; const error = "Either eventId or eventAddress must be provided";
@@ -52,17 +53,17 @@ export async function deleteEvent(
if (eventId) { if (eventId) {
// Add 'e' tag for event ID // Add 'e' tag for event ID
tags.push(['e', eventId]); tags.push(["e", eventId]);
} }
if (eventAddress) { if (eventAddress) {
// Add 'a' tag for replaceable event address // Add 'a' tag for replaceable event address
tags.push(['a', eventAddress]); tags.push(["a", eventAddress]);
} }
if (eventKind) { if (eventKind) {
// Add 'k' tag for event kind (recommended by NIP-09) // Add 'k' tag for event kind (recommended by NIP-09)
tags.push(['k', eventKind.toString()]); tags.push(["k", eventKind.toString()]);
} }
deletionEvent.tags = tags; deletionEvent.tags = tags;
@@ -93,8 +94,9 @@ export async function deleteEvent(
throw new Error("Failed to publish deletion request to any relays"); throw new Error("Failed to publish deletion request to any relays");
} }
} catch (error) { } catch (error) {
const errorMessage = const errorMessage = error instanceof Error
error instanceof Error ? error.message : "Unknown error"; ? error.message
: "Unknown error";
console.error(`[deletion.ts] Error deleting event: ${errorMessage}`); console.error(`[deletion.ts] Error deleting event: ${errorMessage}`);
onError?.(errorMessage); onError?.(errorMessage);
return { success: false, error: errorMessage }; return { success: false, error: errorMessage };
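The `e`/`a`/`k` tag pushes this hunk reformats can be isolated into a pure helper. A sketch with the hypothetical name `buildDeletionTags` (the real `deleteEvent` builds the tags inline):

```typescript
// Kind-5 tag construction per NIP-09: an `e` tag for a plain event id, an `a`
// tag for a replaceable-event address, and a recommended `k` tag carrying the
// kind of the event being requested for deletion.

function buildDeletionTags(opts: {
  eventId?: string;
  eventAddress?: string;
  eventKind?: number;
}): string[][] {
  const tags: string[][] = [];
  if (opts.eventId) tags.push(["e", opts.eventId]);
  if (opts.eventAddress) tags.push(["a", opts.eventAddress]);
  if (opts.eventKind) tags.push(["k", opts.eventKind.toString()]);
  return tags;
}

console.log(buildDeletionTags({ eventId: "abc", eventKind: 30041 }));
// [["e","abc"],["k","30041"]]
```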

29
src/lib/services/publisher.ts

@@ -102,8 +102,9 @@ export async function publishZettel(
throw new Error("Failed to publish to any relays"); throw new Error("Failed to publish to any relays");
} }
} catch (error) { } catch (error) {
const errorMessage = const errorMessage = error instanceof Error
error instanceof Error ? error.message : "Unknown error"; ? error.message
: "Unknown error";
onError?.(errorMessage); onError?.(errorMessage);
return { success: false, error: errorMessage }; return { success: false, error: errorMessage };
} }
@@ -165,8 +166,7 @@ export async function publishSingleEvent(
if (!hasAuthorTag && ndk.activeUser) { if (!hasAuthorTag && ndk.activeUser) {
// Add display name as author // Add display name as author
const displayName = const displayName = ndk.activeUser.profile?.displayName ||
ndk.activeUser.profile?.displayName ||
ndk.activeUser.profile?.name || ndk.activeUser.profile?.name ||
"Anonymous"; "Anonymous";
finalTags.push(["author", displayName]); finalTags.push(["author", displayName]);
@@ -196,8 +196,9 @@ export async function publishSingleEvent(
throw new Error("Failed to publish to any relays"); throw new Error("Failed to publish to any relays");
} }
} catch (error) { } catch (error) {
const errorMessage = const errorMessage = error instanceof Error
error instanceof Error ? error.message : "Unknown error"; ? error.message
: "Unknown error";
console.error(`Error publishing event: ${errorMessage}`); console.error(`Error publishing event: ${errorMessage}`);
onError?.(errorMessage); onError?.(errorMessage);
return { success: false, error: errorMessage }; return { success: false, error: errorMessage };
@@ -272,15 +273,17 @@ export async function publishMultipleZettels(
}); });
} }
} catch (err) { } catch (err) {
const errorMessage = const errorMessage = err instanceof Error
err instanceof Error ? err.message : "Unknown error"; ? err.message
: "Unknown error";
results.push({ success: false, error: errorMessage }); results.push({ success: false, error: errorMessage });
} }
} }
return results; return results;
} catch (error) { } catch (error) {
const errorMessage = const errorMessage = error instanceof Error
error instanceof Error ? error.message : "Unknown error"; ? error.message
: "Unknown error";
onError?.(errorMessage); onError?.(errorMessage);
return [{ success: false, error: errorMessage }]; return [{ success: false, error: errorMessage }];
} }
@@ -314,8 +317,7 @@ export function processPublishResults(
} else { } else {
const contentIndex = hasIndexEvent ? index - 1 : index; const contentIndex = hasIndexEvent ? index - 1 : index;
const contentEvent = events.contentEvents[contentIndex]; const contentEvent = events.contentEvents[contentIndex];
title = title = contentEvent?.title ||
contentEvent?.title ||
contentEvent?.tags?.find((t: any) => t[0] === "title")?.[1] || contentEvent?.tags?.find((t: any) => t[0] === "title")?.[1] ||
`Note ${contentIndex + 1}`; `Note ${contentIndex + 1}`;
} }
@@ -338,8 +340,7 @@ export function processPublishResults(
} else { } else {
const contentIndex = hasIndexEvent ? index - 1 : index; const contentIndex = hasIndexEvent ? index - 1 : index;
const contentEvent = events.contentEvents[contentIndex]; const contentEvent = events.contentEvents[contentIndex];
title = title = contentEvent?.title ||
contentEvent?.title ||
contentEvent?.tags?.find((t: any) => t[0] === "title")?.[1] || contentEvent?.tags?.find((t: any) => t[0] === "title")?.[1] ||
`Note ${contentIndex + 1}`; `Note ${contentIndex + 1}`;
} }
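The same `error instanceof Error` narrowing appears in every `catch` block deno fmt reflows in this file. A tiny helper (hypothetical, not part of the codebase) would remove the repetition:

```typescript
// Narrow an unknown catch-binding to a message string. In TypeScript 4.4+
// catch variables are `unknown`, so the instanceof check is required before
// reading `.message`.

function toErrorMessage(error: unknown): string {
  return error instanceof Error ? error.message : "Unknown error";
}

try {
  throw new Error("relay publish failed");
} catch (error) {
  console.log(toErrorMessage(error)); // → relay publish failed
}
console.log(toErrorMessage(42)); // → Unknown error
```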

106
src/lib/utils/asciidoc_ast_parser.ts

@@ -30,22 +30,28 @@ export interface ASTParsedDocument {
/** /**
* Parse AsciiDoc content using Asciidoctor's AST instead of manual regex * Parse AsciiDoc content using Asciidoctor's AST instead of manual regex
*/ */
export function parseAsciiDocAST(content: string, parseLevel: number = 2): ASTParsedDocument { export function parseAsciiDocAST(
content: string,
parseLevel: number = 2,
): ASTParsedDocument {
const asciidoctor = Processor(); const asciidoctor = Processor();
const document = asciidoctor.load(content, { standalone: false }) as Document; const document = asciidoctor.load(content, { standalone: false }) as Document;
return { return {
title: document.getTitle() || '', title: document.getTitle() || "",
content: document.getContent() || '', content: document.getContent() || "",
attributes: document.getAttributes(), attributes: document.getAttributes(),
sections: extractSectionsFromAST(document, parseLevel) sections: extractSectionsFromAST(document, parseLevel),
}; };
} }
/** /**
* Extract sections from Asciidoctor AST based on parse level * Extract sections from Asciidoctor AST based on parse level
*/ */
function extractSectionsFromAST(document: Document, parseLevel: number): ASTSection[] { function extractSectionsFromAST(
document: Document,
parseLevel: number,
): ASTSection[] {
const directSections = document.getSections(); const directSections = document.getSections();
// Collect all sections at all levels up to parseLevel // Collect all sections at all levels up to parseLevel
@@ -61,11 +67,11 @@ function extractSectionsFromAST(document: Document, parseLevel: number): ASTSect
if (appLevel <= parseLevel) { if (appLevel <= parseLevel) {
allSections.push({ allSections.push({
title: section.getTitle() || '', title: section.getTitle() || "",
content: section.getContent() || '', content: section.getContent() || "",
level: appLevel, level: appLevel,
attributes: section.getAttributes() || {}, attributes: section.getAttributes() || {},
subsections: [] subsections: [],
}); });
} }
@@ -91,11 +97,11 @@ function extractSubsections(section: any, parseLevel: number): ASTSection[] {
return subsections return subsections
.filter((sub: any) => (sub.getLevel() + 1) <= parseLevel) .filter((sub: any) => (sub.getLevel() + 1) <= parseLevel)
.map((sub: any) => ({ .map((sub: any) => ({
title: sub.getTitle() || '', title: sub.getTitle() || "",
content: sub.getContent() || '', content: sub.getContent() || "",
level: sub.getLevel() + 1, // Convert to app level level: sub.getLevel() + 1, // Convert to app level
attributes: sub.getAttributes() || {}, attributes: sub.getAttributes() || {},
subsections: extractSubsections(sub, parseLevel) subsections: extractSubsections(sub, parseLevel),
})); }));
} }
@@ -130,7 +136,10 @@ export async function createPublicationTreeFromAST(
/** /**
* Create a 30040 index event from AST document metadata * Create a 30040 index event from AST document metadata
*/ */
function createIndexEventFromAST(parsed: ASTParsedDocument, ndk: NDK): NDKEvent { function createIndexEventFromAST(
parsed: ASTParsedDocument,
ndk: NDK,
): NDKEvent {
const event = new NDKEvent(ndk); const event = new NDKEvent(ndk);
event.kind = 30040; event.kind = 30040;
event.created_at = Math.floor(Date.now() / 1000); event.created_at = Math.floor(Date.now() / 1000);
@@ -251,17 +260,53 @@ function generateTitleAbbreviation(title: string): string {
/** /**
* Add AsciiDoc attributes as Nostr event tags, filtering out system attributes * Add AsciiDoc attributes as Nostr event tags, filtering out system attributes
*/ */
function addAttributesAsTags(tags: string[][], attributes: Record<string, string>) { function addAttributesAsTags(
tags: string[][],
attributes: Record<string, string>,
) {
const systemAttributes = [ const systemAttributes = [
'attribute-undefined', 'attribute-missing', 'appendix-caption', 'appendix-refsig', "attribute-undefined",
'caution-caption', 'chapter-refsig', 'example-caption', 'figure-caption', "attribute-missing",
'important-caption', 'last-update-label', 'manname-title', 'note-caption', "appendix-caption",
'part-refsig', 'preface-title', 'section-refsig', 'table-caption', "appendix-refsig",
'tip-caption', 'toc-title', 'untitled-label', 'version-label', 'warning-caption', "caution-caption",
'asciidoctor', 'asciidoctor-version', 'safe-mode-name', 'backend', 'doctype', "chapter-refsig",
'basebackend', 'filetype', 'outfilesuffix', 'stylesdir', 'iconsdir', "example-caption",
'localdate', 'localyear', 'localtime', 'localdatetime', 'docdate', "figure-caption",
'docyear', 'doctime', 'docdatetime', 'doctitle', 'embedded', 'notitle' "important-caption",
"last-update-label",
"manname-title",
"note-caption",
"part-refsig",
"preface-title",
"section-refsig",
"table-caption",
"tip-caption",
"toc-title",
"untitled-label",
"version-label",
"warning-caption",
"asciidoctor",
"asciidoctor-version",
"safe-mode-name",
"backend",
"doctype",
"basebackend",
"filetype",
"outfilesuffix",
"stylesdir",
"iconsdir",
"localdate",
"localyear",
"localtime",
"localdatetime",
"docdate",
"docyear",
"doctime",
"docdatetime",
"doctitle",
"embedded",
"notitle",
]; ];
// Add standard metadata tags // Add standard metadata tags
@@ -269,9 +314,7 @@ function addAttributesAsTags(tags: string[][], attributes: Record<string, string
if (attributes.version) tags.push(["version", attributes.version]); if (attributes.version) tags.push(["version", attributes.version]);
if (attributes.description) tags.push(["summary", attributes.description]); if (attributes.description) tags.push(["summary", attributes.description]);
if (attributes.tags) { if (attributes.tags) {
attributes.tags.split(',').forEach(tag => attributes.tags.split(",").forEach((tag) => tags.push(["t", tag.trim()]));
tags.push(["t", tag.trim()])
);
} }
// Add custom attributes (non-system) // Add custom attributes (non-system)
@@ -286,14 +329,21 @@ function addAttributesAsTags(tags: string[][], attributes: Record<string, string
* Tree processor extension for Asciidoctor * Tree processor extension for Asciidoctor
* This can be registered to automatically populate PublicationTree during parsing * This can be registered to automatically populate PublicationTree during parsing
*/ */
export function createPublicationTreeProcessor(ndk: NDK, parseLevel: number = 2) { export function createPublicationTreeProcessor(
ndk: NDK,
parseLevel: number = 2,
) {
return function (extensions: any) { return function (extensions: any) {
extensions.treeProcessor(function (this: any) { extensions.treeProcessor(function (this: any) {
const dsl = this; const dsl = this;
dsl.process(function (this: any, document: Document) { dsl.process(function (this: any, document: Document) {
// Create PublicationTree and store on document for later retrieval // Create PublicationTree and store on document for later retrieval
const publicationTree = createPublicationTreeFromDocument(document, ndk, parseLevel); const publicationTree = createPublicationTreeFromDocument(
document.setAttribute('publicationTree', publicationTree); document,
ndk,
parseLevel,
);
document.setAttribute("publicationTree", publicationTree);
}); });
}); });
}; };
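The tag-splitting behavior of `addAttributesAsTags` can be sketched in isolation: the comma-separated `tags` attribute fans out into one `t` tag per entry, and known Asciidoctor system attributes are dropped. `attributesToTags` is a hypothetical standalone variant, and the system-attribute list is abbreviated here:

```typescript
// Abbreviated stand-in for the full system-attribute list in the real file.
const SYSTEM_ATTRIBUTES = new Set(["doctitle", "backend", "doctype", "embedded"]);

function attributesToTags(attributes: Record<string, string>): string[][] {
  const tags: string[][] = [];
  // The `tags` attribute becomes one ["t", ...] tag per comma-separated entry.
  if (attributes.tags) {
    attributes.tags.split(",").forEach((tag) => tags.push(["t", tag.trim()]));
  }
  // Remaining custom (non-system) attributes pass through as-is.
  for (const [key, value] of Object.entries(attributes)) {
    if (key === "tags" || SYSTEM_ATTRIBUTES.has(key)) continue;
    tags.push([key, value]);
  }
  return tags;
}

console.log(attributesToTags({ tags: "nostr, asciidoc", doctype: "article" }));
// [["t","nostr"],["t","asciidoc"]]
```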

13
src/lib/utils/asciidoc_parser.ts

@@ -9,9 +9,9 @@
import Processor from "asciidoctor"; import Processor from "asciidoctor";
import type { Document } from "asciidoctor"; import type { Document } from "asciidoctor";
import { import {
parseSimpleAttributes,
extractDocumentMetadata, extractDocumentMetadata,
extractSectionMetadata, extractSectionMetadata,
parseSimpleAttributes,
} from "./asciidoc_metadata.ts"; } from "./asciidoc_metadata.ts";
export interface ParsedAsciiDoc { export interface ParsedAsciiDoc {
@@ -418,8 +418,7 @@ export function generateNostrEvents(
const hasChildrenAtTargetLevel = children.some( const hasChildrenAtTargetLevel = children.some(
(child) => child.level === parseLevel, (child) => child.level === parseLevel,
); );
const shouldBeIndex = const shouldBeIndex = level < parseLevel &&
level < parseLevel &&
(hasChildrenAtTargetLevel || (hasChildrenAtTargetLevel ||
children.some((child) => child.level <= parseLevel)); children.some((child) => child.level <= parseLevel));
@@ -461,8 +460,8 @@ export function generateNostrEvents(
const childHasSubChildren = child.children.some( const childHasSubChildren = child.children.some(
(grandchild) => grandchild.level <= parseLevel, (grandchild) => grandchild.level <= parseLevel,
); );
const childShouldBeIndex = const childShouldBeIndex = child.level < parseLevel &&
child.level < parseLevel && childHasSubChildren; childHasSubChildren;
const childKind = childShouldBeIndex ? 30040 : 30041; const childKind = childShouldBeIndex ? 30040 : 30041;
childATags.push([ childATags.push([
"a", "a",
@@ -563,8 +562,8 @@ export function generateNostrEvents(
export function detectContentType( export function detectContentType(
content: string, content: string,
): "article" | "scattered-notes" | "none" { ): "article" | "scattered-notes" | "none" {
const hasDocTitle = const hasDocTitle = content.trim().startsWith("=") &&
content.trim().startsWith("=") && !content.trim().startsWith("=="); !content.trim().startsWith("==");
const hasSections = content.includes("=="); const hasSections = content.includes("==");
if (hasDocTitle) { if (hasDocTitle) {
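The `detectContentType` heuristic touched in this hunk depends only on string checks: a leading `=` (but not `==`) marks a document title, while `==` anywhere marks sections. A self-contained sketch — the hunk cuts off after `if (hasDocTitle) {`, so the `"scattered-notes"` and `"none"` branches here are inferred from the return type, not copied from the source:

```typescript
function detectContentType(
  content: string,
): "article" | "scattered-notes" | "none" {
  const trimmed = content.trim();
  // `= Title` is a document title; `== Title` is only a section heading.
  const hasDocTitle = trimmed.startsWith("=") && !trimmed.startsWith("==");
  const hasSections = content.includes("==");
  if (hasDocTitle) return "article";
  if (hasSections) return "scattered-notes"; // inferred branch
  return "none"; // inferred branch
}

console.log(detectContentType("= My Book\n\n== Chapter 1")); // "article"
console.log(detectContentType("== Note A\n\n== Note B")); // "scattered-notes"
console.log(detectContentType("just prose")); // "none"
```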

48
src/lib/utils/asciidoc_publication_parser.ts

@@ -9,7 +9,10 @@
*/ */
import Asciidoctor from "asciidoctor"; import Asciidoctor from "asciidoctor";
import { registerPublicationTreeProcessor, type ProcessorResult } from "./publication_tree_processor"; import {
type ProcessorResult,
registerPublicationTreeProcessor,
} from "./publication_tree_processor";
import type NDK from "@nostr-dev-kit/ndk"; import type NDK from "@nostr-dev-kit/ndk";
export type PublicationTreeResult = ProcessorResult; export type PublicationTreeResult = ProcessorResult;
@@ -21,7 +24,7 @@ export type PublicationTreeResult = ProcessorResult;
export async function parseAsciiDocWithTree( export async function parseAsciiDocWithTree(
content: string, content: string,
ndk: NDK, ndk: NDK,
parseLevel: number = 2 parseLevel: number = 2,
): Promise<PublicationTreeResult> { ): Promise<PublicationTreeResult> {
console.log(`[Parser] Starting parse at level ${parseLevel}`); console.log(`[Parser] Starting parse at level ${parseLevel}`);
@@ -34,7 +37,7 @@ export async function parseAsciiDocWithTree(
registry, registry,
ndk, ndk,
parseLevel, parseLevel,
content content,
); );
try { try {
@@ -43,8 +46,8 @@ export async function parseAsciiDocWithTree(
extension_registry: registry, extension_registry: registry,
standalone: false, standalone: false,
attributes: { attributes: {
sectids: false sectids: false,
} },
}); });
console.log(`[Parser] Document converted successfully`); console.log(`[Parser] Document converted successfully`);
@@ -62,10 +65,13 @@ export async function parseAsciiDocWithTree(
console.log(`[Parser] Tree relationships built successfully`); console.log(`[Parser] Tree relationships built successfully`);
return result; return result;
} catch (error) { } catch (error) {
console.error('[Parser] Error during parsing:', error); console.error("[Parser] Error during parsing:", error);
throw new Error(`Failed to parse AsciiDoc content: ${error instanceof Error ? error.message : 'Unknown error'}`); throw new Error(
`Failed to parse AsciiDoc content: ${
error instanceof Error ? error.message : "Unknown error"
}`,
);
} }
} }
@@ -96,9 +102,8 @@ async function buildTreeRelationships(result: ProcessorResult): Promise<void> {
} }
console.log(`[Parser] Added ${contentEvents.length} events to tree`); console.log(`[Parser] Added ${contentEvents.length} events to tree`);
} catch (error) { } catch (error) {
console.error('[Parser] Error building tree relationships:', error); console.error("[Parser] Error building tree relationships:", error);
throw error; throw error;
} }
} }
@@ -108,8 +113,10 @@ async function buildTreeRelationships(result: ProcessorResult): Promise<void> {
*/ */
export function exportEventsFromTree(result: PublicationTreeResult) { export function exportEventsFromTree(result: PublicationTreeResult) {
return { return {
indexEvent: result.indexEvent ? eventToPublishableObject(result.indexEvent) : undefined, indexEvent: result.indexEvent
contentEvents: result.contentEvents.map(eventToPublishableObject) ? eventToPublishableObject(result.indexEvent)
: undefined,
contentEvents: result.contentEvents.map(eventToPublishableObject),
// Note: Deliberately omitting 'tree' to ensure the object is serializable for postMessage // Note: Deliberately omitting 'tree' to ensure the object is serializable for postMessage
}; };
} }
@@ -122,14 +129,17 @@ function eventToPublishableObject(event: any) {
// Extract only primitive values to ensure serializability // Extract only primitive values to ensure serializability
return { return {
kind: Number(event.kind), kind: Number(event.kind),
content: String(event.content || ''), content: String(event.content || ""),
tags: Array.isArray(event.tags) ? event.tags.map((tag: any) => tags: Array.isArray(event.tags)
Array.isArray(tag) ? tag.map(t => String(t)) : [] ? event.tags.map((tag: any) =>
) : [], Array.isArray(tag) ? tag.map((t) => String(t)) : []
)
: [],
created_at: Number(event.created_at || Math.floor(Date.now() / 1000)), created_at: Number(event.created_at || Math.floor(Date.now() / 1000)),
pubkey: String(event.pubkey || ''), pubkey: String(event.pubkey || ""),
id: String(event.id || ''), id: String(event.id || ""),
title: event.tags?.find?.((t: string[]) => t[0] === "title")?.[1] || "Untitled" title: event.tags?.find?.((t: string[]) => t[0] === "title")?.[1] ||
"Untitled",
}; };
} }
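The serializability contract noted in this file's comments (the live `tree` is omitted so the result survives `postMessage`) can be exercised directly. A sketch using the primitive-copying shape shown in this hunk:

```typescript
// Copy only primitive values out of an event-like object so the result is
// structured-cloneable (no class instances, no methods, no live tree).
function eventToPublishableObject(event: any) {
  return {
    kind: Number(event.kind),
    content: String(event.content || ""),
    tags: Array.isArray(event.tags)
      ? event.tags.map((tag: any) =>
        Array.isArray(tag) ? tag.map((t: any) => String(t)) : []
      )
      : [],
    created_at: Number(event.created_at || Math.floor(Date.now() / 1000)),
    pubkey: String(event.pubkey || ""),
    id: String(event.id || ""),
    title: event.tags?.find?.((t: string[]) => t[0] === "title")?.[1] ||
      "Untitled",
  };
}

const publishable = eventToPublishableObject({
  kind: 30041,
  content: "Hello",
  tags: [["title", "Intro"], ["d", "intro"]],
});
console.log(publishable.title); // "Intro"
```

Because every field is a primitive or an array of primitives, a JSON round-trip (and hence `postMessage`) is lossless.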

4
src/lib/utils/event_input_utils.ts

@@ -5,9 +5,7 @@ import {
extractDocumentMetadata, extractDocumentMetadata,
metadataToTags, metadataToTags,
} from "./asciidoc_metadata.ts"; } from "./asciidoc_metadata.ts";
import { import { parseAsciiDocWithMetadata } from "./asciidoc_parser.ts";
parseAsciiDocWithMetadata,
} from "./asciidoc_parser.ts";
// ========================= // =========================
// Validation // Validation

2
src/lib/utils/fetch_publication_highlights.ts

@@ -19,7 +19,7 @@ export async function fetchHighlightsForPublication(
*/ */
export async function fetchHighlightsForPublication( export async function fetchHighlightsForPublication(
publicationEvent: NDKEvent, publicationEvent: NDKEvent,
ndk: NDK ndk: NDK,
): Promise<Map<string, NDKEvent[]>> { ): Promise<Map<string, NDKEvent[]>> {
// Extract all "a" tags from the publication event // Extract all "a" tags from the publication event
const aTags = publicationEvent.getMatchingTags("a"); const aTags = publicationEvent.getMatchingTags("a");

61
src/lib/utils/highlightPositioning.ts

@@ -17,7 +17,9 @@ function getTextNodes(element: HTMLElement): Text[] {
       acceptNode: (node) => {
         // Skip text in script/style tags
         const parent = node.parentElement;
-        if (parent && (parent.tagName === 'SCRIPT' || parent.tagName === 'STYLE')) {
+        if (
+          parent && (parent.tagName === "SCRIPT" || parent.tagName === "STYLE")
+        ) {
           return NodeFilter.FILTER_REJECT;
         }
         // Skip empty text nodes
@@ -25,8 +27,8 @@ function getTextNodes(element: HTMLElement): Text[] {
           return NodeFilter.FILTER_REJECT;
         }
         return NodeFilter.FILTER_ACCEPT;
-      }
-    }
+      },
+    },
   );

   let node: Node | null;
@@ -41,7 +43,10 @@ function getTextNodes(element: HTMLElement): Text[] {
  * Calculate the total text length from text nodes
  */
 function getTotalTextLength(textNodes: Text[]): number {
-  return textNodes.reduce((total, node) => total + (node.textContent?.length || 0), 0);
+  return textNodes.reduce(
+    (total, node) => total + (node.textContent?.length || 0),
+    0,
+  );
 }

 /**
@@ -49,7 +54,7 @@ function getTotalTextLength(textNodes: Text[]): number {
  */
 function findNodeAtOffset(
   textNodes: Text[],
-  globalOffset: number
+  globalOffset: number,
 ): { node: Text; localOffset: number } | null {
   let currentOffset = 0;
@@ -59,7 +64,7 @@ function findNodeAtOffset(
     if (globalOffset < currentOffset + nodeLength) {
       return {
         node,
-        localOffset: globalOffset - currentOffset
+        localOffset: globalOffset - currentOffset,
       };
     }
@@ -82,13 +87,17 @@ export function highlightByOffset(
   container: HTMLElement,
   startOffset: number,
   endOffset: number,
-  color: string
+  color: string,
 ): boolean {
-  console.log(`[highlightByOffset] Attempting to highlight chars ${startOffset}-${endOffset}`);
+  console.log(
+    `[highlightByOffset] Attempting to highlight chars ${startOffset}-${endOffset}`,
+  );

   // Validate inputs
   if (startOffset < 0 || endOffset <= startOffset) {
-    console.warn(`[highlightByOffset] Invalid offsets: ${startOffset}-${endOffset}`);
+    console.warn(
+      `[highlightByOffset] Invalid offsets: ${startOffset}-${endOffset}`,
+    );
     return false;
   }
@@ -100,11 +109,15 @@ export function highlightByOffset(
   }

   const totalLength = getTotalTextLength(textNodes);
-  console.log(`[highlightByOffset] Total text length: ${totalLength}, nodes: ${textNodes.length}`);
+  console.log(
+    `[highlightByOffset] Total text length: ${totalLength}, nodes: ${textNodes.length}`,
+  );

   // Validate offsets are within bounds
   if (startOffset >= totalLength) {
-    console.warn(`[highlightByOffset] Start offset ${startOffset} exceeds total length ${totalLength}`);
+    console.warn(
+      `[highlightByOffset] Start offset ${startOffset} exceeds total length ${totalLength}`,
+    );
     return false;
   }
@@ -124,16 +137,16 @@ export function highlightByOffset(
     startNode: startPos.node.textContent?.substring(0, 20),
     startLocal: startPos.localOffset,
     endNode: endPos.node.textContent?.substring(0, 20),
-    endLocal: endPos.localOffset
+    endLocal: endPos.localOffset,
   });

   // Create the highlight mark element
   const createHighlightMark = (text: string): HTMLElement => {
-    const mark = document.createElement('mark');
-    mark.className = 'highlight';
+    const mark = document.createElement("mark");
+    mark.className = "highlight";
     mark.style.backgroundColor = color;
-    mark.style.borderRadius = '2px';
-    mark.style.padding = '2px 0';
+    mark.style.borderRadius = "2px";
+    mark.style.padding = "2px 0";
     mark.textContent = text;
     return mark;
   };
@@ -141,9 +154,12 @@ export function highlightByOffset(
   try {
     // Case 1: Highlight is within a single text node
     if (startPos.node === endPos.node) {
-      const text = startPos.node.textContent || '';
+      const text = startPos.node.textContent || "";
       const before = text.substring(0, startPos.localOffset);
-      const highlighted = text.substring(startPos.localOffset, endPos.localOffset);
+      const highlighted = text.substring(
+        startPos.localOffset,
+        endPos.localOffset,
+      );
       const after = text.substring(endPos.localOffset);

       const parent = startPos.node.parentNode;
@@ -156,7 +172,9 @@ export function highlightByOffset(
       if (after) fragment.appendChild(document.createTextNode(after));

       parent.replaceChild(fragment, startPos.node);
-      console.log(`[highlightByOffset] Applied single-node highlight: "${highlighted}"`);
+      console.log(
+        `[highlightByOffset] Applied single-node highlight: "${highlighted}"`,
+      );
       return true;
     }
@@ -169,7 +187,7 @@ export function highlightByOffset(
       const parent = currentNode.parentNode;
       if (!parent) break;

-      const text = currentNode.textContent || '';
+      const text = currentNode.textContent || "";
       let fragment = document.createDocumentFragment();

       if (isFirstNode) {
@@ -200,7 +218,6 @@ export function highlightByOffset(
     console.log(`[highlightByOffset] Applied multi-node highlight`);
     return true;
   } catch (err) {
-
     console.error(`[highlightByOffset] Error applying highlight:`, err);
     return false;
@@ -213,7 +230,7 @@ export function highlightByOffset(
  */
 export function getPlainText(element: HTMLElement): string {
   const textNodes = getTextNodes(element);
-  return textNodes.map(node => node.textContent).join('');
+  return textNodes.map((node) => node.textContent).join("");
 }

 /**

21
src/lib/utils/highlightUtils.ts

@@ -15,7 +15,9 @@ export interface GroupedHighlight {
  * Groups highlights by author pubkey
  * Returns a Map with pubkey as key and array of highlights as value
  */
-export function groupHighlightsByAuthor(highlights: NDKEvent[]): Map<string, NDKEvent[]> {
+export function groupHighlightsByAuthor(
+  highlights: NDKEvent[],
+): Map<string, NDKEvent[]> {
   const grouped = new Map<string, NDKEvent[]>();

   for (const highlight of highlights) {
@@ -34,7 +36,10 @@ export function groupHighlightsByAuthor(highlights: NDKEvent[]): Map<string, NDK
  * @param maxLength - Maximum length (default: 50)
  * @returns Truncated text with ellipsis if needed
  */
-export function truncateHighlight(text: string, maxLength: number = 50): string {
+export function truncateHighlight(
+  text: string,
+  maxLength: number = 50,
+): string {
   if (!text || text.length <= maxLength) {
     return text;
   }
@@ -57,7 +62,10 @@ export function truncateHighlight(text: string, maxLength: number = 50): string
  * @param relays - Array of relay URLs to include as hints
  * @returns naddr string
  */
-export function encodeHighlightNaddr(event: NDKEvent, relays: string[] = []): string {
+export function encodeHighlightNaddr(
+  event: NDKEvent,
+  relays: string[] = [],
+): string {
   try {
     // For kind 9802 highlights, we need the event's unique identifier
     // Since highlights don't have a d-tag, we'll use the event id as nevent instead
@@ -146,11 +154,14 @@ export function sortHighlightsByTime(highlights: NDKEvent[]): NDKEvent[] {
  * Priority: displayName > name > shortened npub
  */
 export function getAuthorDisplayName(
-  profile: { name?: string; displayName?: string; display_name?: string } | null,
+  profile:
+    | { name?: string; displayName?: string; display_name?: string }
+    | null,
   pubkey: string,
 ): string {
   if (profile) {
-    return profile.displayName || profile.display_name || profile.name || shortenNpub(pubkey);
+    return profile.displayName || profile.display_name || profile.name ||
+      shortenNpub(pubkey);
   }
   return shortenNpub(pubkey);
 }

18
src/lib/utils/mockCommentData.ts

@@ -47,7 +47,7 @@ function createMockComment(
   targetAddress: string,
   createdAt: number,
   replyToId?: string,
-  replyToAuthor?: string
+  replyToAuthor?: string,
 ): any {
   const tags: string[][] = [
     ["A", targetAddress, "wss://relay.damus.io", pubkey],
@@ -85,7 +85,7 @@ function createMockComment(
 export function generateMockComments(
   sectionAddress: string,
   numRootComments: number = 3,
-  numRepliesPerThread: number = 2
+  numRepliesPerThread: number = 2,
 ): any[] {
   const comments: any[] = [];
   const now = Math.floor(Date.now() / 1000);
@@ -103,7 +103,7 @@ export function generateMockComments(
       rootContent,
       rootPubkey,
       sectionAddress,
-      rootCreatedAt
+      rootCreatedAt,
     );
     comments.push(rootComment);
@@ -112,7 +112,8 @@ export function generateMockComments(
     for (let j = 0; j < numRepliesPerThread; j++) {
       const replyId = `mock-reply-${i}-${j}-${Date.now()}`;
       const replyPubkey = mockPubkeys[(i + j + 1) % mockPubkeys.length];
-      const replyContent = loremIpsumReplies[commentIndex % loremIpsumReplies.length];
+      const replyContent =
+        loremIpsumReplies[commentIndex % loremIpsumReplies.length];
       const replyCreatedAt = rootCreatedAt + (j + 1) * 1800; // 30 min after each

       const reply = createMockComment(
@@ -122,7 +123,7 @@ export function generateMockComments(
         sectionAddress,
         replyCreatedAt,
         rootId,
-        rootPubkey
+        rootPubkey,
       );
       comments.push(reply);
@@ -131,7 +132,8 @@ export function generateMockComments(
       if (j === 0 && i < 2) {
         const nestedId = `mock-nested-${i}-${j}-${Date.now()}`;
         const nestedPubkey = mockPubkeys[(i + j + 2) % mockPubkeys.length];
-        const nestedContent = loremIpsumReplies[(commentIndex + 1) % loremIpsumReplies.length];
+        const nestedContent =
+          loremIpsumReplies[(commentIndex + 1) % loremIpsumReplies.length];
         const nestedCreatedAt = replyCreatedAt + 900; // 15 min after reply

         const nested = createMockComment(
@@ -141,7 +143,7 @@ export function generateMockComments(
           sectionAddress,
           nestedCreatedAt,
           replyId,
-          replyPubkey
+          replyPubkey,
         );
         comments.push(nested);
@@ -160,7 +162,7 @@ export function generateMockComments(
  * @returns Array of all mock comments across all sections
  */
 export function generateMockCommentsForSections(
-  sectionAddresses: string[]
+  sectionAddresses: string[],
 ): any[] {
   const allComments: any[] = [];
113
src/lib/utils/mockHighlightData.ts

@@ -5,53 +5,53 @@
 // Sample highlighted text snippets (things users might actually highlight)
 const highlightedTexts = [
-  'Knowledge that tries to stay put inevitably becomes ossified',
-  'The attempt to hold knowledge still is like trying to photograph a river',
-  'Understanding emerges not from rigid frameworks but from fluid engagement',
-  'Traditional institutions struggle with the natural promiscuity of ideas',
-  'Thinking without permission means refusing predetermined categories',
-  'The most valuable insights often come from unexpected juxtapositions',
-  'Anarchistic knowledge rejects the notion of authorized interpreters',
-  'Every act of reading is an act of creative interpretation',
-  'Hierarchy in knowledge systems serves power, not understanding',
-  'The boundary between creator and consumer is an artificial construction',
+  "Knowledge that tries to stay put inevitably becomes ossified",
+  "The attempt to hold knowledge still is like trying to photograph a river",
+  "Understanding emerges not from rigid frameworks but from fluid engagement",
+  "Traditional institutions struggle with the natural promiscuity of ideas",
+  "Thinking without permission means refusing predetermined categories",
+  "The most valuable insights often come from unexpected juxtapositions",
+  "Anarchistic knowledge rejects the notion of authorized interpreters",
+  "Every act of reading is an act of creative interpretation",
+  "Hierarchy in knowledge systems serves power, not understanding",
+  "The boundary between creator and consumer is an artificial construction",
 ];

 // Context strings (surrounding text to help locate the highlight)
 const contexts = [
-  'This is the fundamental paradox of institutionalized knowledge. Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice.',
-  'The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow. What remains is a static representation, not the dynamic reality.',
-  'Understanding emerges not from rigid frameworks but from fluid engagement with ideas, people, and contexts. This fluidity is precisely what traditional systems attempt to eliminate.',
-  'Traditional institutions struggle with the natural promiscuity of ideas—the way concepts naturally migrate, mutate, and merge across boundaries that were meant to contain them.',
-  'Thinking without permission means refusing predetermined categories and challenging the gatekeepers who claim authority over legitimate thought.',
-  'The most valuable insights often come from unexpected juxtapositions, from bringing together ideas that were never meant to meet.',
-  'Anarchistic knowledge rejects the notion of authorized interpreters, asserting instead that meaning-making is a fundamentally distributed and democratic process.',
-  'Every act of reading is an act of creative interpretation, a collaboration between text and reader that produces something new each time.',
-  'Hierarchy in knowledge systems serves power, not understanding. It determines who gets to speak, who must listen, and what counts as legitimate knowledge.',
-  'The boundary between creator and consumer is an artificial construction, one that digital networks make increasingly untenable and obsolete.',
+  "This is the fundamental paradox of institutionalized knowledge. Knowledge that tries to stay put inevitably becomes ossified, a monument to itself rather than a living practice.",
+  "The attempt to hold knowledge still is like trying to photograph a river—you capture an image, but you lose the flow. What remains is a static representation, not the dynamic reality.",
+  "Understanding emerges not from rigid frameworks but from fluid engagement with ideas, people, and contexts. This fluidity is precisely what traditional systems attempt to eliminate.",
+  "Traditional institutions struggle with the natural promiscuity of ideas—the way concepts naturally migrate, mutate, and merge across boundaries that were meant to contain them.",
+  "Thinking without permission means refusing predetermined categories and challenging the gatekeepers who claim authority over legitimate thought.",
+  "The most valuable insights often come from unexpected juxtapositions, from bringing together ideas that were never meant to meet.",
+  "Anarchistic knowledge rejects the notion of authorized interpreters, asserting instead that meaning-making is a fundamentally distributed and democratic process.",
+  "Every act of reading is an act of creative interpretation, a collaboration between text and reader that produces something new each time.",
+  "Hierarchy in knowledge systems serves power, not understanding. It determines who gets to speak, who must listen, and what counts as legitimate knowledge.",
+  "The boundary between creator and consumer is an artificial construction, one that digital networks make increasingly untenable and obsolete.",
 ];

 // Optional annotations (user comments on their highlights)
 const annotations = [
-  'This perfectly captures the institutional problem',
-  'Key insight - worth revisiting',
-  'Reminds me of Deleuze on rhizomatic structures',
-  'Fundamental critique of academic gatekeeping',
-  'The core argument in one sentence',
+  "This perfectly captures the institutional problem",
+  "Key insight - worth revisiting",
+  "Reminds me of Deleuze on rhizomatic structures",
+  "Fundamental critique of academic gatekeeping",
+  "The core argument in one sentence",
   null, // Some highlights have no annotation
-  'Important for understanding the broader thesis',
+  "Important for understanding the broader thesis",
   null,
-  'Connects to earlier discussion on page 12',
+  "Connects to earlier discussion on page 12",
   null,
 ];

 // Mock pubkeys - MUST be exactly 64 hex characters
 const mockPubkeys = [
-  'a1b2c3d4e5f67890123456789012345678901234567890123456789012345678',
-  'b2c3d4e5f67890123456789012345678901234567890123456789012345678ab',
-  'c3d4e5f67890123456789012345678901234567890123456789012345678abcd',
-  'd4e5f67890123456789012345678901234567890123456789012345678abcdef',
-  'e5f6789012345678901234567890123456789012345678901234567890abcdef',
+  "a1b2c3d4e5f67890123456789012345678901234567890123456789012345678",
+  "b2c3d4e5f67890123456789012345678901234567890123456789012345678ab",
+  "c3d4e5f67890123456789012345678901234567890123456789012345678abcd",
+  "d4e5f67890123456789012345678901234567890123456789012345678abcdef",
+  "e5f6789012345678901234567890123456789012345678901234567890abcdef",
 ];

 /**
@@ -74,22 +74,22 @@ function createMockHighlight(
   authorPubkey: string,
   annotation?: string | null,
   offsetStart?: number,
-  offsetEnd?: number
+  offsetEnd?: number,
 ): any {
   const tags: string[][] = [
-    ['a', targetAddress, 'wss://relay.damus.io'],
-    ['context', context],
-    ['p', authorPubkey, 'wss://relay.damus.io', 'author'],
+    ["a", targetAddress, "wss://relay.damus.io"],
+    ["context", context],
+    ["p", authorPubkey, "wss://relay.damus.io", "author"],
   ];

   // Add optional annotation
   if (annotation) {
-    tags.push(['comment', annotation]);
+    tags.push(["comment", annotation]);
   }

   // Add optional offset for position-based highlighting
   if (offsetStart !== undefined && offsetEnd !== undefined) {
-    tags.push(['offset', offsetStart.toString(), offsetEnd.toString()]);
+    tags.push(["offset", offsetStart.toString(), offsetEnd.toString()]);
   }

   return {
@@ -99,7 +99,7 @@ function createMockHighlight(
     created_at: createdAt,
     content: highlightedText, // The highlighted text itself
     tags,
-    sig: 'mock-signature-' + id,
+    sig: "mock-signature-" + id,
   };
 }
@@ -113,7 +113,7 @@ function createMockHighlight(
 export function generateMockHighlights(
   sectionAddress: string,
   authorPubkey: string,
-  numHighlights: number = Math.floor(Math.random() * 2) + 2 // 2-3 highlights
+  numHighlights: number = Math.floor(Math.random() * 2) + 2, // 2-3 highlights
 ): any[] {
   const highlights: any[] = [];
   const now = Math.floor(Date.now() / 1000);
@@ -123,7 +123,9 @@ export function generateMockHighlights(
   // The offset tags will highlight the ACTUAL text at those positions in the section
   for (let i = 0; i < numHighlights; i++) {
-    const id = `mock-highlight-${i}-${Date.now()}-${Math.random().toString(36).substring(7)}`;
+    const id = `mock-highlight-${i}-${Date.now()}-${
+      Math.random().toString(36).substring(7)
+    }`;
     const highlighterPubkey = mockPubkeys[i % mockPubkeys.length];
     const annotation = annotations[i % annotations.length];
     const createdAt = now - (numHighlights - i) * 7200; // Stagger by 2 hours
@@ -136,7 +138,9 @@ export function generateMockHighlights(
     // Use placeholder text - the actual highlighted text will be determined by the offsets
     const placeholderText = `Test highlight ${i + 1}`;
-    const placeholderContext = `This is test highlight ${i + 1} at position ${offsetStart}-${offsetEnd}`;
+    const placeholderContext = `This is test highlight ${
+      i + 1
+    } at position ${offsetStart}-${offsetEnd}`;

     const highlight = createMockHighlight(
       id,
@@ -148,7 +152,7 @@ export function generateMockHighlights(
       authorPubkey,
       annotation,
       offsetStart,
-      offsetEnd
+      offsetEnd,
     );

     highlights.push(highlight);
@@ -165,19 +169,32 @@ export function generateMockHighlights(
  */
 export function generateMockHighlightsForSections(
   sectionAddresses: string[],
-  authorPubkey: string = 'dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06'
+  authorPubkey: string =
+    "dc4cd086cd7ce5b1832adf4fdd1211289880d2c7e295bcb0e684c01acee77c06",
 ): any[] {
   const allHighlights: any[] = [];

   sectionAddresses.forEach((address, index) => {
     // Each section gets 2 highlights at the very beginning (positions 0-100 and 120-220)
     const numHighlights = 2;
-    const sectionHighlights = generateMockHighlights(address, authorPubkey, numHighlights);
-    console.log(`[MockHighlightData] Generated ${numHighlights} highlights for section ${address.split(':')[2]?.substring(0, 20)}... at positions 0-100, 120-220`);
+    const sectionHighlights = generateMockHighlights(
+      address,
+      authorPubkey,
+      numHighlights,
+    );
+    console.log(
+      `[MockHighlightData] Generated ${numHighlights} highlights for section ${
+        address.split(":")[2]?.substring(0, 20)
+      }... at positions 0-100, 120-220`,
+    );
     allHighlights.push(...sectionHighlights);
   });

-  console.log(`[MockHighlightData] Total: ${allHighlights.length} highlights across ${sectionAddresses.length} sections`);
-  console.log(`[MockHighlightData] Each highlight is anchored to its section via "a" tag and uses offset tags for position`);
+  console.log(
+    `[MockHighlightData] Total: ${allHighlights.length} highlights across ${sectionAddresses.length} sections`,
+  );
+  console.log(
+    `[MockHighlightData] Each highlight is anchored to its section via "a" tag and uses offset tags for position`,
+  );

   return allHighlights;
 }

13
src/lib/utils/publication_tree_factory.ts

@@ -189,8 +189,8 @@ function detectContentType(
   // Check if the "title" is actually just the first section title
   // This happens when AsciiDoc starts with == instead of =
-  const titleMatchesFirstSection =
-    parsed.sections.length > 0 && parsed.title === parsed.sections[0].title;
+  const titleMatchesFirstSection = parsed.sections.length > 0 &&
+    parsed.title === parsed.sections[0].title;

   if (hasDocTitle && hasSections && !titleMatchesFirstSection) {
     return "article";
@@ -286,8 +286,9 @@ function inheritDocumentAttributes(
   documentAttributes: Record<string, string>,
 ) {
   // Inherit selected document attributes
-  if (documentAttributes.language)
+  if (documentAttributes.language) {
     tags.push(["language", documentAttributes.language]);
+  }
   if (documentAttributes.type) tags.push(["type", documentAttributes.type]);
 }
@@ -368,9 +369,11 @@ function generateIndexContent(parsed: any): string {
 ${parsed.sections.length} sections available:

-${parsed.sections
+${
+  parsed.sections
   .map((section: any, i: number) => `${i + 1}. ${section.title}`)
-  .join("\n")}`;
+  .join("\n")
+}`;
 }

 /**

46
src/lib/utils/publication_tree_processor.ts

@ -127,7 +127,9 @@ export function registerPublicationTreeProcessor(
}; };
console.log( console.log(
`[TreeProcessor] Built tree with ${contentEvents.length} content events and ${indexEvent ? "1" : "0"} index events`, `[TreeProcessor] Built tree with ${contentEvents.length} content events and ${
indexEvent ? "1" : "0"
} index events`,
); );
} catch (error) { } catch (error) {
console.error("[TreeProcessor] Error processing document:", error); console.error("[TreeProcessor] Error processing document:", error);
@ -363,7 +365,6 @@ function parseSegmentContent(
console.log(` extracted content:`, JSON.stringify(content)); console.log(` extracted content:`, JSON.stringify(content));
} }
return { attributes, content }; return { attributes, content };
} }
@ -378,8 +379,8 @@ function detectContentType(
const hasSections = segments.length > 0; const hasSections = segments.length > 0;
// Check if the title matches the first section title // Check if the title matches the first section title
const titleMatchesFirstSection = const titleMatchesFirstSection = segments.length > 0 &&
segments.length > 0 && title === segments[0].title; title === segments[0].title;
if (hasDocTitle && hasSections && !titleMatchesFirstSection) { if (hasDocTitle && hasSections && !titleMatchesFirstSection) {
return "article"; return "article";
@ -530,7 +531,11 @@ function buildLevel2Structure(
// Group segments by level 2 sections // Group segments by level 2 sections
const level2Groups = groupSegmentsByLevel2(segments); const level2Groups = groupSegmentsByLevel2(segments);
console.log(`[TreeProcessor] Level 2 groups:`, level2Groups.length, level2Groups.map(g => g.title)); console.log(
`[TreeProcessor] Level 2 groups:`,
level2Groups.length,
level2Groups.map((g) => g.title),
);
// Generate publication abbreviation for namespacing // Generate publication abbreviation for namespacing
const pubAbbrev = generateTitleAbbreviation(title); const pubAbbrev = generateTitleAbbreviation(title);
@ -599,7 +604,7 @@ function buildHierarchicalStructure(
contentEvents, contentEvents,
ndk, ndk,
parseLevel, parseLevel,
title title,
); );
return { tree, indexEvent, contentEvents, eventStructure }; return { tree, indexEvent, contentEvents, eventStructure };
@ -680,7 +685,10 @@ function createContentEvent(
if (wikiLinks.length > 0) { if (wikiLinks.length > 0) {
const wikiTags = wikiLinksToTags(wikiLinks); const wikiTags = wikiLinksToTags(wikiLinks);
tags.push(...wikiTags); tags.push(...wikiTags);
-    console.log(`[TreeProcessor] Added ${wikiTags.length} wiki link tags:`, wikiTags);
+    console.log(
+      `[TreeProcessor] Added ${wikiTags.length} wiki link tags:`,
+      wikiTags,
+    );
   }

   event.tags = tags;

@@ -886,10 +894,11 @@ function groupSegmentsByLevel2(segments: ContentSegment[]): ContentSegment[] {
     // Combine the level 2 content with all nested content
     let combinedContent = segment.content;
     for (const nested of nestedSegments) {
-      combinedContent += `\n\n${"=".repeat(nested.level)} ${nested.title}\n${nested.content}`;
+      combinedContent += `\n\n${
+        "=".repeat(nested.level)
+      } ${nested.title}\n${nested.content}`;
     }
     level2Groups.push({
       ...segment,
       content: combinedContent,

@@ -906,14 +915,14 @@ function groupSegmentsByLevel2(segments: ContentSegment[]): ContentSegment[] {
  */
 function buildHierarchicalGroups(
   segments: ContentSegment[],
-  parseLevel: number
+  parseLevel: number,
 ): HierarchicalNode[] {
   const groups: HierarchicalNode[] = [];

   // Group segments by their parent-child relationships
   const segmentsByLevel: Map<number, ContentSegment[]> = new Map();
   for (let level = 2; level <= parseLevel; level++) {
-    segmentsByLevel.set(level, segments.filter(s => s.level === level));
+    segmentsByLevel.set(level, segments.filter((s) => s.level === level));
   }

   // Build the hierarchy from level 2 down to parseLevel

@@ -931,16 +940,17 @@
 function buildNodeHierarchy(
   segment: ContentSegment,
   allSegments: ContentSegment[],
-  parseLevel: number
+  parseLevel: number,
 ): HierarchicalNode {
   // Find direct children (one level deeper)
-  const directChildren = allSegments.filter(s => {
+  const directChildren = allSegments.filter((s) => {
     if (s.level !== segment.level + 1) return false;
     if (s.startLine <= segment.startLine) return false;

     // Check if this segment is within our section's bounds
     const nextSibling = allSegments.find(
-      next => next.level <= segment.level && next.startLine > segment.startLine
+      (next) =>
+        next.level <= segment.level && next.startLine > segment.startLine,
     );
     const endLine = nextSibling?.startLine || Infinity;

@@ -958,7 +968,7 @@ function buildNodeHierarchy(
       childNodes.push({
         segment: child,
         children: [],
-        hasChildren: false
+        hasChildren: false,
       });
     }
   }

@@ -966,7 +976,7 @@ function buildNodeHierarchy(
   return {
     segment,
     children: childNodes,
-    hasChildren: childNodes.length > 0
+    hasChildren: childNodes.length > 0,
   };
 }

@@ -1119,7 +1129,6 @@ function createIndexEventForHierarchicalNode(
   return event;
 }
-
 /**
  * Build hierarchical segment structure for Level 3+ parsing
  */

@@ -1135,7 +1144,8 @@ function buildSegmentHierarchy(
       s.level > 2 &&
       s.startLine > level2Segment.startLine &&
       (segments.find(
-        (next) => next.level <= 2 && next.startLine > level2Segment.startLine,
+        (next) =>
+          next.level <= 2 && next.startLine > level2Segment.startLine,
       )?.startLine || Infinity) > s.startLine,
   );
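The `groupSegmentsByLevel2` hunk above rebuilds nested AsciiDoc sections by re-emitting each child heading as `"=".repeat(level)` followed by the title. A minimal standalone sketch of that recombination; the `Segment` shape here is an assumption (the real `ContentSegment` interface lives elsewhere in the repo and carries more fields):

```typescript
// Assumed minimal stand-in for the ContentSegment fields used above.
interface Segment {
  level: number;
  title: string;
  content: string;
}

// Mirrors the template literal in groupSegmentsByLevel2: heading depth in
// AsciiDoc is encoded by the number of "=" characters, so a level-3 child
// is re-emitted as "=== Title" under its level-2 parent.
function combineSegments(parent: Segment, nested: Segment[]): string {
  let combined = parent.content;
  for (const n of nested) {
    combined += `\n\n${"=".repeat(n.level)} ${n.title}\n${n.content}`;
  }
  return combined;
}

const parent: Segment = { level: 2, title: "Chapter", content: "Intro text." };
const nested: Segment[] = [{ level: 3, title: "Section", content: "Body." }];
console.log(combineSegments(parent, nested));
// Prints "Intro text.", a blank line, then "=== Section" and "Body."
```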

41 src/lib/utils/wiki_links.ts

@@ -5,7 +5,7 @@
 export interface WikiLink {
   fullMatch: string;
-  type: 'w' | 'd' | 'auto'; // auto means [[term]] without explicit prefix
+  type: "w" | "d" | "auto"; // auto means [[term]] without explicit prefix
   term: string;
   displayText: string;
   startIndex: number;

@@ -34,7 +34,7 @@ export function extractWikiLinks(content: string): WikiLink[] {
     wikiLinks.push({
       fullMatch: match[0],
-      type: prefix ? (prefix as 'w' | 'd') : 'auto',
+      type: prefix ? (prefix as "w" | "d") : "auto",
       term,
       displayText: customDisplay || term,
       startIndex: match.index,

@@ -53,8 +53,8 @@ export function termToTag(term: string): string {
   return term
     .toLowerCase()
     .trim()
-    .replace(/\s+/g, '-')
-    .replace(/[^a-z0-9-]/g, '');
+    .replace(/\s+/g, "-")
+    .replace(/[^a-z0-9-]/g, "");
 }

 /**

@@ -67,14 +67,14 @@ export function wikiLinksToTags(wikiLinks: WikiLink[]): string[][] {
   for (const link of wikiLinks) {
     const tagSlug = termToTag(link.term);

-    if (link.type === 'w' || link.type === 'auto') {
+    if (link.type === "w" || link.type === "auto") {
       // Reference tag includes display text
-      tags.push(['w', tagSlug, link.displayText]);
+      tags.push(["w", tagSlug, link.displayText]);
     }

-    if (link.type === 'd') {
+    if (link.type === "d") {
       // Definition tag (no display text, it IS the thing)
-      tags.push(['d', tagSlug]);
+      tags.push(["d", tagSlug]);
     }
   }

@@ -91,13 +91,13 @@ export function renderWikiLinksToHtml(
     linkClass?: string;
     wLinkClass?: string;
     dLinkClass?: string;
-    onClickHandler?: (type: 'w' | 'd' | 'auto', term: string) => string;
+    onClickHandler?: (type: "w" | "d" | "auto", term: string) => string;
   } = {},
 ): string {
   const {
-    linkClass = 'wiki-link',
-    wLinkClass = 'wiki-link-reference',
-    dLinkClass = 'wiki-link-definition',
+    linkClass = "wiki-link",
+    wLinkClass = "wiki-link-reference",
+    dLinkClass = "wiki-link-definition",
     onClickHandler,
   } = options;

@@ -105,13 +105,13 @@
     /\[\[(?:(w|d):)?([^\]|]+)(?:\|([^\]]+))?\]\]/g,
     (match, prefix, term, customDisplay) => {
       const displayText = customDisplay?.trim() || term.trim();
-      const type = prefix ? prefix : 'auto';
+      const type = prefix ? prefix : "auto";
       const tagSlug = termToTag(term);

       // Determine CSS classes
       let classes = linkClass;
-      if (type === 'w') classes += ` ${wLinkClass}`;
-      else if (type === 'd') classes += ` ${dLinkClass}`;
+      if (type === "w") classes += ` ${wLinkClass}`;
+      else if (type === "d") classes += ` ${dLinkClass}`;

       // Generate href or onclick
       const action = onClickHandler

@@ -119,12 +119,11 @@
         : `href="#wiki/${type}/${encodeURIComponent(tagSlug)}"`;

       // Add title attribute showing the type
-      const title =
-        type === 'w'
-          ? 'Wiki reference (mentions this concept)'
-          : type === 'd'
-          ? 'Wiki definition (defines this concept)'
-          : 'Wiki link (searches both references and definitions)';
+      const title = type === "w"
+        ? "Wiki reference (mentions this concept)"
+        : type === "d"
+        ? "Wiki definition (defines this concept)"
+        : "Wiki link (searches both references and definitions)";

       return `<a class="${classes}" ${action} title="${title}" data-wiki-type="${type}" data-wiki-term="${tagSlug}">${displayText}</a>`;
     },
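For orientation, the wiki-link pieces touched above can be exercised end to end. In this sketch the regex and the `termToTag` body are copied verbatim from the `wiki_links.ts` hunks; the `toTags` driver and its sample input are illustrative only, not part of the module:

```typescript
// Regex copied from wiki_links.ts: optional "w:"/"d:" prefix, the term,
// and an optional "|display text" segment inside [[...]].
const WIKI_LINK_RE = /\[\[(?:(w|d):)?([^\]|]+)(?:\|([^\]]+))?\]\]/g;

// Body copied from termToTag in the diff above.
function termToTag(term: string): string {
  return term
    .toLowerCase()
    .trim()
    .replace(/\s+/g, "-")
    .replace(/[^a-z0-9-]/g, "");
}

// Illustrative driver combining extraction and tag generation:
// "w" and bare [[term]] links become reference tags with display text,
// "d" links become definition tags with the slug only.
function toTags(content: string): string[][] {
  const tags: string[][] = [];
  for (const match of content.matchAll(WIKI_LINK_RE)) {
    const [, prefix, term, customDisplay] = match;
    const type = prefix ?? "auto";
    const slug = termToTag(term);
    if (type === "w" || type === "auto") {
      tags.push(["w", slug, customDisplay ?? term]);
    }
    if (type === "d") tags.push(["d", slug]);
  }
  return tags;
}

console.log(
  JSON.stringify(
    toTags("See [[w:Chunked Publications|chunking]] and [[d:Zettel]]."),
  ),
);
// → [["w","chunked-publications","chunking"],["d","zettel"]]
```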

445 tests/unit/highlightLayer.test.ts
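The tests in this file pin down the contract of `pubkeyToHue` (deterministic, integer result in [0, 360), tolerant of empty, npub, and malformed input) without showing its implementation. The following is a hypothetical stand-in that satisfies the core of that contract; the real function in `src/lib/utils/nostrUtils` is not part of this diff:

```typescript
// Hypothetical sketch only; NOT the actual implementation from
// src/lib/utils/nostrUtils. It returns a stable integer hue in
// [0, 360) for any string, including empty or malformed input.
function pubkeyToHueSketch(pubkey: string): number {
  let hash = 0;
  for (let i = 0; i < pubkey.length; i++) {
    // 31-based rolling hash, truncated to an unsigned 32-bit value so
    // repeated calls on the same input always agree.
    hash = (hash * 31 + pubkey.charCodeAt(i)) >>> 0;
  }
  return hash % 360;
}

console.log(Number.isInteger(pubkeyToHueSketch("a".repeat(64)))); // true
```

To satisfy the hex-vs-npub equivalence tests below, a real implementation would additionally decode `npub1…` input to hex first (for example via `nip19.decode` from nostr-tools) and fall back to hashing the raw string when decoding fails.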

@@ -1,62 +1,62 @@
-import { describe, it, expect, vi, beforeEach, afterEach } from 'vitest';
-import { pubkeyToHue } from '../../src/lib/utils/nostrUtils';
-import { nip19 } from 'nostr-tools';
+import { afterEach, beforeEach, describe, expect, it, vi } from "vitest";
+import { pubkeyToHue } from "../../src/lib/utils/nostrUtils";
+import { nip19 } from "nostr-tools";

-describe('pubkeyToHue', () => {
-  describe('Consistency', () => {
-    it('returns consistent hue for same pubkey', () => {
-      const pubkey = 'a'.repeat(64);
+describe("pubkeyToHue", () => {
+  describe("Consistency", () => {
+    it("returns consistent hue for same pubkey", () => {
+      const pubkey = "a".repeat(64);
       const hue1 = pubkeyToHue(pubkey);
       const hue2 = pubkeyToHue(pubkey);

       expect(hue1).toBe(hue2);
     });

-    it('returns same hue for same pubkey called multiple times', () => {
-      const pubkey = 'abc123def456'.repeat(5) + 'abcd';
+    it("returns same hue for same pubkey called multiple times", () => {
+      const pubkey = "abc123def456".repeat(5) + "abcd";
       const hues = Array.from({ length: 10 }, () => pubkeyToHue(pubkey));

       expect(new Set(hues).size).toBe(1); // All hues should be the same
     });
   });

-  describe('Range Validation', () => {
-    it('returns hue in valid range (0-360)', () => {
+  describe("Range Validation", () => {
+    it("returns hue in valid range (0-360)", () => {
       const pubkeys = [
-        'a'.repeat(64),
-        'f'.repeat(64),
-        '0'.repeat(64),
-        '9'.repeat(64),
-        'abc123def456'.repeat(5) + 'abcd',
-        '123456789abc'.repeat(5) + 'def0',
+        "a".repeat(64),
+        "f".repeat(64),
+        "0".repeat(64),
+        "9".repeat(64),
+        "abc123def456".repeat(5) + "abcd",
+        "123456789abc".repeat(5) + "def0",
       ];

-      pubkeys.forEach(pubkey => {
+      pubkeys.forEach((pubkey) => {
         const hue = pubkeyToHue(pubkey);
         expect(hue).toBeGreaterThanOrEqual(0);
         expect(hue).toBeLessThan(360);
       });
     });

-    it('returns integer hue value', () => {
-      const pubkey = 'a'.repeat(64);
+    it("returns integer hue value", () => {
+      const pubkey = "a".repeat(64);
       const hue = pubkeyToHue(pubkey);

       expect(Number.isInteger(hue)).toBe(true);
     });
   });

-  describe('Format Handling', () => {
-    it('handles hex format pubkeys', () => {
-      const hexPubkey = 'abcdef123456789'.repeat(4) + '0123';
+  describe("Format Handling", () => {
+    it("handles hex format pubkeys", () => {
+      const hexPubkey = "abcdef123456789".repeat(4) + "0123";
       const hue = pubkeyToHue(hexPubkey);

       expect(hue).toBeGreaterThanOrEqual(0);
       expect(hue).toBeLessThan(360);
     });

-    it('handles npub format pubkeys', () => {
-      const hexPubkey = 'a'.repeat(64);
+    it("handles npub format pubkeys", () => {
+      const hexPubkey = "a".repeat(64);
       const npub = nip19.npubEncode(hexPubkey);
       const hue = pubkeyToHue(npub);

@@ -64,8 +64,8 @@ describe('pubkeyToHue', () => {
       expect(hue).toBeLessThan(360);
     });

-    it('returns same hue for hex and npub format of same pubkey', () => {
-      const hexPubkey = 'abc123def456'.repeat(5) + 'abcd';
+    it("returns same hue for hex and npub format of same pubkey", () => {
+      const hexPubkey = "abc123def456".repeat(5) + "abcd";
       const npub = nip19.npubEncode(hexPubkey);

       const hueFromHex = pubkeyToHue(hexPubkey);

@@ -75,11 +75,11 @@ describe('pubkeyToHue', () => {
     });
   });

-  describe('Uniqueness', () => {
-    it('different pubkeys generate different hues', () => {
-      const pubkey1 = 'a'.repeat(64);
-      const pubkey2 = 'b'.repeat(64);
-      const pubkey3 = 'c'.repeat(64);
+  describe("Uniqueness", () => {
+    it("different pubkeys generate different hues", () => {
+      const pubkey1 = "a".repeat(64);
+      const pubkey2 = "b".repeat(64);
+      const pubkey3 = "c".repeat(64);

       const hue1 = pubkeyToHue(pubkey1);
       const hue2 = pubkeyToHue(pubkey2);

@@ -90,12 +90,13 @@ describe('pubkeyToHue', () => {
       expect(hue1).not.toBe(hue3);
     });

-    it('generates diverse hues for multiple pubkeys', () => {
-      const pubkeys = Array.from({ length: 10 }, (_, i) =>
-        String.fromCharCode(97 + i).repeat(64)
+    it("generates diverse hues for multiple pubkeys", () => {
+      const pubkeys = Array.from(
+        { length: 10 },
+        (_, i) => String.fromCharCode(97 + i).repeat(64),
       );
-      const hues = pubkeys.map(pk => pubkeyToHue(pk));
+      const hues = pubkeys.map((pk) => pubkeyToHue(pk));

       const uniqueHues = new Set(hues);
       // Most pubkeys should generate unique hues (allowing for some collisions)

@@ -103,16 +104,16 @@ describe('pubkeyToHue', () => {
     });
   });

-  describe('Edge Cases', () => {
-    it('handles empty string input', () => {
-      const hue = pubkeyToHue('');
+  describe("Edge Cases", () => {
+    it("handles empty string input", () => {
+      const hue = pubkeyToHue("");

       expect(hue).toBeGreaterThanOrEqual(0);
       expect(hue).toBeLessThan(360);
     });

-    it('handles invalid npub format gracefully', () => {
-      const invalidNpub = 'npub1invalid';
+    it("handles invalid npub format gracefully", () => {
+      const invalidNpub = "npub1invalid";
       const hue = pubkeyToHue(invalidNpub);

       // Should still return a valid hue even if decode fails

@@ -120,16 +121,16 @@ describe('pubkeyToHue', () => {
       expect(hue).toBeLessThan(360);
     });

-    it('handles short input strings', () => {
-      const shortInput = 'abc';
+    it("handles short input strings", () => {
+      const shortInput = "abc";
       const hue = pubkeyToHue(shortInput);

       expect(hue).toBeGreaterThanOrEqual(0);
       expect(hue).toBeLessThan(360);
     });

-    it('handles special characters', () => {
-      const specialInput = '!@#$%^&*()';
+    it("handles special characters", () => {
+      const specialInput = "!@#$%^&*()";
       const hue = pubkeyToHue(specialInput);

       expect(hue).toBeGreaterThanOrEqual(0);

@@ -137,19 +138,20 @@ describe('pubkeyToHue', () => {
     });
   });

-  describe('Color Distribution', () => {
-    it('distributes colors across the spectrum', () => {
+  describe("Color Distribution", () => {
+    it("distributes colors across the spectrum", () => {
       // Generate hues for many different pubkeys
-      const pubkeys = Array.from({ length: 50 }, (_, i) =>
-        i.toString().repeat(16)
+      const pubkeys = Array.from(
+        { length: 50 },
+        (_, i) => i.toString().repeat(16),
       );
-      const hues = pubkeys.map(pk => pubkeyToHue(pk));
+      const hues = pubkeys.map((pk) => pubkeyToHue(pk));

       // Check that we have hues in different ranges of the spectrum
-      const hasLowHues = hues.some(h => h < 120);
-      const hasMidHues = hues.some(h => h >= 120 && h < 240);
-      const hasHighHues = hues.some(h => h >= 240);
+      const hasLowHues = hues.some((h) => h < 120);
+      const hasMidHues = hues.some((h) => h >= 120 && h < 240);
+      const hasHighHues = hues.some((h) => h >= 240);

       expect(hasLowHues).toBe(true);
       expect(hasMidHues).toBe(true);

@@ -158,7 +160,7 @@ describe('pubkeyToHue', () => {
   });
 });

-describe('HighlightLayer Component', () => {
+describe("HighlightLayer Component", () => {
   let mockNdk: any;
   let mockSubscription: any;
   let eventHandlers: Map<string, Function>;

@@ -190,9 +192,9 @@ describe('HighlightLayer Component', () => {
       textContent: text,
     })),
     createElement: vi.fn((tag: string) => ({
-      className: '',
+      className: "",
       style: {},
-      textContent: '',
+      textContent: "",
     })),
   } as any;
 });

@@ -201,60 +203,60 @@ describe('HighlightLayer Component', () => {
     vi.clearAllMocks();
   });

-  describe('NDK Subscription', () => {
-    it('fetches kind 9802 events with correct filter when eventId provided', () => {
-      const eventId = 'a'.repeat(64);
+  describe("NDK Subscription", () => {
+    it("fetches kind 9802 events with correct filter when eventId provided", () => {
+      const eventId = "a".repeat(64);

       // Simulate calling fetchHighlights
-      mockNdk.subscribe({ kinds: [9802], '#e': [eventId], limit: 100 });
+      mockNdk.subscribe({ kinds: [9802], "#e": [eventId], limit: 100 });

       expect(mockNdk.subscribe).toHaveBeenCalledWith(
         expect.objectContaining({
           kinds: [9802],
-          '#e': [eventId],
+          "#e": [eventId],
           limit: 100,
-        })
+        }),
       );
     });

-    it('fetches kind 9802 events with correct filter when eventAddress provided', () => {
-      const eventAddress = '30040:' + 'a'.repeat(64) + ':chapter-1';
+    it("fetches kind 9802 events with correct filter when eventAddress provided", () => {
+      const eventAddress = "30040:" + "a".repeat(64) + ":chapter-1";

       // Simulate calling fetchHighlights
-      mockNdk.subscribe({ kinds: [9802], '#a': [eventAddress], limit: 100 });
+      mockNdk.subscribe({ kinds: [9802], "#a": [eventAddress], limit: 100 });

       expect(mockNdk.subscribe).toHaveBeenCalledWith(
         expect.objectContaining({
           kinds: [9802],
-          '#a': [eventAddress],
+          "#a": [eventAddress],
           limit: 100,
-        })
+        }),
       );
     });

-    it('fetches with both eventId and eventAddress filters when both provided', () => {
-      const eventId = 'a'.repeat(64);
-      const eventAddress = '30040:' + 'b'.repeat(64) + ':chapter-1';
+    it("fetches with both eventId and eventAddress filters when both provided", () => {
+      const eventId = "a".repeat(64);
+      const eventAddress = "30040:" + "b".repeat(64) + ":chapter-1";

       // Simulate calling fetchHighlights
       mockNdk.subscribe({
         kinds: [9802],
-        '#e': [eventId],
-        '#a': [eventAddress],
+        "#e": [eventId],
+        "#a": [eventAddress],
         limit: 100,
       });

       expect(mockNdk.subscribe).toHaveBeenCalledWith(
         expect.objectContaining({
           kinds: [9802],
-          '#e': [eventId],
-          '#a': [eventAddress],
+          "#e": [eventId],
+          "#a": [eventAddress],
           limit: 100,
-        })
+        }),
       );
     });

-    it('cleans up subscription on unmount', () => {
+    it("cleans up subscription on unmount", () => {
       mockNdk.subscribe({ kinds: [9802], limit: 100 });

       // Simulate unmount by calling stop

@@ -264,10 +266,10 @@ describe('HighlightLayer Component', () => {
     });
   });

-  describe('Color Mapping', () => {
-    it('maps highlights to colors correctly', () => {
-      const pubkey1 = 'a'.repeat(64);
-      const pubkey2 = 'b'.repeat(64);
+  describe("Color Mapping", () => {
+    it("maps highlights to colors correctly", () => {
+      const pubkey1 = "a".repeat(64);
+      const pubkey2 = "b".repeat(64);

       const hue1 = pubkeyToHue(pubkey1);
       const hue2 = pubkeyToHue(pubkey2);

@@ -280,8 +282,8 @@ describe('HighlightLayer Component', () => {
       expect(expectedColor1).not.toBe(expectedColor2);
     });

-    it('uses consistent color for same pubkey', () => {
-      const pubkey = 'abc123def456'.repeat(5) + 'abcd';
+    it("uses consistent color for same pubkey", () => {
+      const pubkey = "abc123def456".repeat(5) + "abcd";
       const hue = pubkeyToHue(pubkey);

       const color1 = `hsla(${hue}, 70%, 60%, 0.3)`;

@@ -290,16 +292,16 @@ describe('HighlightLayer Component', () => {
       expect(color1).toBe(color2);
     });

-    it('generates semi-transparent colors with 0.3 opacity', () => {
-      const pubkey = 'a'.repeat(64);
+    it("generates semi-transparent colors with 0.3 opacity", () => {
+      const pubkey = "a".repeat(64);
       const hue = pubkeyToHue(pubkey);
       const color = `hsla(${hue}, 70%, 60%, 0.3)`;

-      expect(color).toContain('0.3');
+      expect(color).toContain("0.3");
     });

-    it('uses HSL color format with correct values', () => {
-      const pubkey = 'a'.repeat(64);
+    it("uses HSL color format with correct values", () => {
+      const pubkey = "a".repeat(64);
       const hue = pubkeyToHue(pubkey);
       const color = `hsla(${hue}, 70%, 60%, 0.3)`;

@@ -308,20 +310,20 @@ describe('HighlightLayer Component', () => {
     });
   });

-  describe('Highlight Events', () => {
-    it('handles no highlights gracefully', () => {
+  describe("Highlight Events", () => {
+    it("handles no highlights gracefully", () => {
       const highlights: any[] = [];

       expect(highlights.length).toBe(0);
       // Component should render without errors
     });

-    it('handles single highlight from one user', () => {
+    it("handles single highlight from one user", () => {
       const mockHighlight = {
-        id: 'highlight1',
+        id: "highlight1",
         kind: 9802,
-        pubkey: 'a'.repeat(64),
-        content: 'highlighted text',
+        pubkey: "a".repeat(64),
+        content: "highlighted text",
         created_at: Date.now(),
         tags: [],
       };

@@ -329,25 +331,25 @@ describe('HighlightLayer Component', () => {
       const highlights = [mockHighlight];

       expect(highlights.length).toBe(1);
-      expect(highlights[0].pubkey).toBe('a'.repeat(64));
+      expect(highlights[0].pubkey).toBe("a".repeat(64));
     });

-    it('handles multiple highlights from same user', () => {
-      const pubkey = 'a'.repeat(64);
+    it("handles multiple highlights from same user", () => {
+      const pubkey = "a".repeat(64);
       const mockHighlights = [
         {
-          id: 'highlight1',
+          id: "highlight1",
           kind: 9802,
           pubkey: pubkey,
-          content: 'first highlight',
+          content: "first highlight",
           created_at: Date.now(),
           tags: [],
         },
         {
-          id: 'highlight2',
+          id: "highlight2",
           kind: 9802,
           pubkey: pubkey,
-          content: 'second highlight',
+          content: "second highlight",
           created_at: Date.now(),
           tags: [],
         },

@@ -363,33 +365,33 @@ describe('HighlightLayer Component', () => {
       expect(color).toMatch(/^hsla\(\d+, 70%, 60%, 0\.3\)$/);
     });

-    it('handles multiple highlights from different users', () => {
-      const pubkey1 = 'a'.repeat(64);
-      const pubkey2 = 'b'.repeat(64);
-      const pubkey3 = 'c'.repeat(64);
+    it("handles multiple highlights from different users", () => {
+      const pubkey1 = "a".repeat(64);
+      const pubkey2 = "b".repeat(64);
+      const pubkey3 = "c".repeat(64);

       const mockHighlights = [
         {
-          id: 'highlight1',
+          id: "highlight1",
           kind: 9802,
           pubkey: pubkey1,
-          content: 'highlight from user 1',
+          content: "highlight from user 1",
           created_at: Date.now(),
           tags: [],
         },
         {
-          id: 'highlight2',
+          id: "highlight2",
           kind: 9802,
           pubkey: pubkey2,
-          content: 'highlight from user 2',
+          content: "highlight from user 2",
           created_at: Date.now(),
           tags: [],
         },
         {
-          id: 'highlight3',
+          id: "highlight3",
           kind: 9802,
           pubkey: pubkey3,
-          content: 'highlight from user 3',
+          content: "highlight from user 3",
           created_at: Date.now(),
           tags: [],
         },

@@ -407,12 +409,12 @@ describe('HighlightLayer Component', () => {
       expect(hue1).not.toBe(hue3);
     });

-    it('prevents duplicate highlights', () => {
+    it("prevents duplicate highlights", () => {
       const mockHighlight = {
-        id: 'highlight1',
+        id: "highlight1",
         kind: 9802,
-        pubkey: 'a'.repeat(64),
-        content: 'highlighted text',
+        pubkey: "a".repeat(64),
+        content: "highlighted text",
         created_at: Date.now(),
         tags: [],
       };

@@ -420,32 +422,32 @@ describe('HighlightLayer Component', () => {
       const highlights = [mockHighlight];

       // Try to add duplicate
-      const isDuplicate = highlights.some(h => h.id === mockHighlight.id);
+      const isDuplicate = highlights.some((h) => h.id === mockHighlight.id);

       expect(isDuplicate).toBe(true);
       // Should not add duplicate
     });

-    it('handles empty content gracefully', () => {
+    it("handles empty content gracefully", () => {
       const mockHighlight = {
-        id: 'highlight1',
+        id: "highlight1",
         kind: 9802,
-        pubkey: 'a'.repeat(64),
-        content: '',
+        pubkey: "a".repeat(64),
+        content: "",
         created_at: Date.now(),
         tags: [],
       };

       // Should not crash
-      expect(mockHighlight.content).toBe('');
+      expect(mockHighlight.content).toBe("");
     });

-    it('handles whitespace-only content', () => {
+    it("handles whitespace-only content", () => {
       const mockHighlight = {
-        id: 'highlight1',
+        id: "highlight1",
         kind: 9802,
-        pubkey: 'a'.repeat(64),
-        content: ' \n\t ',
+        pubkey: "a".repeat(64),
+        content: " \n\t ",
         created_at: Date.now(),
         tags: [],
       };

@@ -455,9 +457,9 @@ describe('HighlightLayer Component', () => {
     });
   });

-  describe('Highlighter Legend', () => {
-    it('displays legend with correct color for single highlighter', () => {
-      const pubkey = 'abc123def456'.repeat(5) + 'abcd';
+  describe("Highlighter Legend", () => {
+    it("displays legend with correct color for single highlighter", () => {
+      const pubkey = "abc123def456".repeat(5) + "abcd";
       const hue = pubkeyToHue(pubkey);
       const color = `hsla(${hue}, 70%, 60%, 0.3)`;

@@ -471,14 +473,14 @@ describe('HighlightLayer Component', () => {
       expect(legend.shortPubkey).toBe(`${pubkey.slice(0, 8)}...`);
     });

-    it('displays legend with colors for multiple highlighters', () => {
+    it("displays legend with colors for multiple highlighters", () => {
       const pubkeys = [
-        'a'.repeat(64),
-        'b'.repeat(64),
-        'c'.repeat(64),
+        "a".repeat(64),
+        "b".repeat(64),
+        "c".repeat(64),
       ];

-      const legendEntries = pubkeys.map(pubkey => ({
+      const legendEntries = pubkeys.map((pubkey) => ({
         pubkey,
         color: `hsla(${pubkeyToHue(pubkey)}, 70%, 60%, 0.3)`,
         shortPubkey: `${pubkey.slice(0, 8)}...`,

@@ -487,45 +489,45 @@ describe('HighlightLayer Component', () => {
       expect(legendEntries.length).toBe(3);

       // Each should have unique color
-      const colors = legendEntries.map(e => e.color);
+      const colors = legendEntries.map((e) => e.color);
       const uniqueColors = new Set(colors);
       expect(uniqueColors.size).toBe(3);
     });

-    it('shows truncated pubkey in legend', () => {
-      const pubkey = 'abcdefghijklmnop'.repeat(4);
+    it("shows truncated pubkey in legend", () => {
+      const pubkey = "abcdefghijklmnop".repeat(4);
       const shortPubkey = `${pubkey.slice(0, 8)}...`;

-      expect(shortPubkey).toBe('abcdefgh...');
+      expect(shortPubkey).toBe("abcdefgh...");
       expect(shortPubkey.length).toBeLessThan(pubkey.length);
     });

-    it('displays highlight count', () => {
+    it("displays highlight count", () => {
       const highlights = [
-        { id: '1', pubkey: 'a'.repeat(64), content: 'text1' },
-        { id: '2', pubkey: 'b'.repeat(64), content: 'text2' },
-        { id: '3', pubkey: 'a'.repeat(64), content: 'text3' },
+        { id: "1", pubkey: "a".repeat(64), content: "text1" },
+        { id: "2", pubkey: "b".repeat(64), content: "text2" },
+        { id: "3", pubkey: "a".repeat(64), content: "text3" },
       ];

       expect(highlights.length).toBe(3);

       // Count unique highlighters
-      const uniqueHighlighters = new Set(highlights.map(h => h.pubkey));
+      const uniqueHighlighters = new Set(highlights.map((h) => h.pubkey));
       expect(uniqueHighlighters.size).toBe(2);
     });
   });

-  describe('Text Matching', () => {
-    it('matches text case-insensitively', () => {
-      const searchText = 'Hello World';
-      const contentText = 'hello world';
+  describe("Text Matching", () => {
+    it("matches text case-insensitively", () => {
+      const searchText = "Hello World";
+      const contentText = "hello world";

       const index = contentText.toLowerCase().indexOf(searchText.toLowerCase());
       expect(index).toBeGreaterThanOrEqual(0);
     });

-    it('handles special characters in search text', () => {
+    it("handles special characters in search text", () => {
       const searchText = 'text with "quotes" and symbols!';
       const contentText = 'This is text with "quotes" and symbols! in it.';

@@ -534,67 +536,75 @@ describe('HighlightLayer Component', () => {
       expect(index).toBeGreaterThanOrEqual(0);
     });

-    it('handles Unicode characters', () => {
-      const searchText = 'café résumé';
-      const contentText = 'The café résumé was excellent.';
+    it("handles Unicode characters", () => {
+      const searchText = "café résumé";
+      const contentText = "The café résumé was excellent.";

       const index = contentText.toLowerCase().indexOf(searchText.toLowerCase());
       expect(index).toBeGreaterThanOrEqual(0);
     });

-    it('handles multi-line text', () => {
-      const searchText = 'line one\nline two';
-      const contentText = 'This is line one\nline two in the document.';
+    it("handles multi-line text", () => {
+      const searchText = "line one\nline two";
+      const contentText = "This is line one\nline two in the document.";

       const index = contentText.indexOf(searchText);
       expect(index).toBeGreaterThanOrEqual(0);
     });

-    it('does not match partial words when searching for whole words', () => {
-      const searchText = 'cat';
-      const contentText = 'The category is important.';
+    it("does not match partial words when searching for whole words", () => {
+      const searchText = "cat";
+      const contentText = "The category is important.";

       // Simple word boundary check
-      const wordBoundaryMatch = new RegExp(`\\b${searchText}\\b`, 'i').test(contentText);
+      const wordBoundaryMatch = new RegExp(`\\b${searchText}\\b`, "i").test(
+        contentText,
+      );
       expect(wordBoundaryMatch).toBe(false);
     });
   });

-  describe('Subscription Lifecycle', () => {
-    it('registers EOSE event handler', () => {
+  describe("Subscription Lifecycle", () => {
+    it("registers EOSE event handler", () => {
       const subscription = mockNdk.subscribe({ kinds: [9802], limit: 100 });

       // Verify that 'on' method is available for registering handlers
       expect(subscription.on).toBeDefined();

       // Register EOSE handler
-      subscription.on('eose', () => {
+      subscription.on("eose", () => {
         subscription.stop();
       });

       // Verify on was called
-      expect(subscription.on).toHaveBeenCalledWith('eose', expect.any(Function));
+      expect(subscription.on).toHaveBeenCalledWith(
+        "eose",
+        expect.any(Function),
+      );
     });

-    it('registers error event handler', () => {
+    it("registers error event handler", () => {
       const subscription = mockNdk.subscribe({ kinds: [9802], limit: 100 });

       // Verify that 'on' method is available for registering handlers
       expect(subscription.on).toBeDefined();

       // Register error handler
-      subscription.on('error', () => {
+      subscription.on("error", () => {
         subscription.stop();
       });

       // Verify on was called
-      expect(subscription.on).toHaveBeenCalledWith('error', expect.any(Function));
+      expect(subscription.on).toHaveBeenCalledWith(
+        "error",
+        expect.any(Function),
+      );
     });

-    it('stops subscription on timeout', async () => {
+    it("stops subscription on timeout", async () => {
       vi.useFakeTimers();

       mockNdk.subscribe({ kinds: [9802], limit: 100 });

@@ -608,7 +618,7 @@ describe('HighlightLayer Component', () => {
       vi.useRealTimers();
     });

-    it('handles multiple subscription cleanup calls safely', () => {
+    it("handles multiple subscription cleanup calls safely", () => {
       mockNdk.subscribe({ kinds: [9802], limit: 100 });

       // Call stop multiple times

@@ -621,8 +631,8 @@ describe('HighlightLayer Component', () => {
     });
   });

-  describe('Performance', () => {
-    it('handles large number of highlights efficiently', () => {
+  describe("Performance", () => {
+    it("handles large number of highlights efficiently", () => {
       const startTime = Date.now();

       const highlights = Array.from({ length: 1000 }, (_, i) => ({

@@ -636,7 +646,7 @@ describe('HighlightLayer Component', () => {
       // Generate colors for all highlights
       const colorMap = new Map<string, string>();
-      highlights.forEach(h => {
+      highlights.forEach((h) => {
         if (!colorMap.has(h.pubkey)) {
           const hue = pubkeyToHue(h.pubkey);
           colorMap.set(h.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`);

@@ -653,9 +663,9 @@ describe('HighlightLayer Component', () => {
   });
 });

-describe('Integration Tests', () => {
-  describe('Toggle Functionality', () => {
-    it('toggle button shows highlights when clicked', () => {
+describe("Integration Tests", () => {
+  describe("Toggle Functionality", () => {
+    it("toggle button shows highlights when clicked", () => {
       let highlightsVisible = false;

       // Simulate toggle

@@ -664,7 +674,7 @@ describe('Integration Tests', () => {
       expect(highlightsVisible).toBe(true);
     });

-    it('toggle button hides highlights when clicked again', () => {
+    it("toggle button hides highlights when clicked again", () => {
let highlightsVisible = true; let highlightsVisible = true;
// Simulate toggle // Simulate toggle
@ -673,7 +683,7 @@ describe('Integration Tests', () => {
expect(highlightsVisible).toBe(false); expect(highlightsVisible).toBe(false);
}); });
it('toggle state persists between interactions', () => { it("toggle state persists between interactions", () => {
let highlightsVisible = false; let highlightsVisible = false;
highlightsVisible = !highlightsVisible; highlightsVisible = !highlightsVisible;
@ -687,37 +697,38 @@ describe('Integration Tests', () => {
}); });
}); });
describe('Color Format Validation', () => { describe("Color Format Validation", () => {
it('generates semi-transparent colors with 0.3 opacity', () => { it("generates semi-transparent colors with 0.3 opacity", () => {
const pubkeys = [ const pubkeys = [
'a'.repeat(64), "a".repeat(64),
'b'.repeat(64), "b".repeat(64),
'c'.repeat(64), "c".repeat(64),
]; ];
pubkeys.forEach(pubkey => { pubkeys.forEach((pubkey) => {
const hue = pubkeyToHue(pubkey); const hue = pubkeyToHue(pubkey);
const color = `hsla(${hue}, 70%, 60%, 0.3)`; const color = `hsla(${hue}, 70%, 60%, 0.3)`;
expect(color).toContain('0.3'); expect(color).toContain("0.3");
}); });
}); });
it('uses HSL color format with correct saturation and lightness', () => { it("uses HSL color format with correct saturation and lightness", () => {
const pubkey = 'a'.repeat(64); const pubkey = "a".repeat(64);
const hue = pubkeyToHue(pubkey); const hue = pubkeyToHue(pubkey);
const color = `hsla(${hue}, 70%, 60%, 0.3)`; const color = `hsla(${hue}, 70%, 60%, 0.3)`;
expect(color).toContain('70%'); expect(color).toContain("70%");
expect(color).toContain('60%'); expect(color).toContain("60%");
}); });
it('generates valid CSS color strings', () => { it("generates valid CSS color strings", () => {
const pubkeys = Array.from({ length: 20 }, (_, i) => const pubkeys = Array.from(
String.fromCharCode(97 + i).repeat(64) { length: 20 },
(_, i) => String.fromCharCode(97 + i).repeat(64),
); );
pubkeys.forEach(pubkey => { pubkeys.forEach((pubkey) => {
const hue = pubkeyToHue(pubkey); const hue = pubkeyToHue(pubkey);
const color = `hsla(${hue}, 70%, 60%, 0.3)`; const color = `hsla(${hue}, 70%, 60%, 0.3)`;
@ -727,8 +738,8 @@ describe('Integration Tests', () => {
}); });
}); });
describe('End-to-End Flow', () => { describe("End-to-End Flow", () => {
it('complete highlight workflow', () => { it("complete highlight workflow", () => {
// 1. Start with no highlights visible // 1. Start with no highlights visible
let highlightsVisible = false; let highlightsVisible = false;
let highlights: any[] = []; let highlights: any[] = [];
@ -739,18 +750,18 @@ describe('Integration Tests', () => {
// 2. Fetch highlights // 2. Fetch highlights
const mockHighlights = [ const mockHighlights = [
{ {
id: 'h1', id: "h1",
kind: 9802, kind: 9802,
pubkey: 'a'.repeat(64), pubkey: "a".repeat(64),
content: 'first highlight', content: "first highlight",
created_at: Date.now(), created_at: Date.now(),
tags: [], tags: [],
}, },
{ {
id: 'h2', id: "h2",
kind: 9802, kind: 9802,
pubkey: 'b'.repeat(64), pubkey: "b".repeat(64),
content: 'second highlight', content: "second highlight",
created_at: Date.now(), created_at: Date.now(),
tags: [], tags: [],
}, },
@ -761,7 +772,7 @@ describe('Integration Tests', () => {
// 3. Generate color map // 3. Generate color map
const colorMap = new Map<string, string>(); const colorMap = new Map<string, string>();
highlights.forEach(h => { highlights.forEach((h) => {
if (!colorMap.has(h.pubkey)) { if (!colorMap.has(h.pubkey)) {
const hue = pubkeyToHue(h.pubkey); const hue = pubkeyToHue(h.pubkey);
colorMap.set(h.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`); colorMap.set(h.pubkey, `hsla(${hue}, 70%, 60%, 0.3)`);
@ -783,17 +794,17 @@ describe('Integration Tests', () => {
expect(highlightsVisible).toBe(false); expect(highlightsVisible).toBe(false);
}); });
it('handles event updates correctly', () => { it("handles event updates correctly", () => {
let eventId = 'event1'; let eventId = "event1";
let highlights: any[] = []; let highlights: any[] = [];
// Initial load // Initial load
highlights = [ highlights = [
{ {
id: 'h1', id: "h1",
kind: 9802, kind: 9802,
pubkey: 'a'.repeat(64), pubkey: "a".repeat(64),
content: 'highlight 1', content: "highlight 1",
created_at: Date.now(), created_at: Date.now(),
tags: [], tags: [],
}, },
@ -802,7 +813,7 @@ describe('Integration Tests', () => {
expect(highlights.length).toBe(1); expect(highlights.length).toBe(1);
// Event changes // Event changes
eventId = 'event2'; eventId = "event2";
highlights = []; highlights = [];
expect(highlights.length).toBe(0); expect(highlights.length).toBe(0);
@ -810,22 +821,22 @@ describe('Integration Tests', () => {
// New highlights loaded // New highlights loaded
highlights = [ highlights = [
{ {
id: 'h2', id: "h2",
kind: 9802, kind: 9802,
pubkey: 'b'.repeat(64), pubkey: "b".repeat(64),
content: 'highlight 2', content: "highlight 2",
created_at: Date.now(), created_at: Date.now(),
tags: [], tags: [],
}, },
]; ];
expect(highlights.length).toBe(1); expect(highlights.length).toBe(1);
expect(highlights[0].id).toBe('h2'); expect(highlights[0].id).toBe("h2");
}); });
}); });
describe('Error Handling', () => { describe("Error Handling", () => {
it('handles missing event ID and address gracefully', () => { it("handles missing event ID and address gracefully", () => {
const eventId = undefined; const eventId = undefined;
const eventAddress = undefined; const eventAddress = undefined;
@ -834,25 +845,25 @@ describe('Integration Tests', () => {
expect(eventAddress).toBeUndefined(); expect(eventAddress).toBeUndefined();
}); });
it('handles subscription errors gracefully', () => { it("handles subscription errors gracefully", () => {
const error = new Error('Subscription failed'); const error = new Error("Subscription failed");
// Should log error but not crash // Should log error but not crash
expect(error.message).toBe('Subscription failed'); expect(error.message).toBe("Subscription failed");
}); });
it('handles malformed highlight events', () => { it("handles malformed highlight events", () => {
const malformedHighlight = { const malformedHighlight = {
id: 'h1', id: "h1",
kind: 9802, kind: 9802,
pubkey: '', // Empty pubkey pubkey: "", // Empty pubkey
content: undefined, // Missing content content: undefined, // Missing content
created_at: Date.now(), created_at: Date.now(),
tags: [], tags: [],
}; };
// Should handle gracefully // Should handle gracefully
expect(malformedHighlight.pubkey).toBe(''); expect(malformedHighlight.pubkey).toBe("");
expect(malformedHighlight.content).toBeUndefined(); expect(malformedHighlight.content).toBeUndefined();
}); });
}); });
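The color tests above assume a deterministic `pubkeyToHue` helper. A minimal sketch of such a mapping follows; the hashing scheme here is an assumption for illustration, not Alexandria's actual implementation in `src/lib/utils/`:

```typescript
// Assumed sketch: derive a stable hue in [0, 360) from a hex pubkey.
// The real helper lives in src/lib/utils; this hash is illustrative only.
function pubkeyToHue(pubkey: string): number {
  let hash = 0;
  for (let i = 0; i < pubkey.length; i++) {
    // Simple polynomial rolling hash, folded into the hue range.
    hash = (hash * 31 + pubkey.charCodeAt(i)) % 360;
  }
  return hash;
}

// Same pubkey always yields the same semi-transparent highlight color,
// matching the hsla(hue, 70%, 60%, 0.3) pattern the tests check for.
function highlightColor(pubkey: string): string {
  return `hsla(${pubkeyToHue(pubkey)}, 70%, 60%, 0.3)`;
}
```

Keying the hue to the pubkey means a given author's highlights keep one color across renders without storing any per-user state.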

45 tests/unit/publication_tree_processor.test.ts

@@ -5,15 +5,19 @@
 * using deep_hierarchy_test.adoc to verify NKBIP-01 compliance.
 */

import { beforeAll, describe, expect, it } from "vitest";
import { readFileSync } from "fs";
import {
  getSupportedParseLevels,
  parseAsciiDocWithTree,
  validateParseLevel,
} from "../../src/lib/utils/asciidoc_publication_parser.js";

// Mock NDK for testing
const mockNDK = {
  activeUser: {
    pubkey: "test-pubkey-12345",
  },
} as any;

// Read the test document
@@ -21,7 +25,7 @@ const testDocumentPath = "./test_data/AsciidocFiles/deep_hierarchy_test.adoc";
let testContent: string;
try {
  testContent = readFileSync(testDocumentPath, "utf-8");
} catch (error) {
  console.error("Failed to read test document:", error);
  testContent = `= Deep Hierarchical Document Test
@@ -65,7 +69,6 @@ A second main section to ensure we have balanced content at the top level.`;
}

describe("NKBIP-01 Publication Tree Processor", () => {
  it("should validate parse levels correctly", () => {
    // Test valid parse levels
    expect(validateParseLevel(2)).toBe(true);
@@ -98,12 +101,12 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(result.contentEvents.length).toBe(2);

    // All content events should be kind 30041
    result.contentEvents.forEach((event) => {
      expect(event.kind).toBe(30041);
    });

    // Check titles of level 2 sections
    const contentTitles = result.contentEvents.map((e) =>
      e.tags.find((t: string[]) => t[0] === "title")?.[1]
    );
    expect(contentTitles).toContain("Level 2: Main Sections");
@@ -114,8 +117,11 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(firstSectionContent).toBeDefined();

    // Should contain level 3, 4, 5 content as nested AsciiDoc markup
    expect(firstSectionContent.includes("=== Level 3: Subsections")).toBe(true);
    expect(firstSectionContent.includes("==== Level 4: Sub-subsections")).toBe(
      true,
    );
    expect(firstSectionContent.includes("===== Level 5: Deep Subsections"))
      .toBe(true);
  });

  it("should parse Level 3 with NKBIP-01 intermediate structure", async () => {
@@ -129,19 +135,19 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(result.indexEvent?.kind).toBe(30040);

    // Should have mix of 30040 (for level 2 sections with children) and 30041 (for content)
    const kinds = result.contentEvents.map((e) => e.kind);
    expect(kinds).toContain(30040); // Level 2 sections with children
    expect(kinds).toContain(30041); // Level 3 content sections

    // Level 2 sections with children should be 30040 index events
    const level2WithChildrenEvents = result.contentEvents.filter((e) =>
      e.kind === 30040 &&
      e.tags.find((t: string[]) => t[0] === "title")?.[1]?.includes("Level 2:")
    );
    expect(level2WithChildrenEvents.length).toBe(2); // Both level 2 sections have children

    // Should have 30041 events for level 3 content
    const level3ContentEvents = result.contentEvents.filter((e) =>
      e.kind === 30041 &&
      e.tags.find((t: string[]) => t[0] === "title")?.[1]?.includes("Level 3:")
    );
@@ -158,12 +164,12 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(result.indexEvent).toBeDefined();
    expect(result.indexEvent?.kind).toBe(30040);

    const kinds = result.contentEvents.map((e) => e.kind);
    expect(kinds).toContain(30040); // Level 2 sections with children
    expect(kinds).toContain(30041); // Content sections

    // Check that we have level 4 content sections
    const contentTitles = result.contentEvents.map((e) =>
      e.tags.find((t: string[]) => t[0] === "title")?.[1]
    );
    expect(contentTitles).toContain("Level 4: Sub-subsections");
@@ -180,7 +186,7 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(result.indexEvent?.kind).toBe(30040);

    // Should include level 5 sections as content events
    const contentTitles = result.contentEvents.map((e) =>
      e.tags.find((t: string[]) => t[0] === "title")?.[1]
    );
    expect(contentTitles).toContain("Level 5: Deep Subsections");
@@ -204,7 +210,7 @@ describe("NKBIP-01 Publication Tree Processor", () => {
    expect(titleTag![1]).toBe("Deep Hierarchical Document Test");

    // Test content events structure - mix of 30040 and 30041
    result.contentEvents.forEach((event) => {
      expect([30040, 30041]).toContain(event.kind);
      expect(event.tags).toBeDefined();
      expect(event.content).toBeDefined();
@@ -262,7 +268,7 @@ Content of second note.`;
    expect(result.contentEvents.length).toBe(2);

    // All events should be 30041 content events
    result.contentEvents.forEach((event) => {
      expect(event.kind).toBe(30041);
    });
  });
@@ -280,5 +286,4 @@ Content of second note.`;
    expect(result.metadata.eventStructure).toBeDefined();
    expect(Array.isArray(result.metadata.eventStructure)).toBe(true);
  });
});
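The parse-level tests above only show that level 2 validates. Assuming (as the Level 2 through Level 5 tests suggest, but the source does not confirm) that supported levels map to AsciiDoc heading depths `==` through `=====`, the two validators might look like:

```typescript
// Assumed sketch of the parse-level helpers exercised above; the real ones
// live in src/lib/utils/asciidoc_publication_parser.ts. The 2-5 range is
// inferred from the Level 2/3/4/5 test cases, not confirmed by the source.
function getSupportedParseLevels(): number[] {
  // Level N splits the document at AsciiDoc headings of depth N ("==" = 2).
  return [2, 3, 4, 5];
}

function validateParseLevel(level: number): boolean {
  return getSupportedParseLevels().includes(level);
}
```

Level 1 (`=`) is excluded because the document title always becomes the root 30040 index event rather than a split point, under this sketch's assumptions.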

500 tests/zettel-publisher-tdd.test.ts

@@ -14,8 +14,8 @@
 * 7. Custom attributes: all :key: value pairs preserved as event tags
 */

import fs from "fs";
import path from "path";

// Test framework
interface TestCase {
@@ -40,7 +40,9 @@ class TestFramework {
      },
      toEqual: (expected: any) => {
        if (JSON.stringify(actual) === JSON.stringify(expected)) return true;
        throw new Error(
          `Expected ${JSON.stringify(expected)}, got ${JSON.stringify(actual)}`,
        );
      },
      toContain: (expected: any) => {
        if (actual && actual.includes && actual.includes(expected)) return true;
@@ -48,9 +50,11 @@ class TestFramework {
      },
      not: {
        toContain: (expected: any) => {
          if (actual && actual.includes && !actual.includes(expected)) {
            return true;
          }
          throw new Error(`Expected "${actual}" NOT to contain "${expected}"`);
        },
      },
      toBeTruthy: () => {
        if (actual) return true;
@@ -58,8 +62,12 @@ class TestFramework {
      },
      toHaveLength: (expected: number) => {
        if (actual && actual.length === expected) return true;
        throw new Error(
          `Expected length ${expected}, got ${
            actual ? actual.length : "undefined"
          }`,
        );
      },
    };
  }
@@ -87,57 +95,68 @@ class TestFramework {
const test = new TestFramework();

// Load test data files
const testDataPath = path.join(process.cwd(), "test_data", "AsciidocFiles");
const understandingKnowledge = fs.readFileSync(
  path.join(testDataPath, "understanding_knowledge.adoc"),
  "utf-8",
);
const desire = fs.readFileSync(path.join(testDataPath, "desire.adoc"), "utf-8");

// =============================================================================
// PHASE 1: Core Data Structure Tests (Based on Real Test Data)
// =============================================================================

test.test("Understanding Knowledge: Document metadata should be extracted from = level", () => {
  // Expected 30040 metadata from understanding_knowledge.adoc
  const expectedDocMetadata = {
    title: "Understanding Knowledge",
    image: "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg",
    published: "2025-04-21",
    language: "en, ISO-639-1",
    tags: ["knowledge", "philosophy", "education"],
    type: "text",
  };

  // Test will pass when document parsing extracts these correctly
  test.expect(expectedDocMetadata.title).toBe("Understanding Knowledge");
  test.expect(expectedDocMetadata.tags).toHaveLength(3);
  test.expect(expectedDocMetadata.type).toBe("text");
});

test.test("Desire: Document metadata should include all custom attributes", () => {
  // Expected 30040 metadata from desire.adoc
  const expectedDocMetadata = {
    title: "Desire Part 1: Mimesis",
    image: "https://i.nostr.build/hGzyi4c3YhTwoCCe.png",
    published: "2025-07-02",
    language: "en, ISO-639-1",
    tags: ["memetics", "philosophy", "desire"],
    type: "podcastArticle",
  };

  test.expect(expectedDocMetadata.type).toBe("podcastArticle");
  test.expect(expectedDocMetadata.tags).toContain("memetics");
});

test.test("Iterative ParsedAsciiDoc interface should support level-based parsing", () => {
  // Test the ITERATIVE interface structure (not recursive)
  // Based on docreference.md - Level 2 parsing example
  const mockLevel2Structure = {
    metadata: {
      title: "Programming Fundamentals Guide",
      tags: ["programming", "fundamentals"],
    },
    content: "This is the main introduction to the programming guide.",
    title: "Programming Fundamentals Guide",
    sections: [
      {
        metadata: {
          title: "Data Structures",
          tags: ["arrays", "lists", "trees"],
          difficulty: "intermediate",
        },
        content:
          `Understanding fundamental data structures is crucial for effective programming.

=== Arrays and Lists
@@ -155,11 +174,16 @@ Linked lists use pointers to connect elements.
=== Trees and Graphs

Tree and graph structures enable hierarchical and networked data representation.`,
        title: "Data Structures",
      },
      {
        metadata: {
          title: "Algorithms",
          tags: ["sorting", "searching", "optimization"],
          difficulty: "advanced",
        },
        content:
          `Algorithmic thinking forms the foundation of efficient problem-solving.

=== Sorting Algorithms
@@ -172,54 +196,64 @@ Bubble sort repeatedly steps through the list, compares adjacent elements.
==== Quick Sort

Quick sort uses divide-and-conquer approach with pivot selection.`,
        title: "Algorithms",
      },
    ],
  };

  // Verify ITERATIVE structure: only level 2 sections, containing ALL subsections
  test.expect(mockLevel2Structure.sections).toHaveLength(2);
  test.expect(mockLevel2Structure.sections[0].title).toBe("Data Structures");
  test.expect(mockLevel2Structure.sections[0].content).toContain(
    "=== Arrays and Lists",
  );
  test.expect(mockLevel2Structure.sections[0].content).toContain(
    "==== Dynamic Arrays",
  );
  test.expect(mockLevel2Structure.sections[1].content).toContain(
    "==== Quick Sort",
  );
});

// =============================================================================
// PHASE 2: Content Processing Tests (Header Separation)
// =============================================================================

test.test("Section content should NOT contain its own header", () => {
  // From understanding_knowledge.adoc: "== Preface" section
  const expectedPrefaceContent = `[NOTE]
This essay was written to outline and elaborate on the purpose of the Nostr client Alexandria. No formal academic citations are included as this serves primarily as a conceptual foundation, inviting readers to experience related ideas connecting and forming as more content becomes uploaded. Traces of AI edits and guidance are left, but the essay style is still my own. Over time this essay may change its wording, structure and content.

-- liminal`;

  // Should NOT contain "== Preface"
  test.expect(expectedPrefaceContent).not.toContain("== Preface");
  test.expect(expectedPrefaceContent).toContain("[NOTE]");
});

test.test("Introduction section should separate from its subsections", () => {
  // From understanding_knowledge.adoc
  const expectedIntroContent =
    `image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]`;

  // Should NOT contain subsection content or headers
  test.expect(expectedIntroContent).not.toContain("=== Why Investigate");
  test.expect(expectedIntroContent).not.toContain(
    "Understanding the nature of knowledge",
  );
  test.expect(expectedIntroContent).toContain("image:https://i.nostr.build");
});

test.test("Subsection content should be cleanly separated", () => {
  // "=== Why Investigate the Nature of Knowledge?" subsection
  const expectedSubsectionContent =
    `Understanding the nature of knowledge itself is fundamental, distinct from simply studying how we learn or communicate. Knowledge exests first as representations within individuals, separate from how we interact with it...`;

  // Should NOT contain its own header
  test.expect(expectedSubsectionContent).not.toContain("=== Why Investigate");
  test.expect(expectedSubsectionContent).toContain("Understanding the nature");
});

test.test("Deep headers (====) should have proper newlines", () => {
  // From "=== The Four Perspectives" section with ==== subsections
  const expectedFormatted = `
==== 1. The Building Blocks (Material Cause)
@@ -230,173 +264,198 @@ Just as living organisms are made up of cells, knowledge systems are built from
If you've ever seen how mushrooms connect through underground networks...`;

  test.expect(expectedFormatted).toContain(
    "\n==== 1. The Building Blocks (Material Cause)\n",
  );
  test.expect(expectedFormatted).toContain(
    "\n==== 2. The Pattern of Organization (Formal Cause)\n",
  );
});

// =============================================================================
// PHASE 3: Publishing Logic Tests (30040/30041 Structure)
// =============================================================================

test.test("Understanding Knowledge should create proper 30040 index event", () => {
  // Expected 30040 index event structure
  const expectedIndexEvent = {
    kind: 30040,
    content: "", // Index events have empty content
    tags: [
      ["d", "understanding-knowledge"],
      ["title", "Understanding Knowledge"],
      ["image", "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg"],
      ["published", "2025-04-21"],
      ["language", "en, ISO-639-1"],
      ["t", "knowledge"],
      ["t", "philosophy"],
      ["t", "education"],
      ["type", "text"],
      // a-tags referencing sections
      ["a", "30041:pubkey:understanding-knowledge-preface"],
      [
        "a",
        "30041:pubkey:understanding-knowledge-introduction-knowledge-as-a-living-ecosystem",
      ],
      [
        "a",
        "30041:pubkey:understanding-knowledge-i-material-cause-the-substance-of-knowledge",
      ],
      // ... more a-tags for each section
    ],
  };

  test.expect(expectedIndexEvent.kind).toBe(30040);
  test.expect(expectedIndexEvent.content).toBe("");
  test.expect(expectedIndexEvent.tags.filter(([k]) => k === "t")).toHaveLength(
    3,
  );
  test.expect(
    expectedIndexEvent.tags.find(([k, v]) => k === "type" && v === "text"),
  ).toBeTruthy();
});
test.test("Understanding Knowledge sections should create proper 30041 events", () => {
  // Expected 30041 events for main sections
  const expectedSectionEvents = [
    {
      kind: 30041,
      content:
        `[NOTE]\nThis essay was written to outline and elaborate on the purpose of the Nostr client Alexandria...`,
      tags: [
        ["d", "understanding-knowledge-preface"],
        ["title", "Preface"],
      ],
    },
    {
      kind: 30041,
      content: `image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]`,
      tags: [
        [
          "d",
          "understanding-knowledge-introduction-knowledge-as-a-living-ecosystem",
        ],
        ["title", "Introduction: Knowledge as a Living Ecosystem"],
      ],
    },
  ];

  expectedSectionEvents.forEach((event) => {
    test.expect(event.kind).toBe(30041);
    test.expect(event.content).toBeTruthy();
    test.expect(event.tags.find(([k]) => k === "d")).toBeTruthy();
    test.expect(event.tags.find(([k]) => k === "title")).toBeTruthy();
  });
});
test.test("Level-based parsing should create correct 30040/30041 structure", () => {
  // Based on docreference.md examples

  // Level 2 parsing: only == sections become events, containing all subsections
  const expectedLevel2Events = {
    mainIndex: {
      kind: 30040,
      content: "",
      tags: [
        ["d", "programming-fundamentals-guide"],
        ["title", "Programming Fundamentals Guide"],
        ["a", "30041:author_pubkey:data-structures"],
        ["a", "30041:author_pubkey:algorithms"],
      ],
    },
    dataStructuresSection: {
      kind: 30041,
      content:
        "Understanding fundamental data structures...\n\n=== Arrays and Lists\n\n...==== Dynamic Arrays\n\n...==== Linked Lists\n\n...",
      tags: [
        ["d", "data-structures"],
        ["title", "Data Structures"],
        ["difficulty", "intermediate"],
      ],
    },
  };

  // Level 3 parsing: == sections become 30040 indices, === sections become 30041 events
  const expectedLevel3Events = {
    mainIndex: {
      kind: 30040,
      content: "",
      tags: [
        ["d", "programming-fundamentals-guide"],
        ["title", "Programming Fundamentals Guide"],
        ["a", "30040:author_pubkey:data-structures"], // Now references sub-index
        ["a", "30040:author_pubkey:algorithms"],
      ],
    },
    dataStructuresIndex: {
      kind: 30040,
      content: "",
      tags: [
        ["d", "data-structures"],
        ["title", "Data Structures"],
        ["a", "30041:author_pubkey:data-structures-content"],
        ["a", "30041:author_pubkey:arrays-and-lists"],
        ["a", "30041:author_pubkey:trees-and-graphs"],
      ],
    },
    arraysAndListsSection: {
      kind: 30041,
      content:
        "Arrays are contiguous...\n\n==== Dynamic Arrays\n\n...==== Linked Lists\n\n...",
      tags: [
        ["d", "arrays-and-lists"],
        ["title", "Arrays and Lists"],
      ],
    },
  };

  test.expect(expectedLevel2Events.mainIndex.kind).toBe(30040);
  test.expect(expectedLevel2Events.dataStructuresSection.kind).toBe(30041);
  test.expect(expectedLevel2Events.dataStructuresSection.content).toContain(
    "=== Arrays and Lists",
  );
  test.expect(expectedLevel3Events.dataStructuresIndex.kind).toBe(30040);
  test.expect(expectedLevel3Events.arraysAndListsSection.content).toContain(
    "==== Dynamic Arrays",
  );
});
// =============================================================================
// PHASE 4: Smart Publishing System Tests
// =============================================================================

test.test("Content type detection should work for both test files", () => {
  const testCases = [
    {
      name: "Understanding Knowledge (article)",
      content: understandingKnowledge,
      expected: "article",
    },
    {
      name: "Desire (article)",
      content: desire,
      expected: "article",
    },
    {
      name: "Scattered notes format",
      content: "== Note 1\nContent\n\n== Note 2\nMore content",
      expected: "scattered-notes",
    },
  ];

  testCases.forEach(({ name, content, expected }) => {
    const hasDocTitle = content.trim().startsWith("=") &&
      !content.trim().startsWith("==");
    const hasSections = content.includes("==");

    let detected;
    if (hasDocTitle) {
      detected = "article";
    } else if (hasSections) {
      detected = "scattered-notes";
    } else {
      detected = "none";
    }

    console.log(`  ${name}: detected ${detected}`);
@@ -404,14 +463,27 @@ test.test('Content type detection should work for both test files', () => {
  });
});

test.test("Parse level should affect event structure correctly", () => {
  // Understanding Knowledge has structure: = > == (6 sections) > === (many subsections) > ====
  // Based on actual content analysis
  const levelEventCounts = [
    { level: 1, description: "Only document index", events: 1 },
    {
      level: 2,
      description: "Document index + level 2 sections (==)",
      events: 7,
    }, // 1 index + 6 sections
    {
      level: 3,
      description:
        "Document index + section indices + level 3 subsections (===)",
      events: 20,
    }, // More complex
    {
      level: 4,
      description: "Full hierarchy including level 4 (====)",
      events: 35,
    },
  ];

  levelEventCounts.forEach(({ level, description, events }) => {
@@ -424,27 +496,27 @@ test.test('Parse level should affect event structure correctly', () => {
// PHASE 5: Integration Tests (End-to-End Workflow)
// =============================================================================

test.test("Full Understanding Knowledge publishing workflow (Level 2)", async () => {
  // Mock the complete ITERATIVE workflow
  const mockWorkflow = {
    parseLevel2: (content: string) => ({
      metadata: {
        title: "Understanding Knowledge",
        image: "https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg",
        published: "2025-04-21",
        tags: ["knowledge", "philosophy", "education"],
        type: "text",
      },
      title: "Understanding Knowledge",
      content: "Introduction content before any sections",
      sections: [
        {
          title: "Preface",
          content: "[NOTE]\nThis essay was written to outline...",
          metadata: { title: "Preface" },
        },
        {
          title: "Introduction: Knowledge as a Living Ecosystem",
          // Contains ALL subsections (===, ====) in content
          content: `image:https://i.nostr.build/IUs0xNyUEf5hXTFL.jpg[library]
@@ -461,41 +533,46 @@ Traditionally, knowledge has been perceived as a static repository...
===== 1. The Building Blocks (Material Cause)

Just as living organisms are made up of cells...`,
          metadata: { title: "Introduction: Knowledge as a Living Ecosystem" },
        },
        // ... 4 more sections (Material Cause, Formal Cause, Efficient Cause, Final Cause)
      ],
    }),

    buildLevel2Events: (parsed: any) => ({
      indexEvent: {
        kind: 30040,
        content: "",
        tags: [
          ["d", "understanding-knowledge"],
          ["title", parsed.title],
          ["image", parsed.metadata.image],
          ["t", "knowledge"],
          ["t", "philosophy"],
          ["t", "education"],
          ["type", "text"],
          ["a", "30041:pubkey:preface"],
          ["a", "30041:pubkey:introduction-knowledge-as-a-living-ecosystem"],
        ],
      },
      sectionEvents: parsed.sections.map((s: any) => ({
        kind: 30041,
        content: s.content,
        tags: [
          ["d", s.title.toLowerCase().replace(/[^a-z0-9]+/g, "-")],
          ["title", s.title],
        ],
      })),
    }),

    publish: (events: any) => ({
      success: true,
      published: events.sectionEvents.length + 1,
      eventIds: [
        "main-index",
        ...events.sectionEvents.map((_: any, i: number) => `section-${i}`),
      ],
    }),
  };

  // Test the full Level 2 workflow
@@ -503,28 +580,37 @@ Just as living organisms are made up of cells...`,
  const events = mockWorkflow.buildLevel2Events(parsed);
  const result = mockWorkflow.publish(events);

  test.expect(parsed.metadata.title).toBe("Understanding Knowledge");
  test.expect(parsed.sections).toHaveLength(2);
  test.expect(events.indexEvent.kind).toBe(30040);
  test.expect(events.sectionEvents).toHaveLength(2);
  test.expect(events.sectionEvents[1].content).toContain("=== Why Investigate"); // Contains subsections
  test.expect(events.sectionEvents[1].content).toContain(
    "===== 1. The Building Blocks",
  ); // Contains deeper levels
  test.expect(result.success).toBeTruthy();
  test.expect(result.published).toBe(3); // 1 index + 2 sections
});

test.test("Error handling for malformed content", () => {
  const invalidCases = [
    {
      content: "== Section\n=== Subsection\n==== Missing content",
      error: "Empty content sections",
    },
    {
      content: "= Title\n\n== Section\n==== Skipped level",
      error: "Invalid header nesting",
    },
    { content: "", error: "Empty document" },
  ];

  invalidCases.forEach(({ content, error }) => {
    // Mock error detection
    const hasEmptySections = content.includes("Missing content");
    const hasSkippedLevels = content.includes("====") &&
      !content.includes("===");
    const isEmpty = content.trim() === "";

    const shouldError = hasEmptySections || hasSkippedLevels || isEmpty;
    test.expect(shouldError).toBeTruthy();
@@ -535,26 +621,40 @@ test.test('Error handling for malformed content', () => {
// Test Execution
// =============================================================================

console.log("🎯 ZettelPublisher Test-Driven Development (ITERATIVE)\n");

console.log("📋 Test Data Analysis:");
console.log(
  `- Understanding Knowledge: ${
    understandingKnowledge.split("\n").length
  } lines`,
);
console.log(`- Desire: ${desire.split("\n").length} lines`);
console.log(
  "- Both files use = document title with metadata directly underneath",
);
console.log("- Sections use == with deep nesting (===, ====, =====)");
console.log("- Custom attributes like :type: podcastArticle need preservation");
console.log(
  "- CRITICAL: Structure is ITERATIVE not recursive (per docreference.md)\n",
);

test.run().then((success) => {
  if (success) {
    console.log("\n🎉 All tests defined! Ready for ITERATIVE implementation.");
    console.log("\n📋 Implementation Plan:");
    console.log("1. ✅ Update ParsedAsciiDoc interface for ITERATIVE parsing");
    console.log(
      "2. ✅ Fix content processing (header separation, custom attributes)",
    );
    console.log(
      "3. ✅ Implement level-based publishing logic (30040/30041 structure)",
    );
    console.log("4. ✅ Add parse-level controlled event generation");
    console.log("5. ✅ Create context-aware UI with level selector");
    console.log("\n🔄 Each level can be developed and tested independently!");
  } else {
    console.log(
      "\n❌ Tests ready - implement ITERATIVE features to make them pass!",
    );
  }
}).catch(console.error);