Compare commits

...

4 Commits · 22 files changed

  1. .gitignore (7)
  2. DOCKER.md (235)
  3. Dockerfile (105)
  4. docker-compose-prod.yml (56)
  5. internal/asciidoc/processor.go (402)
  6. internal/cache/cache.go (89)
  7. internal/cache/feed_cache.go (54)
  8. internal/cache/media_cache.go (198)
  9. internal/cache/page.go (29)
  10. internal/cache/rewarm.go (411)
  11. internal/generator/html.go (144)
  12. internal/server/handlers.go (219)
  13. internal/server/server.go (2)
  14. package-lock.json (42)
  15. package.json (6)
  16. scripts/process-content.js (45)
  17. static/css/highlight.js.css (10)
  18. static/css/main.css (113)
  19. static/js/highlight.min.js (1213)
  20. templates/base.html (8)
  21. templates/events.html (68)
  22. templates/page.html (23)

7
.gitignore

@@ -45,6 +45,9 @@ yarn-error.log*
.npm
# Note: package-lock.json should be committed for reproducible builds
# Local gc-parser directory (the actual repo is at ../gc-parser)
gc-parser/
# IDE and editor files
.vscode/
.idea/
@@ -75,9 +78,9 @@ Thumbs.db
.AppleDouble
.LSOverride
# Cache directories
# Cache directories (runtime cache, not source code)
.cache/
cache/
/cache/
# Build artifacts
*.a

235
DOCKER.md

@@ -1,235 +0,0 @@
# Docker Setup for GitCitadel Online
This guide explains how to run GitCitadel Online using Docker.
## Prerequisites
- Docker (version 20.10 or later)
- Docker Compose (version 2.0 or later, optional but recommended)
- Network access (for downloading dependencies and connecting to Nostr relays)
## Image Details
The Docker image uses **Alpine Linux** for a smaller footprint (~50MB base image vs ~200MB+ for Debian). This works well because:
- The Go binary is statically compiled (`CGO_ENABLED=0`), so no C library dependencies
- Node.js packages (`@asciidoctor/core`, `marked`) are pure JavaScript with no native bindings
- Alpine's musl libc is sufficient for our use case
If you encounter any compatibility issues, you can modify the Dockerfile to use Debian-based images (`golang:1.22` and `node:20-slim`), though this will increase the image size.
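Such a swap would be a sketch along these lines (only the `FROM` lines and the package manager change; the stage layout follows the Dockerfile shown later in this diff):

```dockerfile
# Debian-based variant of the build stages (larger images, glibc instead of musl)
FROM golang:1.22 AS go-builder
WORKDIR /build
COPY . .
RUN CGO_ENABLED=0 GOOS=linux go build -o gitcitadel-online ./cmd/server

FROM node:20-slim
WORKDIR /app
COPY --from=go-builder /build/gitcitadel-online .
# apt-get replaces apk for runtime packages
RUN apt-get update \
    && apt-get install -y --no-install-recommends ca-certificates wget \
    && rm -rf /var/lib/apt/lists/*
```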
## Quick Start
### Using Docker Compose (Recommended)
1. **Create your configuration file:**
```bash
cp config.yaml.example config.yaml
# Edit config.yaml with your Nostr indices, relay URLs, and settings
```
2. **Build and run:**
```bash
docker-compose up -d
```
3. **View logs:**
```bash
docker-compose logs -f
```
4. **Stop the container:**
```bash
docker-compose down
```
### Using Docker directly
1. **Build the image:**
```bash
docker build -t gitcitadel-online .
```
2. **Create config file:**
```bash
cp config.yaml.example config.yaml
# Edit config.yaml with your settings
```
3. **Run the container:**
```bash
docker run -d \
--name gitcitadel-online \
-p 8080:8080 \
-v $(pwd)/config.yaml:/app/config.yaml:ro \
-v $(pwd)/cache:/app/cache \
--restart unless-stopped \
gitcitadel-online
```
4. **View logs:**
```bash
docker logs -f gitcitadel-online
```
5. **Stop the container:**
```bash
docker stop gitcitadel-online
docker rm gitcitadel-online
```
## Configuration
### Config File
The `config.yaml` file must be mounted into the container. The default path is `/app/config.yaml`.
You can override the config path using the `--config` flag:
```bash
docker run ... gitcitadel-online --config /path/to/config.yaml
```
### Port Mapping
By default, the application runs on port 8080. You can change the host port mapping:
```bash
# Map to different host port
docker run -p 3000:8080 ...
```
Or update `docker-compose.yml`:
```yaml
ports:
- "3000:8080"
```
### Cache Persistence
The cache directory (`cache/`) is persisted as a volume to maintain cached pages and media between container restarts.
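If host-directory permissions are troublesome, a Docker-managed named volume is an alternative to the bind mount. A sketch, assuming the service name from `docker-compose.yml`:

```yaml
services:
  gitcitadel-online:
    volumes:
      - gitcitadel-cache:/app/cache

volumes:
  gitcitadel-cache:
```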
### Environment Variables
You can pass environment variables, though most configuration should be in `config.yaml`:
```bash
docker run -e LOG_LEVEL=debug ...
```
## Development Mode
To run in development mode with verbose logging:
```bash
docker run ... gitcitadel-online --dev
```
Or with docker-compose, override the command:
```yaml
command: ["--config", "/app/config.yaml", "--dev"]
```
## Health Check
The container includes a health check that monitors the `/health` endpoint. You can check the health status:
```bash
docker ps
# Look for "healthy" status
# Or inspect directly
docker inspect --format='{{.State.Health.Status}}' gitcitadel-online
```
## Troubleshooting
### Container won't start
1. **Check logs:**
```bash
docker logs gitcitadel-online
```
2. **Verify config file:**
```bash
docker exec gitcitadel-online cat /app/config.yaml
```
3. **Check file permissions:**
The container runs as a non-root user (UID 1000). Ensure the cache directory is owned by (or writable for) that user:
```bash
sudo chown -R 1000:1000 cache/
```
(A world-writable `chmod -R 777 cache/` also works, but is less secure.)
### Can't connect to Nostr relays
- Ensure the container has network access
- Check firewall rules if running on a remote server
- Verify relay URLs in `config.yaml` are correct
### Cache not persisting
- Ensure the cache volume is properly mounted
- Check volume permissions
- Verify the cache directory exists and is writable
## Building for Different Architectures
The Dockerfile builds for `linux/amd64` by default. To build for other architectures:
```bash
# For ARM64 (e.g., Raspberry Pi, Apple Silicon)
docker buildx build --platform linux/arm64 -t gitcitadel-online .
# For multiple architectures
docker buildx build --platform linux/amd64,linux/arm64 -t gitcitadel-online .
```
## Production Deployment
For production deployment:
1. **Use a reverse proxy** (nginx, Traefik, Apache, etc.) in front of the container
2. **Set up SSL/TLS** certificates
3. **Configure proper logging** and monitoring
4. **Use secrets management** for sensitive configuration
5. **Set resource limits** in docker-compose.yml:
```yaml
deploy:
resources:
limits:
cpus: '1'
memory: 512M
```
## Apache Reverse Proxy Setup
The `docker-compose.yml` is configured to expose the container on port **2323** for Apache reverse proxy integration.
### Port Configuration
The Docker container exposes port 2323 on the host, which maps to port 8080 inside the container. This matches Apache configurations that proxy to `127.0.0.1:2323`.
If you need to use a different port, update `docker-compose.yml` and change `"2323:8080"` to your desired port mapping.
**Note:** For Plesk-managed Apache servers, configure the reverse proxy settings through the Plesk control panel. The Docker container is ready to accept connections on port 2323.
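Outside Plesk, a plain Apache virtual host proxying to the container might look like this (a sketch; it requires `mod_proxy` and `mod_proxy_http` to be enabled, and the domain is a placeholder):

```apache
<VirtualHost *:80>
    ServerName gitcitadel.example.com

    ProxyPreserveHost On
    ProxyPass        / http://127.0.0.1:2323/
    ProxyPassReverse / http://127.0.0.1:2323/
</VirtualHost>
```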
## Updating
To update to a new version:
```bash
# Pull latest code
git pull
# Rebuild and restart
docker-compose build
docker-compose up -d
```
Or with Docker directly:
```bash
docker build -t gitcitadel-online .
docker stop gitcitadel-online
docker rm gitcitadel-online
docker run ... # (same command as before)
```

105
Dockerfile

@@ -1,17 +1,10 @@
# Multi-stage build for GitCitadel Online
# Using Alpine Linux for smaller image size (~50MB vs ~200MB+ for Debian)
# Alpine works well here because:
# - Go binary is statically compiled (CGO_ENABLED=0)
# - Node.js packages are pure JavaScript (no native bindings)
# - No C library dependencies required
# Stage 1: Build Go application
FROM golang:1.22-alpine AS builder
FROM golang:1.22-alpine AS go-builder
# Install build dependencies
RUN apk add --no-cache git
RUN apk add --no-cache git make
# Set working directory
WORKDIR /build
# Copy go mod files
@@ -22,57 +15,79 @@ RUN go mod download
COPY . .
# Build the application
RUN CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -ldflags='-w -s' -o gitcitadel-online ./cmd/server
# Stage 2: Runtime with Node.js for AsciiDoc processing
FROM node:20-alpine
RUN CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o gitcitadel-online ./cmd/server/main.go
# Install runtime dependencies (wget for health check and nostr-tools download)
RUN apk add --no-cache ca-certificates tzdata wget
# Stage 2: Build and install gc-parser from local directory
# We copy the local gc-parser directory, build it, and install it to avoid git repository access issues
FROM node:18-alpine AS node-setup
# Set working directory
WORKDIR /app
# Install Node.js dependencies for AsciiDoc processing
COPY package.json package-lock.json ./
RUN npm ci --only=production
# Copy package.json and the local gc-parser directory
COPY package.json package-lock.json* ./
COPY gc-parser ./gc-parser
# Copy built binary from builder
COPY --from=builder /build/gitcitadel-online /app/gitcitadel-online
# Build gc-parser (install dependencies and compile TypeScript)
WORKDIR /app/gc-parser
RUN npm install && npm run build
# Copy static files and templates
COPY static/ ./static/
COPY templates/ ./templates/
# Go back to /app and install gc-parser from local directory
# This will install gc-parser and @asciidoctor/core (as a dependency of gc-parser)
WORKDIR /app
RUN npm install ./gc-parser
# Download nostr-tools bundle if not present (for contact form)
RUN if [ ! -f ./static/js/nostr.bundle.js ]; then \
mkdir -p ./static/js && \
wget -O ./static/js/nostr.bundle.js https://unpkg.com/nostr-tools@latest/lib/nostr.bundle.js || \
echo "Warning: Failed to download nostr-tools bundle"; \
# Verify gc-parser is installed and can be required
RUN node -e "require('gc-parser'); console.log('✓ gc-parser installed and verified')" || \
(echo "Error: gc-parser verification failed" && exit 1)
# Ensure gc-parser is a directory, not a symlink (for proper copying to final stage)
RUN if [ -L node_modules/gc-parser ]; then \
echo "Warning: gc-parser is a symlink, copying as directory..."; \
rm node_modules/gc-parser; \
cp -r gc-parser node_modules/gc-parser; \
fi
# Copy example config (user should mount their own config.yaml)
COPY config.yaml.example ./config.yaml.example
# Stage 3: Final runtime image
FROM alpine:latest
# Install runtime dependencies
RUN apk add --no-cache \
ca-certificates \
wget \
nodejs \
npm \
git \
&& rm -rf /var/cache/apk/*
# Copy entrypoint script
COPY docker-entrypoint.sh /app/docker-entrypoint.sh
WORKDIR /app
# Copy Node.js dependencies from node-setup stage
COPY --from=node-setup /app/node_modules ./node_modules
# Copy built Go binary from go-builder stage
COPY --from=go-builder /build/gitcitadel-online .
# Create non-root user for security
# node:20-alpine already has a 'node' user with UID 1000
# Change ownership of /app to node user
RUN chown -R node:node /app && \
chmod +x /app/docker-entrypoint.sh
# Copy necessary files
COPY scripts ./scripts
COPY static ./static
COPY templates ./templates
COPY docker-entrypoint.sh ./
# Switch to non-root user
USER node
# Make entrypoint executable
RUN chmod +x docker-entrypoint.sh
# Expose port (default 8080, can be overridden via config)
# Create cache directory
RUN mkdir -p cache/media && chmod 755 cache
# Expose port
EXPOSE 8080
# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=40s --retries=3 \
CMD wget --no-verbose --tries=1 --spider http://localhost:8080/health || exit 1
# Use entrypoint script
ENTRYPOINT ["./docker-entrypoint.sh"]
# Run the application via entrypoint script
ENTRYPOINT ["/app/docker-entrypoint.sh"]
CMD ["/app/gitcitadel-online", "--config", "/app/config.yaml"]
# Run the application
CMD ["./gitcitadel-online"]

56
docker-compose-prod.yml

@@ -1,24 +1,39 @@
version: '3.8'
# Production Docker Compose configuration for GitCitadel Online
# Domain: https://gitcitadel.imwald.eu
#
# This configuration:
# - Installs gc-parser from git.imwald.eu (with GitHub fallback)
# - Sets up production networking and resource limits
# - Includes Traefik labels for reverse proxy (optional - remove if not using Traefik)
# - Configures health checks and automatic restarts
services:
gitcitadel-online:
image: silberengel/gitcitadel-online:latest
container_name: gitcitadel-online
build:
context: .
dockerfile: Dockerfile
container_name: gitcitadel-online-prod
restart: unless-stopped
ports:
# Expose port 2323 for Apache reverse proxy (maps to container port 8080)
- "2323:8080"
volumes:
# Mount config file (create from config.yaml.example)
- ./config.yaml:/app/config.yaml:ro
# Persist cache directory
# Note: Ensure the host cache directory is writable by UID 1000 (node user)
# Run: sudo chown -R 1000:1000 ./cache (or use 777 permissions)
- ./cache:/app/cache
# Optional: Mount config file to override defaults
# - ./config.yaml:/app/config.yaml:ro
# Optional environment variables (uncomment and set as needed):
# environment:
# - CONFIG_PATH=/app/config.yaml
# - LOG_LEVEL=info
# Ensure scripts directory is available
- ./scripts:/app/scripts:ro
environment:
# Production environment variables
- CONFIG_PATH=/app/config.yaml
- LOG_LEVEL=info
# Link base URL for production domain
- LINK_BASE_URL=https://gitcitadel.imwald.eu
networks:
- gitcitadel-network
healthcheck:
test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/health"]
interval: 30s
@@ -28,5 +43,22 @@ services:
deploy:
resources:
limits:
cpus: '1'
memory: 512M
cpus: '2'
memory: 1G
# Note: reservations are only supported in Docker Swarm mode
# For regular docker-compose, only limits are supported
# Traefik labels for reverse proxy (optional - remove if not using Traefik)
# If using nginx or another reverse proxy, remove these labels and configure
# your reverse proxy to forward requests to localhost:2323
labels:
- "traefik.enable=true"
- "traefik.http.routers.gitcitadel.rule=Host(`gitcitadel.imwald.eu`)"
- "traefik.http.routers.gitcitadel.entrypoints=websecure"
- "traefik.http.routers.gitcitadel.tls.certresolver=letsencrypt"
- "traefik.http.services.gitcitadel.loadbalancer.server.port=8080"
- "traefik.docker.network=gitcitadel-network"
networks:
gitcitadel-network:
driver: bridge
name: gitcitadel-network

402
internal/asciidoc/processor.go

@@ -1,392 +1,90 @@
package asciidoc
import (
"bytes"
"encoding/json"
"fmt"
"os/exec"
"regexp"
"path/filepath"
"strings"
)
// Processor handles AsciiDoc to HTML conversion
// Processor handles content processing using gc-parser
type Processor struct {
linkBaseURL string
scriptPath string
}
// ProcessResult contains the processed HTML content and extracted table of contents
type ProcessResult struct {
Content string
TableOfContents string
Content string
TableOfContents string
HasLaTeX bool
HasMusicalNotation bool
}
// NewProcessor creates a new AsciiDoc processor
// gcParserResult matches the JSON output from gc-parser
type gcParserResult struct {
Content string `json:"content"`
TableOfContents string `json:"tableOfContents"`
HasLaTeX bool `json:"hasLaTeX"`
HasMusicalNotation bool `json:"hasMusicalNotation"`
NostrLinks []interface{} `json:"nostrLinks"`
Wikilinks []interface{} `json:"wikilinks"`
Hashtags []string `json:"hashtags"`
Links []interface{} `json:"links"`
Media []string `json:"media"`
Error string `json:"error,omitempty"`
}
// NewProcessor creates a new content processor using gc-parser
func NewProcessor(linkBaseURL string) *Processor {
// Determine script path relative to the executable
// In production, the script should be in the same directory as the binary
scriptPath := filepath.Join("scripts", "process-content.js")
return &Processor{
linkBaseURL: linkBaseURL,
scriptPath: scriptPath,
}
}
// Process converts AsciiDoc content to HTML with link rewriting
// Process converts content (AsciiDoc, Markdown, etc.) to HTML using gc-parser
// Returns both the content HTML and the extracted table of contents
func (p *Processor) Process(asciidocContent string) (*ProcessResult, error) {
// First, rewrite links in the AsciiDoc content
processedContent := p.rewriteLinks(asciidocContent)
// Convert AsciiDoc to HTML using asciidoctor CLI
html, err := p.convertToHTML(processedContent)
if err != nil {
return nil, fmt.Errorf("failed to convert AsciiDoc to HTML: %w", err)
}
// Extract table of contents from HTML
toc, contentWithoutTOC := p.extractTOC(html)
// Sanitize HTML to prevent XSS
sanitized := p.sanitizeHTML(contentWithoutTOC)
// Process links: make external links open in new tab, local links in same tab
processed := p.processLinks(sanitized)
// Also sanitize and process links in TOC
tocSanitized := p.sanitizeHTML(toc)
tocProcessed := p.processLinks(tocSanitized)
return &ProcessResult{
Content: processed,
TableOfContents: tocProcessed,
}, nil
}
// rewriteLinks rewrites wikilinks and nostr: links in AsciiDoc content
func (p *Processor) rewriteLinks(content string) string {
// Rewrite wikilinks: [[target]] or [[target|display text]]
// Format: [[target]] -> https://alexandria.gitcitadel.eu/events?d=<normalized-d-tag>
wikilinkRegex := regexp.MustCompile(`\[\[([^\]]+)\]\]`)
content = wikilinkRegex.ReplaceAllStringFunc(content, func(match string) string {
// Extract the content inside [[ ]]
inner := match[2 : len(match)-2]
var target, display string
if strings.Contains(inner, "|") {
parts := strings.SplitN(inner, "|", 2)
target = strings.TrimSpace(parts[0])
display = strings.TrimSpace(parts[1])
} else {
target = strings.TrimSpace(inner)
display = target
}
// Normalize the d tag (convert to lowercase, replace spaces with hyphens, etc.)
normalized := normalizeDTag(target)
// Create the link
url := fmt.Sprintf("%s/events?d=%s", p.linkBaseURL, normalized)
return fmt.Sprintf("link:%s[%s]", url, display)
})
// Rewrite nostr: links: nostr:naddr1... or nostr:nevent1...
// Format: nostr:naddr1... -> https://alexandria.gitcitadel.eu/events?id=naddr1...
nostrLinkRegex := regexp.MustCompile(`nostr:(naddr1[^\s\]]+|nevent1[^\s\]]+)`)
content = nostrLinkRegex.ReplaceAllStringFunc(content, func(match string) string {
nostrID := strings.TrimPrefix(match, "nostr:")
url := fmt.Sprintf("%s/events?id=%s", p.linkBaseURL, nostrID)
return url
})
return content
}
// normalizeDTag normalizes a d tag according to NIP-54 rules
func normalizeDTag(dTag string) string {
// Convert to lowercase
dTag = strings.ToLower(dTag)
// Convert whitespace to hyphens
dTag = strings.ReplaceAll(dTag, " ", "-")
dTag = strings.ReplaceAll(dTag, "\t", "-")
dTag = strings.ReplaceAll(dTag, "\n", "-")
// Remove punctuation and symbols (keep alphanumeric, hyphens, and non-ASCII)
var result strings.Builder
for _, r := range dTag {
if (r >= 'a' && r <= 'z') || (r >= '0' && r <= '9') || r == '-' || r > 127 {
result.WriteRune(r)
}
}
dTag = result.String()
// Collapse multiple consecutive hyphens
for strings.Contains(dTag, "--") {
dTag = strings.ReplaceAll(dTag, "--", "-")
}
// Remove leading and trailing hyphens
dTag = strings.Trim(dTag, "-")
return dTag
}
// convertToHTML converts AsciiDoc to HTML using asciidoctor.js via Node.js
func (p *Processor) convertToHTML(asciidocContent string) (string, error) {
func (p *Processor) Process(content string) (*ProcessResult, error) {
// Check if node is available
cmd := exec.Command("node", "--version")
if err := cmd.Run(); err != nil {
return "", fmt.Errorf("node.js not found: %w", err)
return nil, fmt.Errorf("node.js not found: %w", err)
}
// JavaScript code to run asciidoctor.js
// Read content from stdin to handle special characters properly
jsCode := `
const asciidoctor = require('@asciidoctor/core')();
let content = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', (chunk) => {
content += chunk;
});
process.stdin.on('end', () => {
try {
const html = asciidoctor.convert(content, {
safe: 'safe',
backend: 'html5',
doctype: 'article',
attributes: {
'showtitle': true,
'icons': 'font',
'sectanchors': true,
'sectlinks': true,
'toc': 'left',
'toclevels': 3
}
});
process.stdout.write(html);
} catch (error) {
console.error('Error converting AsciiDoc:', error.message);
process.exit(1);
}
});
`
// Run gc-parser script
cmd = exec.Command("node", p.scriptPath, p.linkBaseURL)
cmd.Stdin = strings.NewReader(content)
// Run node with the JavaScript code, passing content via stdin
cmd = exec.Command("node", "-e", jsCode)
cmd.Stdin = strings.NewReader(asciidocContent)
var stdout, stderr bytes.Buffer
var stdout, stderr strings.Builder
cmd.Stdout = &stdout
cmd.Stderr = &stderr
if err := cmd.Run(); err != nil {
return "", fmt.Errorf("asciidoctor.js conversion failed: %w, stderr: %s", err, stderr.String())
return nil, fmt.Errorf("gc-parser failed: %w, stderr: %s", err, stderr.String())
}
return stdout.String(), nil
}
// sanitizeHTML performs basic HTML sanitization to prevent XSS
// Note: This is a basic implementation. For production, consider using a proper HTML sanitizer library
func (p *Processor) sanitizeHTML(html string) string {
// Remove script tags and their content
scriptRegex := regexp.MustCompile(`(?i)<script[^>]*>.*?</script>`)
html = scriptRegex.ReplaceAllString(html, "")
// Remove event handlers (onclick, onerror, etc.)
eventHandlerRegex := regexp.MustCompile(`(?i)\s*on\w+\s*=\s*["'][^"']*["']`)
html = eventHandlerRegex.ReplaceAllString(html, "")
// Remove javascript: protocol in links
javascriptRegex := regexp.MustCompile(`(?i)javascript:`)
html = javascriptRegex.ReplaceAllString(html, "")
// Remove data: URLs that could be dangerous
dataURLRegex := regexp.MustCompile(`(?i)data:\s*text/html`)
html = dataURLRegex.ReplaceAllString(html, "")
return html
}
// extractTOC extracts the table of contents from AsciiDoc HTML output
// Returns the TOC HTML and the content HTML without the TOC
func (p *Processor) extractTOC(html string) (string, string) {
// AsciiDoc with toc: 'left' generates a TOC in a div with id="toc" or class="toc"
// We need to match the entire TOC div including nested content
// Since divs can be nested, we need to count opening/closing tags
var tocContent string
contentWithoutTOC := html
// Find the start of the TOC div - try multiple patterns
tocStartPatterns := []*regexp.Regexp{
// Pattern 1: <div id="toc" class="toc">
regexp.MustCompile(`(?i)<div\s+id=["']toc["']\s+class=["']toc["'][^>]*>`),
// Pattern 2: <div id="toc">
regexp.MustCompile(`(?i)<div\s+id=["']toc["'][^>]*>`),
// Pattern 3: <div class="toc">
regexp.MustCompile(`(?i)<div\s+class=["']toc["'][^>]*>`),
// Pattern 4: <nav id="toc">
regexp.MustCompile(`(?i)<nav\s+id=["']toc["'][^>]*>`),
}
var tocStartIdx int = -1
var tocStartTag string
for _, pattern := range tocStartPatterns {
loc := pattern.FindStringIndex(html)
if loc != nil {
tocStartIdx = loc[0]
tocStartTag = html[loc[0]:loc[1]]
break
}
}
if tocStartIdx == -1 {
// No TOC found
return "", html
// Parse JSON output
var result gcParserResult
output := stdout.String()
if err := json.Unmarshal([]byte(output), &result); err != nil {
return nil, fmt.Errorf("failed to parse gc-parser output: %w, output: %s", err, output)
}
// Find the matching closing tag by counting div tags
// Start after the opening tag
searchStart := tocStartIdx + len(tocStartTag)
depth := 1
i := searchStart
for i < len(html) && depth > 0 {
// Look for opening or closing div/nav tags
if i+4 < len(html) && html[i:i+4] == "<div" {
// Check if it's a closing tag
if i+5 < len(html) && html[i+4] == '/' {
depth--
// Find the end of this closing tag
closeIdx := strings.Index(html[i:], ">")
if closeIdx == -1 {
break
}
i += closeIdx + 1
} else {
// Opening tag - find the end
closeIdx := strings.Index(html[i:], ">")
if closeIdx == -1 {
break
}
// Check if it's self-closing
if html[i+closeIdx-1] != '/' {
depth++
}
i += closeIdx + 1
}
} else if i+5 < len(html) && html[i:i+5] == "</div" {
depth--
closeIdx := strings.Index(html[i:], ">")
if closeIdx == -1 {
break
}
i += closeIdx + 1
} else if i+5 < len(html) && html[i:i+5] == "</nav" {
depth--
closeIdx := strings.Index(html[i:], ">")
if closeIdx == -1 {
break
}
i += closeIdx + 1
} else {
i++
}
// Check for error in result
if result.Error != "" {
return nil, fmt.Errorf("gc-parser error: %s", result.Error)
}
if depth == 0 {
// Found the matching closing tag
tocEndIdx := i
// Extract the TOC content (inner HTML)
tocFullHTML := html[tocStartIdx:tocEndIdx]
// Extract just the inner content (without the outer div tags)
innerStart := len(tocStartTag)
innerEnd := len(tocFullHTML)
// Find the last </div> or </nav>
if strings.HasSuffix(tocFullHTML, "</div>") {
innerEnd -= 6
} else if strings.HasSuffix(tocFullHTML, "</nav>") {
innerEnd -= 7
}
tocContent = strings.TrimSpace(tocFullHTML[innerStart:innerEnd])
// Remove the toctitle div if present (AsciiDoc adds "Table of Contents" title)
toctitlePattern := regexp.MustCompile(`(?s)<div\s+id=["']toctitle["'][^>]*>.*?</div>\s*`)
tocContent = toctitlePattern.ReplaceAllString(tocContent, "")
tocContent = strings.TrimSpace(tocContent)
// Remove the TOC from the content
contentWithoutTOC = html[:tocStartIdx] + html[tocEndIdx:]
}
return tocContent, contentWithoutTOC
}
// processLinks processes HTML links to add target="_blank" to external links
// External links are those that start with http:// or https:// and don't point to the linkBaseURL domain
// Local links (including relative links and links to linkBaseURL) open in the same tab
func (p *Processor) processLinks(html string) string {
// Extract domain from linkBaseURL for comparison
linkBaseDomain := ""
if strings.HasPrefix(p.linkBaseURL, "http://") || strings.HasPrefix(p.linkBaseURL, "https://") {
// Extract domain (e.g., "alexandria.gitcitadel.eu" from "https://alexandria.gitcitadel.eu")
parts := strings.Split(strings.TrimPrefix(strings.TrimPrefix(p.linkBaseURL, "https://"), "http://"), "/")
if len(parts) > 0 {
linkBaseDomain = parts[0]
}
}
// Regex to match <a> tags with href attributes (more flexible pattern)
linkRegex := regexp.MustCompile(`<a\s+([^>]*?)href\s*=\s*["']([^"']+)["']([^>]*?)>`)
html = linkRegex.ReplaceAllStringFunc(html, func(match string) string {
// Extract href value
hrefMatch := regexp.MustCompile(`href\s*=\s*["']([^"']+)["']`)
hrefSubmatch := hrefMatch.FindStringSubmatch(match)
if len(hrefSubmatch) < 2 {
return match // No href found, return as-is
}
href := hrefSubmatch[1]
// Check if it's an external link (starts with http:// or https://)
isExternal := strings.HasPrefix(href, "http://") || strings.HasPrefix(href, "https://")
if isExternal {
// Check if it's pointing to our own domain
if linkBaseDomain != "" && strings.Contains(href, linkBaseDomain) {
// Same domain - open in same tab (remove any existing target attribute)
targetRegex := regexp.MustCompile(`\s*target\s*=\s*["'][^"']*["']`)
match = targetRegex.ReplaceAllString(match, "")
return match
}
// External link - add target="_blank" and rel="noopener noreferrer" if not already present
if !strings.Contains(match, `target=`) {
// Insert before the closing >
match = strings.TrimSuffix(match, ">")
if !strings.Contains(match, `rel=`) {
match += ` target="_blank" rel="noopener noreferrer">`
} else {
// Update existing rel attribute to include noopener if not present
relRegex := regexp.MustCompile(`rel\s*=\s*["']([^"']*)["']`)
match = relRegex.ReplaceAllStringFunc(match, func(relMatch string) string {
relValue := relRegex.FindStringSubmatch(relMatch)[1]
if !strings.Contains(relValue, "noopener") {
relValue += " noopener noreferrer"
}
return `rel="` + strings.TrimSpace(relValue) + `"`
})
match += ` target="_blank">`
}
}
} else {
// Local/relative link - ensure it opens in same tab (remove target if present)
targetRegex := regexp.MustCompile(`\s*target\s*=\s*["'][^"']*["']`)
match = targetRegex.ReplaceAllString(match, "")
}
return match
})
return html
return &ProcessResult{
Content: result.Content,
TableOfContents: result.TableOfContents,
HasLaTeX: result.HasLaTeX,
HasMusicalNotation: result.HasMusicalNotation,
}, nil
}

89
internal/cache/cache.go

@@ -0,0 +1,89 @@
package cache
import (
"bytes"
"compress/gzip"
"sync"
"time"
)
// Cache stores generated HTML pages
type Cache struct {
pages map[string]*CachedPage
mu sync.RWMutex
}
// NewCache creates a new cache
func NewCache() *Cache {
return &Cache{
pages: make(map[string]*CachedPage),
}
}
// Get retrieves a page from cache
func (c *Cache) Get(path string) (*CachedPage, bool) {
c.mu.RLock()
defer c.mu.RUnlock()
page, exists := c.pages[path]
return page, exists
}
// Set stores a page in cache
func (c *Cache) Set(path string, content string) error {
c.mu.Lock()
defer c.mu.Unlock()
// Generate ETag
etag := GenerateETag(content)
// Pre-compress content
var compressed bytes.Buffer
writer := gzip.NewWriter(&compressed)
if _, err := writer.Write([]byte(content)); err != nil {
return err
}
if err := writer.Close(); err != nil {
return err
}
c.pages[path] = &CachedPage{
Content: content,
ETag: etag,
LastUpdated: time.Now(),
Compressed: compressed.Bytes(),
}
return nil
}
// Delete removes a page from cache
func (c *Cache) Delete(path string) {
c.mu.Lock()
defer c.mu.Unlock()
delete(c.pages, path)
}
// Clear clears all cached pages
func (c *Cache) Clear() {
c.mu.Lock()
defer c.mu.Unlock()
c.pages = make(map[string]*CachedPage)
}
// Size returns the number of cached pages
func (c *Cache) Size() int {
c.mu.RLock()
defer c.mu.RUnlock()
return len(c.pages)
}
// GetAllPaths returns all cached page paths
func (c *Cache) GetAllPaths() []string {
c.mu.RLock()
defer c.mu.RUnlock()
paths := make([]string, 0, len(c.pages))
for path := range c.pages {
paths = append(paths, path)
}
return paths
}

54
internal/cache/feed_cache.go

@@ -0,0 +1,54 @@
package cache
import (
"sync"
"time"
)
// FeedItem represents a cached feed item
type FeedItem struct {
EventID string
Author string
Content string
Time time.Time
Link string
Title string
Summary string
Image string
}
// FeedCache stores the kind 1 feed
type FeedCache struct {
items []FeedItem
mu sync.RWMutex
lastUpdated time.Time
}
// NewFeedCache creates a new feed cache
func NewFeedCache() *FeedCache {
return &FeedCache{
items: make([]FeedItem, 0),
}
}
// Set updates the feed cache
func (fc *FeedCache) Set(items []FeedItem) {
fc.mu.Lock()
defer fc.mu.Unlock()
fc.items = items
fc.lastUpdated = time.Now()
}
// Get retrieves feed items
func (fc *FeedCache) Get() []FeedItem {
fc.mu.RLock()
defer fc.mu.RUnlock()
return fc.items
}
// GetLastUpdated returns when the feed was last updated
func (fc *FeedCache) GetLastUpdated() time.Time {
fc.mu.RLock()
defer fc.mu.RUnlock()
return fc.lastUpdated
}

198
internal/cache/media_cache.go

@@ -0,0 +1,198 @@
package cache
import (
"context"
"crypto/sha256"
"encoding/hex"
"fmt"
"io"
"net/http"
"os"
"path/filepath"
"sync"
"time"
"gitcitadel-online/internal/logger"
)
// MediaCache handles caching of images and other media from events
type MediaCache struct {
cacheDir string
activeEvents map[string]time.Time // eventID -> last seen time
mu sync.RWMutex
httpClient *http.Client
}
// NewMediaCache creates a new media cache
func NewMediaCache(cacheDir string) (*MediaCache, error) {
// Create cache directory if it doesn't exist
if err := os.MkdirAll(cacheDir, 0755); err != nil {
return nil, fmt.Errorf("failed to create media cache directory: %w", err)
}
mc := &MediaCache{
cacheDir: cacheDir,
activeEvents: make(map[string]time.Time),
httpClient: &http.Client{
Timeout: 30 * time.Second,
},
}
// Start cleanup goroutine
go mc.cleanupLoop(context.Background())
return mc, nil
}
// CacheMedia downloads and caches a media file from a URL
// Returns the local path to the cached file, or the original URL if caching fails
func (mc *MediaCache) CacheMedia(ctx context.Context, url string, eventID string) (string, error) {
if url == "" {
return "", fmt.Errorf("empty URL")
}
// Mark event as active
mc.mu.Lock()
mc.activeEvents[eventID] = time.Now()
mc.mu.Unlock()
// Generate cache filename from URL hash
hash := sha256.Sum256([]byte(url))
filename := hex.EncodeToString(hash[:]) + filepath.Ext(url)
cachePath := filepath.Join(mc.cacheDir, filename)
// Check if already cached
if _, err := os.Stat(cachePath); err == nil {
return "/cache/media/" + filename, nil
}
// Download the media
req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
if err != nil {
return url, fmt.Errorf("failed to create request: %w", err)
}
// Set user agent
req.Header.Set("User-Agent", "GitCitadel-Online/1.0")
resp, err := mc.httpClient.Do(req)
if err != nil {
logger.WithFields(map[string]interface{}{
"url": url,
"eventID": eventID,
"error": err,
}).Warn("Failed to download media")
return url, fmt.Errorf("failed to download: %w", err)
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
return url, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
}
// Check content type - only cache images
contentType := resp.Header.Get("Content-Type")
if !isImageContentType(contentType) {
logger.WithFields(map[string]interface{}{
"url": url,
"contentType": contentType,
}).Debug("Skipping non-image media")
return url, nil
}
// Create cache file
file, err := os.Create(cachePath)
if err != nil {
return url, fmt.Errorf("failed to create cache file: %w", err)
}
defer file.Close()
// Copy response to file
_, err = io.Copy(file, resp.Body)
if err != nil {
os.Remove(cachePath) // Clean up on error
return url, fmt.Errorf("failed to write cache file: %w", err)
}
logger.WithFields(map[string]interface{}{
"url": url,
"eventID": eventID,
"cachePath": cachePath,
}).Debug("Cached media file")
return "/cache/media/" + filename, nil
}
// GetCacheDir returns the cache directory path
func (mc *MediaCache) GetCacheDir() string {
return mc.cacheDir
}
// isImageContentType checks if a content type is an image
func isImageContentType(contentType string) bool {
imageTypes := []string{
"image/jpeg",
"image/jpg",
"image/png",
"image/gif",
"image/webp",
"image/svg+xml",
"image/bmp",
"image/x-icon",
}
for _, imgType := range imageTypes {
if contentType == imgType {
return true
}
}
return false
}
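One caveat with the exact-match list in `isImageContentType`: `Content-Type` headers may carry parameters (for example `image/png; charset=binary`), which an exact string comparison rejects. A hedged alternative sketch using the standard library's `mime` package, with a broad `image/*` match instead of a fixed allow-list (this is an option, not the project's current behavior):

```go
package main

import (
	"fmt"
	"mime"
	"strings"
)

// isImageMediaType strips Content-Type parameters before matching and
// accepts any image/* subtype. A sketch, not the project's implementation.
func isImageMediaType(contentType string) bool {
	mediaType, _, err := mime.ParseMediaType(contentType)
	if err != nil {
		return false
	}
	return strings.HasPrefix(mediaType, "image/")
}

func main() {
	fmt.Println(isImageMediaType("image/png; charset=binary")) // matches despite the parameter
	fmt.Println(isImageMediaType("text/html"))                 // rejected
}
```

The trade-off: `image/*` also admits subtypes the allow-list deliberately excluded, so the fixed list may be the safer choice if that exclusion was intentional.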
// MarkEventActive marks an event as currently active (displayed)
func (mc *MediaCache) MarkEventActive(eventID string) {
mc.mu.Lock()
defer mc.mu.Unlock()
mc.activeEvents[eventID] = time.Now()
}
// cleanupLoop periodically removes media for events that are no longer active
func (mc *MediaCache) cleanupLoop(ctx context.Context) {
ticker := time.NewTicker(1 * time.Hour) // Run cleanup every hour
defer ticker.Stop()
for {
select {
case <-ctx.Done():
return
case <-ticker.C:
mc.cleanup()
}
}
}
// cleanup removes media files for events that haven't been seen in 24 hours
func (mc *MediaCache) cleanup() {
mc.mu.Lock()
defer mc.mu.Unlock()
cutoff := time.Now().Add(-24 * time.Hour)
var toRemove []string
// Find events that are no longer active
for eventID, lastSeen := range mc.activeEvents {
if lastSeen.Before(cutoff) {
toRemove = append(toRemove, eventID)
}
}
// Remove inactive events from tracking
for _, eventID := range toRemove {
delete(mc.activeEvents, eventID)
}
// Note: We don't delete the actual files here because multiple events might use the same image
// Instead, we rely on the fact that if an event is no longer displayed, its media won't be accessed
// A more sophisticated cleanup would track which files are used by which events
logger.WithField("removed_events", len(toRemove)).Debug("Cleaned up inactive events from media cache")
}
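The cleanup comment above leaves files on disk because several events may share one image, and notes that a "more sophisticated cleanup would track which files are used by which events". One possible shape for that is a reference count from file to events; all names below are illustrative, this is a sketch of the idea rather than code the MediaCache contains:

```go
package main

import "fmt"

// fileRefs tracks which events reference each cached file, so a file is
// only eligible for deletion once its last referencing event expires.
type fileRefs struct {
	byFile map[string]map[string]struct{} // filename -> set of event IDs
}

func newFileRefs() *fileRefs {
	return &fileRefs{byFile: make(map[string]map[string]struct{})}
}

// add records that eventID uses filename.
func (fr *fileRefs) add(filename, eventID string) {
	if fr.byFile[filename] == nil {
		fr.byFile[filename] = make(map[string]struct{})
	}
	fr.byFile[filename][eventID] = struct{}{}
}

// release drops one event's reference and reports whether the file is now
// unreferenced, i.e. safe to remove from disk.
func (fr *fileRefs) release(filename, eventID string) bool {
	refs := fr.byFile[filename]
	delete(refs, eventID)
	if len(refs) == 0 {
		delete(fr.byFile, filename)
		return true
	}
	return false
}

func main() {
	fr := newFileRefs()
	fr.add("abc.png", "event1")
	fr.add("abc.png", "event2")
	fmt.Println(fr.release("abc.png", "event1")) // false: still referenced by event2
	fmt.Println(fr.release("abc.png", "event2")) // true: last reference gone
}
```

In the real cache this map would need the same mutex discipline as `activeEvents`, and `cleanup` would call `release` for each expired event before unlinking files.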

29
internal/cache/page.go vendored

@@ -0,0 +1,29 @@
package cache
import (
"fmt"
"time"
)
// CachedPage represents a cached HTML page
type CachedPage struct {
Content string
ETag string
LastUpdated time.Time
Compressed []byte // Pre-compressed gzip content
}
// IsStale checks if the cached page is stale based on maxAge
func (cp *CachedPage) IsStale(maxAge time.Duration) bool {
return time.Since(cp.LastUpdated) > maxAge
}
// GenerateETag generates an ETag for the content
func GenerateETag(content string) string {
// Simple 31x multiplicative rolling hash: fast and deterministic, but
// collision-prone on an int; fine for cache validation, not integrity checks
hash := 0
for _, b := range []byte(content) {
hash = hash*31 + int(b)
}
return fmt.Sprintf(`"%x"`, hash)
}
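Since the rolling hash above can collide, a collision-resistant alternative is to truncate a SHA-256 of the content. A hedged sketch using only the standard library (an option, not what `GenerateETag` currently does):

```go
package main

import (
	"crypto/sha256"
	"fmt"
)

// strongETag derives a quoted ETag from a SHA-256 of the content,
// truncated to 8 bytes (16 hex chars), which is ample for cache
// validation and far less collision-prone than a 31x rolling hash.
func strongETag(content string) string {
	sum := sha256.Sum256([]byte(content))
	return fmt.Sprintf(`"%x"`, sum[:8])
}

func main() {
	fmt.Println(strongETag("<html>hello</html>"))
}
```

The quoted form matches what `If-None-Match` comparison expects, same as the original function's `"%x"` format.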

411
internal/cache/rewarm.go vendored

@@ -0,0 +1,411 @@
package cache
import (
"context"
"html/template"
"time"
"gitcitadel-online/internal/generator"
"gitcitadel-online/internal/logger"
"gitcitadel-online/internal/nostr"
)
// Rewarmer handles cache rewarming
type Rewarmer struct {
cache *Cache
feedCache *FeedCache
wikiService *nostr.WikiService
feedService *nostr.FeedService
ebooksService *nostr.EBooksService
htmlGenerator *generator.HTMLGenerator
wikiIndex string
blogIndex string
feedRelay string
maxFeedEvents int
interval time.Duration
feedInterval time.Duration
}
// NewRewarmer creates a new cache rewarming service
func NewRewarmer(
cache *Cache,
feedCache *FeedCache,
wikiService *nostr.WikiService,
feedService *nostr.FeedService,
ebooksService *nostr.EBooksService,
htmlGenerator *generator.HTMLGenerator,
wikiIndex, blogIndex, feedRelay string,
maxFeedEvents int,
interval, feedInterval time.Duration,
) *Rewarmer {
return &Rewarmer{
cache: cache,
feedCache: feedCache,
wikiService: wikiService,
feedService: feedService,
ebooksService: ebooksService,
htmlGenerator: htmlGenerator,
wikiIndex: wikiIndex,
blogIndex: blogIndex,
feedRelay: feedRelay,
maxFeedEvents: maxFeedEvents,
interval: interval,
feedInterval: feedInterval,
}
}
// Start starts the rewarming goroutines
func (r *Rewarmer) Start(ctx context.Context) {
// Initial population
go r.rewarmPages(ctx)
go r.rewarmFeed(ctx)
// Periodic rewarming
go r.periodicRewarmPages(ctx)
go r.periodicRewarmFeed(ctx)
}
// rewarmPages rewarms the page cache
func (r *Rewarmer) rewarmPages(ctx context.Context) {
logger.Info("Starting page cache rewarming...")
// Initialize wikiPages as empty - will be populated if wiki fetch succeeds
wikiPages := make([]generator.WikiPageInfo, 0)
// Fetch wiki index (non-blocking - landing page can still be generated)
// If theforest fails, leave pages as-is (don't remove existing events)
wikiIndex, err := r.wikiService.FetchWikiIndex(ctx, r.wikiIndex)
if err != nil {
logger.Warnf("Error fetching wiki index from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
// Continue to generate landing page even if wiki fetch fails
} else {
// Fetch wiki events
// If theforest fails, leave pages as-is (don't remove existing events)
wikiEvents, err := r.wikiService.FetchWikiEvents(ctx, wikiIndex)
if err != nil {
logger.Warnf("Error fetching wiki events from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
} else {
// Build wiki page info for navigation
wikiPages = make([]generator.WikiPageInfo, 0, len(wikiEvents))
for _, event := range wikiEvents {
wikiPages = append(wikiPages, generator.WikiPageInfo{
DTag: event.DTag,
Title: event.Title,
})
}
// Generate and cache wiki index page
wikiIndexHTML, err := r.htmlGenerator.GenerateWikiIndexPage(wikiIndex, wikiPages, []generator.FeedItemInfo{})
if err != nil {
logger.Errorf("Error generating wiki index page: %v", err)
} else {
if err := r.cache.Set("/wiki", wikiIndexHTML); err != nil {
logger.Errorf("Error caching wiki index page: %v", err)
} else {
logger.WithField("pages", len(wikiPages)).Info("Wiki index page cached successfully")
}
}
// Generate and cache wiki pages
for _, event := range wikiEvents {
html, err := r.htmlGenerator.GenerateWikiPage(event, wikiPages, []generator.FeedItemInfo{})
if err != nil {
logger.WithField("dtag", event.DTag).Errorf("Error generating wiki page: %v", err)
continue
}
if err := r.cache.Set("/wiki/"+event.DTag, html); err != nil {
logger.WithField("dtag", event.DTag).Errorf("Error caching wiki page: %v", err)
}
}
}
}
// Fetch blog index if configured (needed for landing page)
// If theforest fails, leave pages as-is (don't remove existing events)
var newestBlogItem *generator.BlogItemInfo
if r.blogIndex != "" {
blogIndex, err := r.wikiService.FetchWikiIndex(ctx, r.blogIndex)
if err != nil {
logger.Warnf("Error fetching blog index from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
} else {
// Fetch blog events using the generic FetchIndexEvents function
// If theforest fails, leave pages as-is (don't remove existing events)
blogKind := r.wikiService.GetBlogKind()
blogEventList, err := r.wikiService.FetchIndexEvents(ctx, blogIndex, blogKind)
if err != nil {
logger.Warnf("Error fetching blog events from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
} else {
logger.WithFields(map[string]interface{}{
"events": len(blogEventList),
"kind": blogKind,
}).Debug("Fetched blog events")
blogItems := make([]generator.BlogItemInfo, 0, len(blogEventList))
for _, event := range blogEventList {
// Parse the blog event
blog, err := nostr.ParseBlogEvent(event, blogKind)
if err != nil {
logger.WithField("event_id", event.ID).Warnf("Error parsing blog event: %v", err)
continue
}
html, err := r.htmlGenerator.ProcessAsciiDoc(blog.Content)
if err != nil {
logger.WithField("dtag", blog.DTag).Warnf("Error processing blog content: %v", err)
html = blog.Content // Fallback to raw content
}
blogItems = append(blogItems, generator.BlogItemInfo{
DTag: blog.DTag,
Title: blog.Title,
Summary: blog.Summary,
Content: template.HTML(html),
Author: event.PubKey,
Image: blog.Image,
CreatedAt: int64(event.CreatedAt),
})
}
logger.WithField("items", len(blogItems)).Debug("Generated blog items")
// Get newest blog item for landing page
if len(blogItems) > 0 {
newestBlogItem = &blogItems[0]
}
// Generate blog page without feed items (feed only on landing page)
blogHTML, err := r.htmlGenerator.GenerateBlogPage(blogIndex, blogItems, []generator.FeedItemInfo{})
if err != nil {
logger.Errorf("Error generating blog page: %v", err)
} else {
if err := r.cache.Set("/blog", blogHTML); err != nil {
logger.Errorf("Error caching blog page: %v", err)
} else {
logger.WithField("items", len(blogItems)).Info("Blog page cached successfully")
}
}
}
}
}
// Fetch and cache articles page (longform articles) - needed for landing page
// If theforest fails, leave pages as-is (don't remove existing events)
var allArticleItems []generator.ArticleItemInfo
var newestArticleItem *generator.ArticleItemInfo
longformKind := r.wikiService.GetLongformKind()
if longformKind > 0 {
articleEvents, err := r.wikiService.FetchLongformArticles(ctx, "wss://theforest.nostr1.com", longformKind, 50)
if err != nil {
logger.Warnf("Error fetching longform articles from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
} else {
articleItems := make([]generator.ArticleItemInfo, 0, len(articleEvents))
for _, event := range articleEvents {
// Parse the longform article
article, err := nostr.ParseLongformEvent(event, longformKind)
if err != nil {
logger.WithField("event_id", event.ID).Warnf("Error parsing longform article: %v", err)
continue
}
// Process content using gc-parser (handles Markdown, AsciiDoc, etc.)
result, err := r.htmlGenerator.ProcessAsciiDoc(article.Content)
var html string
if err != nil {
logger.WithField("dtag", article.DTag).Warnf("Error processing content: %v", err)
html = article.Content // Fallback to raw content
} else {
html = result
}
articleItems = append(articleItems, generator.ArticleItemInfo{
DTag: article.DTag,
Title: article.Title,
Summary: article.Summary,
Content: template.HTML(html),
Author: event.PubKey,
Image: article.Image,
CreatedAt: int64(event.CreatedAt),
})
}
logger.WithField("items", len(articleItems)).Debug("Generated article items")
// Store all article items for landing page
allArticleItems = articleItems
// Get newest article item for landing page
if len(articleItems) > 0 {
newestArticleItem = &articleItems[0]
}
// Generate articles page
articlesHTML, err := r.htmlGenerator.GenerateArticlesPage(articleItems, []generator.FeedItemInfo{})
if err != nil {
logger.Errorf("Error generating articles page: %v", err)
} else {
if err := r.cache.Set("/articles", articlesHTML); err != nil {
logger.Errorf("Error caching articles page: %v", err)
} else {
logger.WithField("items", len(articleItems)).Info("Articles page cached successfully")
}
}
}
}
// Fetch and cache e-books page (needed for landing page)
// If theforest fails, leave pages as-is (don't remove existing events)
var allEBooks []generator.EBookInfo
if r.ebooksService != nil {
ebooks, err := r.ebooksService.FetchTopLevelIndexEvents(ctx)
if err != nil {
logger.Warnf("Error fetching e-books from theforest: %v - keeping existing pages", err)
// Don't update cache - leave existing pages as-is
} else {
// Convert to generator.EBookInfo
generatorEBooks := make([]generator.EBookInfo, 0, len(ebooks))
for _, ebook := range ebooks {
generatorEBooks = append(generatorEBooks, generator.EBookInfo{
EventID: ebook.EventID,
Title: ebook.Title,
DTag: ebook.DTag,
Author: ebook.Author,
Summary: ebook.Summary,
Image: ebook.Image,
Type: ebook.Type,
CreatedAt: ebook.CreatedAt,
Naddr: ebook.Naddr,
})
}
// Store all e-books for landing page
allEBooks = generatorEBooks
ebooksHTML, err := r.htmlGenerator.GenerateEBooksPage(generatorEBooks, []generator.FeedItemInfo{})
if err != nil {
logger.Errorf("Error generating e-books page: %v", err)
} else {
if err := r.cache.Set("/ebooks", ebooksHTML); err != nil {
logger.Errorf("Error caching e-books page: %v", err)
} else {
logger.WithField("ebooks", len(generatorEBooks)).Info("E-books page cached successfully")
}
}
}
}
// Always generate landing page AFTER blog, articles, and e-books are fetched and cached
// Now we have all the data needed for the landing page
landingHTML, err := r.htmlGenerator.GenerateLandingPage(wikiPages, newestBlogItem, newestArticleItem, allArticleItems, allEBooks)
if err != nil {
logger.Errorf("Error generating landing page: %v", err)
} else {
if err := r.cache.Set("/", landingHTML); err != nil {
logger.Errorf("Error caching landing page: %v", err)
} else {
logger.WithField("pages", len(wikiPages)).Info("Landing page cached successfully")
}
}
// Generate and cache Feed page (using feed items from cache)
feedItems := r.convertFeedItemsToInfo(r.feedCache.Get())
feedHTML, err := r.htmlGenerator.GenerateFeedPage(feedItems)
if err != nil {
logger.Errorf("Error generating feed page: %v", err)
} else {
if err := r.cache.Set("/feed", feedHTML); err != nil {
logger.Errorf("Error caching feed page: %v", err)
} else {
logger.WithField("items", len(feedItems)).Info("Feed page cached successfully")
}
}
logger.Info("Page cache rewarming completed")
}
// rewarmFeed rewarms the feed cache
func (r *Rewarmer) rewarmFeed(ctx context.Context) {
logger.WithFields(map[string]interface{}{
"relay": r.feedRelay,
"max_events": r.maxFeedEvents,
}).Info("Starting feed cache rewarming")
nostrItems, err := r.feedService.FetchFeedItems(ctx, r.feedRelay, r.maxFeedEvents)
if err != nil {
logger.WithField("relay", r.feedRelay).Warnf("Error fetching feed: %v", err)
// Don't clear the cache on error - keep old items
return
}
if len(nostrItems) == 0 {
logger.WithField("relay", r.feedRelay).Warn("No feed items fetched")
// Don't clear the cache - keep old items
return
}
// Convert nostr.FeedItem to cache.FeedItem
items := make([]FeedItem, 0, len(nostrItems))
for _, item := range nostrItems {
items = append(items, FeedItem{
EventID: item.EventID,
Author: item.Author,
Content: item.Content,
Time: item.Time,
Link: item.Link,
Title: item.Title,
Summary: item.Summary,
Image: item.Image,
})
}
r.feedCache.Set(items)
logger.WithFields(map[string]interface{}{
"items": len(items),
"relay": r.feedRelay,
}).Info("Feed cache rewarmed successfully")
}
// periodicRewarmPages periodically rewarms pages
func (r *Rewarmer) periodicRewarmPages(ctx context.Context) {
ticker := time.NewTicker(r.interval)
defer ticker.Stop()
for {
select {
case <-ctx.Done():
return
case <-ticker.C:
r.rewarmPages(ctx)
}
}
}
// convertFeedItemsToInfo converts cache.FeedItem to generator.FeedItemInfo
func (r *Rewarmer) convertFeedItemsToInfo(items []FeedItem) []generator.FeedItemInfo {
feedItems := make([]generator.FeedItemInfo, 0, len(items))
for _, item := range items {
feedItems = append(feedItems, generator.FeedItemInfo{
EventID: item.EventID,
Author: item.Author,
Content: item.Content,
Time: item.Time.Format("2006-01-02 15:04:05"),
TimeISO: item.Time.Format(time.RFC3339),
Link: item.Link,
})
}
return feedItems
}
// periodicRewarmFeed periodically rewarms feed
func (r *Rewarmer) periodicRewarmFeed(ctx context.Context) {
ticker := time.NewTicker(r.feedInterval)
defer ticker.Stop()
for {
select {
case <-ctx.Done():
return
case <-ticker.C:
r.rewarmFeed(ctx)
}
}
}

144
internal/generator/html.go

@@ -7,7 +7,6 @@ import (
"fmt"
"html/template"
"os"
"os/exec"
"path/filepath"
"strings"
"time"
@@ -179,6 +178,21 @@ type EBookInfo struct {
TimeISO string // ISO time
}
// EventCardInfo represents info about an event card for the events page
type EventCardInfo struct {
EventID string
Title string
DTag string
Author string
Summary string
Image string
Kind int
URL string // URL to the event page (based on kind:pubkey:dtag)
CreatedAt int64
Time string // Formatted time
TimeISO string // ISO time
}
// NewHTMLGenerator creates a new HTML generator
func NewHTMLGenerator(templateDir string, linkBaseURL, siteName, siteURL, defaultImage string, nostrClient *nostr.Client) (*HTMLGenerator, error) {
tmpl := template.New("base").Funcs(getTemplateFuncs())
@@ -193,6 +207,7 @@ func NewHTMLGenerator(templateDir string, linkBaseURL, siteName, siteURL, defaul
"wiki.html",
"contact.html",
"feed.html",
"events.html",
"404.html",
"500.html",
}
@@ -261,59 +276,6 @@ func (g *HTMLGenerator) ProcessAsciiDoc(content string) (string, error) {
return result.Content, nil
}
// ProcessMarkdown processes Markdown content to HTML using marked via Node.js
func (g *HTMLGenerator) ProcessMarkdown(markdownContent string) (string, error) {
// Check if node is available
cmd := exec.Command("node", "--version")
if err := cmd.Run(); err != nil {
return "", fmt.Errorf("node.js not found: %w", err)
}
// JavaScript code to run marked
jsCode := `
const { marked } = require('marked');
let content = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', (chunk) => {
content += chunk;
});
process.stdin.on('end', () => {
try {
// Configure marked options
marked.setOptions({
breaks: true,
gfm: true,
headerIds: true,
mangle: false
});
const html = marked.parse(content);
process.stdout.write(html);
} catch (error) {
console.error('Error converting Markdown:', error.message);
process.exit(1);
}
});
`
// Run node with the JavaScript code, passing content via stdin
cmd = exec.Command("node", "-e", jsCode)
cmd.Stdin = strings.NewReader(markdownContent)
var stdout, stderr bytes.Buffer
cmd.Stdout = &stdout
cmd.Stderr = &stderr
if err := cmd.Run(); err != nil {
return "", fmt.Errorf("marked conversion failed: %w, stderr: %s", err, stderr.String())
}
return stdout.String(), nil
}
// GenerateLandingPage generates the static landing page
func (g *HTMLGenerator) GenerateLandingPage(wikiPages []WikiPageInfo, newestBlogItem *BlogItemInfo, newestArticleItem *ArticleItemInfo, allArticleItems []ArticleItemInfo, allEBooks []EBookInfo) (string, error) {
// Collect pubkeys from blog and article items
@@ -883,6 +845,80 @@ func (g *HTMLGenerator) GenerateErrorPage(statusCode int, feedItems []FeedItemIn
return g.renderTemplate(fmt.Sprintf("%d.html", statusCode), data)
}
// GenerateEventsPage generates the events page for a specific d-tag
func (g *HTMLGenerator) GenerateEventsPage(dTag string, eventCards []EventCardInfo, feedItems []FeedItemInfo) (string, error) {
canonicalURL := g.siteURL + "/events?d=" + dTag
// Collect pubkeys from event cards
pubkeys := make([]string, 0, len(eventCards))
seenPubkeys := make(map[string]bool)
for _, card := range eventCards {
if card.Author != "" && !seenPubkeys[card.Author] {
pubkeys = append(pubkeys, card.Author)
seenPubkeys[card.Author] = true
}
}
// Fetch profiles for all authors
ctx := context.Background()
profiles := g.fetchProfilesBatch(ctx, pubkeys)
description := fmt.Sprintf("Events with d-tag: %s", dTag)
if len(eventCards) > 0 {
description = fmt.Sprintf("Browse %d events with d-tag: %s", len(eventCards), dTag)
}
data := PageData{
Title: fmt.Sprintf("Events: %s", dTag),
Description: description,
CanonicalURL: canonicalURL,
OGImage: g.siteURL + g.defaultImage,
OGType: "website",
SiteName: g.siteName,
SiteURL: g.siteURL,
CurrentYear: time.Now().Year(),
WikiPages: []WikiPageInfo{},
FeedItems: feedItems,
Content: template.HTML(""), // Content comes from template
Profiles: profiles,
}
// PageData has no event-card field, so wrap it in a template-local struct
type EventsPageData struct {
PageData
EventCards []EventCardInfo
DTag string
}
eventsData := EventsPageData{
PageData: data,
EventCards: eventCards,
DTag: dTag,
}
// Use renderTemplate but with custom data for events
renderTmpl := template.New("events-render").Funcs(getTemplateFuncs())
files := []string{
filepath.Join(g.templateDir, "components.html"),
filepath.Join(g.templateDir, "base.html"),
filepath.Join(g.templateDir, "events.html"),
}
_, err := renderTmpl.ParseFiles(files...)
if err != nil {
return "", fmt.Errorf("failed to parse events templates: %w", err)
}
var buf bytes.Buffer
if err := renderTmpl.ExecuteTemplate(&buf, "base.html", eventsData); err != nil {
return "", fmt.Errorf("failed to execute events template: %w", err)
}
return buf.String(), nil
}
// GenerateFeedPage generates a full feed page
func (g *HTMLGenerator) GenerateFeedPage(feedItems []FeedItemInfo) (string, error) {
canonicalURL := g.siteURL + "/feed"

219
internal/server/handlers.go

@@ -36,6 +36,7 @@ func (s *Server) setupRoutes(mux *http.ServeMux) {
mux.HandleFunc("/ebooks", s.handleEBooks)
mux.HandleFunc("/contact", s.handleContact)
mux.HandleFunc("/feed", s.handleFeed)
mux.HandleFunc("/events", s.handleEvents)
// Health and metrics
mux.HandleFunc("/health", s.handleHealth)
@@ -69,6 +70,90 @@ func (s *Server) handleLanding(w http.ResponseWriter, r *http.Request) {
func (s *Server) handleWiki(w http.ResponseWriter, r *http.Request) {
path := r.URL.Path
// Check for event ID first (fastest lookup)
eventID := r.URL.Query().Get("e")
if eventID != "" {
// Fetch event by ID (fastest method)
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
filter := gonostr.Filter{
IDs: []string{eventID},
}
event, err := s.nostrClient.FetchEvent(ctx, filter)
if err == nil && event != nil {
// Parse as wiki event
wikiEvent, err := nostr.ParseWikiEvent(event, event.Kind)
if err == nil {
// Get wiki pages for navigation (empty for now, could be fetched)
wikiPages := []generator.WikiPageInfo{}
// Generate the wiki page
html, err := s.htmlGenerator.GenerateWikiPage(wikiEvent, wikiPages, []generator.FeedItemInfo{})
if err == nil {
w.Header().Set("Content-Type", "text/html; charset=utf-8")
w.Write([]byte(html))
return
}
}
}
}
// Fallback: Check for hash-based event reference in query parameter
// Format: kind:pubkey:dtag (e.g., "30818:dd664d5e...:nkbip-04")
ref := r.URL.Query().Get("ref")
if ref == "" {
// Assemble from k/a/d query parameters (URL fragments never reach the server;
// client-side JavaScript can rewrite a fragment into these parameters)
ref = r.URL.Query().Get("k") + ":" + r.URL.Query().Get("a") + ":" + r.URL.Query().Get("d")
if strings.HasPrefix(ref, ":") {
ref = ""
}
}
// If we have a reference, fetch and render that specific event
if ref != "" {
parts := strings.Split(ref, ":")
if len(parts) == 3 {
var kind int
if _, err := fmt.Sscanf(parts[0], "%d", &kind); err == nil {
pubkey := parts[1]
dTag := parts[2]
// Fetch the specific wiki event
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
filter := gonostr.Filter{
Kinds: []int{kind},
Authors: []string{pubkey},
Tags: map[string][]string{
"d": {dTag},
},
Limit: 1,
}
events, err := s.nostrClient.FetchEvents(ctx, filter)
if err == nil && len(events) > 0 {
// Parse as wiki event
wikiEvent, err := nostr.ParseWikiEvent(events[0], kind)
if err == nil {
// Get wiki pages for navigation (empty for now, could be fetched)
wikiPages := []generator.WikiPageInfo{}
// Generate the wiki page
html, err := s.htmlGenerator.GenerateWikiPage(wikiEvent, wikiPages, []generator.FeedItemInfo{})
if err == nil {
w.Header().Set("Content-Type", "text/html; charset=utf-8")
w.Write([]byte(html))
return
}
}
}
}
}
}
// Handle wiki index page (/wiki or /wiki/)
if path == "/wiki" || path == "/wiki/" {
page, exists := s.cache.Get("/wiki")
@@ -122,6 +207,136 @@ func (s *Server) handleFeed(w http.ResponseWriter, r *http.Request) {
s.handleCachedPage("/feed")(w, r)
}
// handleEvents handles the /events?d=... page to show events with a specific d-tag
func (s *Server) handleEvents(w http.ResponseWriter, r *http.Request) {
if r.Method != http.MethodGet {
http.Error(w, "Method not allowed", http.StatusMethodNotAllowed)
return
}
// Get d-tag from query parameter
dTag := r.URL.Query().Get("d")
if dTag == "" {
s.handle404(w, r)
return
}
// Query events with this d-tag across multiple kinds
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
defer cancel()
// Search across wiki, blog, and longform kinds
kinds := []int{nostr.KindWiki, nostr.KindBlog, nostr.KindLongform}
var allEvents []*gonostr.Event
for _, kind := range kinds {
filter := gonostr.Filter{
Kinds: []int{kind},
Tags: map[string][]string{
"d": {dTag},
},
Limit: 100, // Get up to 100 events per kind
}
events, err := s.nostrClient.FetchEvents(ctx, filter)
if err != nil {
logger.WithFields(map[string]interface{}{
"kind": kind,
"dtag": dTag,
"error": err,
}).Warn("Failed to fetch events for d-tag")
continue
}
allEvents = append(allEvents, events...)
}
if len(allEvents) == 0 {
// No events found - show 404 or empty page
html, err := s.htmlGenerator.GenerateErrorPage(http.StatusNotFound, []generator.FeedItemInfo{})
if err != nil {
http.Error(w, "Not found", http.StatusNotFound)
return
}
w.Header().Set("Content-Type", "text/html; charset=utf-8")
w.WriteHeader(http.StatusNotFound)
w.Write([]byte(html))
return
}
// Convert events to EventCardInfo
eventCards := make([]generator.EventCardInfo, 0, len(allEvents))
for _, event := range allEvents {
// Extract title, summary, image from tags
title := dTag
summary := ""
image := ""
for _, tag := range event.Tags {
if len(tag) > 1 {
switch tag[0] {
case "title":
title = tag[1]
case "summary":
summary = tag[1]
case "image":
image = tag[1]
}
}
}
// Build URL based on kind:pubkey:dtag format
var url string
switch event.Kind {
case nostr.KindBlog:
// Blog uses hash format with full identifier
url = fmt.Sprintf("/blog#%d:%s:%s", event.Kind, event.PubKey, dTag)
case nostr.KindLongform:
// Articles use hash format with full identifier
url = fmt.Sprintf("/articles#%d:%s:%s", event.Kind, event.PubKey, dTag)
case nostr.KindWiki:
// Wiki uses query parameter format with event ID for faster lookup
// Fallback to ref format if needed
url = fmt.Sprintf("/wiki?e=%s&ref=%d:%s:%s", event.ID, event.Kind, event.PubKey, dTag)
default:
// Fallback
url = fmt.Sprintf("/events?d=%s", dTag)
}
// Format time
createdTime := time.Unix(int64(event.CreatedAt), 0)
timeStr := createdTime.Format("Jan 2, 2006")
timeISO := createdTime.Format(time.RFC3339)
eventCards = append(eventCards, generator.EventCardInfo{
EventID: event.ID,
Title: title,
DTag: dTag,
Author: event.PubKey,
Summary: summary,
Image: image,
Kind: event.Kind,
URL: url,
CreatedAt: int64(event.CreatedAt),
Time: timeStr,
TimeISO: timeISO,
})
}
// Generate HTML page using the HTML generator
html, err := s.htmlGenerator.GenerateEventsPage(dTag, eventCards, []generator.FeedItemInfo{})
if err != nil {
logger.WithFields(map[string]interface{}{
"dtag": dTag,
"error": err,
}).Error("Failed to generate events page")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
w.Header().Set("Content-Type", "text/html; charset=utf-8")
w.Write([]byte(html))
}
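The kind switch in `handleEvents` encodes a routing convention: blog and longform events are addressed by a `kind:pubkey:dtag` fragment, wiki events by event ID (the fast path) plus a `ref` fallback. Extracted as a standalone sketch for clarity; the helper name is illustrative, and only the wiki kind value (30818) is confirmed by the source, the other constants are stand-ins:

```go
package main

import "fmt"

// Kind constants mirroring the ones used in handleEvents.
// 30818 appears in the source's wiki ref example; the others are stand-ins.
const (
	kindBlog     = 30023
	kindLongform = 30024
	kindWiki     = 30818
)

// buildEventURL reproduces the routing convention from handleEvents.
func buildEventURL(kind int, eventID, pubkey, dTag string) string {
	switch kind {
	case kindBlog:
		return fmt.Sprintf("/blog#%d:%s:%s", kind, pubkey, dTag)
	case kindLongform:
		return fmt.Sprintf("/articles#%d:%s:%s", kind, pubkey, dTag)
	case kindWiki:
		// Event ID gives the fastest lookup; ref is the fallback
		return fmt.Sprintf("/wiki?e=%s&ref=%d:%s:%s", eventID, kind, pubkey, dTag)
	default:
		return fmt.Sprintf("/events?d=%s", dTag)
	}
}

func main() {
	fmt.Println(buildEventURL(kindWiki, "ev1", "pub1", "nkbip-04"))
}
```

Fragment URLs (`/blog#…`) rely on client-side JavaScript to resolve the reference, since fragments are never sent to the server; the wiki path avoids that by putting everything in the query string.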
// handleContact handles the contact form (GET and POST)
func (s *Server) handleContact(w http.ResponseWriter, r *http.Request) {
if r.Method == http.MethodGet {
@@ -528,8 +743,8 @@ func (s *Server) middleware(next http.Handler) http.Handler {
w.Header().Set("X-XSS-Protection", "1; mode=block")
w.Header().Set("Referrer-Policy", "strict-origin-when-cross-origin")
// CSP header - allow unpkg.com for Lucide icons and jsdelivr.net for nostr-tools
w.Header().Set("Content-Security-Policy", "default-src 'self'; script-src 'self' 'unsafe-inline' https://unpkg.com https://cdn.jsdelivr.net; style-src 'self' 'unsafe-inline'; img-src 'self' data: https:; font-src 'self' data:;")
// CSP header - all scripts and styles are served locally
w.Header().Set("Content-Security-Policy", "default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'; img-src 'self' data: https:; font-src 'self' data:;")
// Log request (only in debug mode to reduce noise)
start := time.Now()

2
internal/server/server.go

@@ -42,6 +42,8 @@ type IssueServiceInterface interface {
type HTMLGeneratorInterface interface {
GenerateContactPage(success bool, errorMsg string, eventID string, formData map[string]string, repoAnnouncement *nostr.RepoAnnouncement, feedItems []generator.FeedItemInfo, profile *nostr.Profile) (string, error)
GenerateErrorPage(statusCode int, feedItems []generator.FeedItemInfo) (string, error)
GenerateEventsPage(dTag string, eventCards []generator.EventCardInfo, feedItems []generator.FeedItemInfo) (string, error)
GenerateWikiPage(wiki *nostr.WikiEvent, wikiPages []generator.WikiPageInfo, feedItems []generator.FeedItemInfo) (string, error)
}
// NewServer creates a new HTTP server

42
package-lock.json generated

@@ -5,8 +5,10 @@
"packages": {
"": {
"dependencies": {
"@asciidoctor/core": "^3.0.4",
"marked": "^12.0.0"
"gc-parser": "git+https://git.imwald.eu/silberengel/gc-parser.git"
},
"devDependencies": {
"highlight.js": "^11.11.1"
}
},
"node_modules/@asciidoctor/core": {
@@ -57,6 +59,14 @@
"integrity": "sha512-OO0pH2lK6a0hZnAdau5ItzHPI6pUlvI7jMVnxUQRtw4owF2wk8lOSabtGDCTP4Ggrg2MbGnWO9X8K1t4+fGMDw==",
"license": "ISC"
},
"node_modules/gc-parser": {
"version": "1.0.0",
"resolved": "git+https://git.imwald.eu/silberengel/gc-parser.git#f02450c08aeaf3f96b7afd979c959473fe0235a5",
"license": "MIT",
"dependencies": {
"@asciidoctor/core": "^3.0.4"
}
},
"node_modules/glob": {
"version": "8.1.0",
"resolved": "https://registry.npmjs.org/glob/-/glob-8.1.0.tgz",
@@ -77,6 +87,16 @@
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/highlight.js": {
"version": "11.11.1",
"resolved": "https://registry.npmjs.org/highlight.js/-/highlight.js-11.11.1.tgz",
"integrity": "sha512-Xwwo44whKBVCYoliBQwaPvtd/2tYFkRQtXDWj1nackaV2JPXx3L0+Jvd8/qCJ2p+ML0/XVkJ2q+Mr+UVdpJK5w==",
"dev": true,
"license": "BSD-3-Clause",
"engines": {
"node": ">=12.0.0"
}
},
"node_modules/inflight": {
"version": "1.0.6",
"resolved": "https://registry.npmjs.org/inflight/-/inflight-1.0.6.tgz",
@@ -94,22 +114,10 @@
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
"license": "ISC"
},
"node_modules/marked": {
"version": "12.0.2",
"resolved": "https://registry.npmjs.org/marked/-/marked-12.0.2.tgz",
"integrity": "sha512-qXUm7e/YKFoqFPYPa3Ukg9xlI5cyAtGmyEIzMfW//m6kXwCy2Ps9DYf5ioijFKQ8qyuscrHoY04iJGctu2Kg0Q==",
"license": "MIT",
"bin": {
"marked": "bin/marked.js"
},
"engines": {
"node": ">= 18"
}
},
"node_modules/minimatch": {
"version": "5.1.6",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-5.1.6.tgz",
"integrity": "sha512-lKwV/1brpG6mBUFHtb7NUmtABCb2WZZmm2wNiOA5hAb8VdCS4B3dtMWyvcoViccwAW/COERjXLt0zP1zXUN26g==",
"version": "5.1.9",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-5.1.9.tgz",
"integrity": "sha512-7o1wEA2RyMP7Iu7GNba9vc0RWWGACJOCZBJX2GJWip0ikV+wcOsgVuY9uE8CPiyQhkGFSlhuSkZPavN7u1c2Fw==",
"license": "ISC",
"dependencies": {
"brace-expansion": "^2.0.1"

6
package.json

@@ -1,6 +1,8 @@
{
"dependencies": {
"@asciidoctor/core": "^3.0.4",
"marked": "^12.0.0"
"gc-parser": "git+https://git.imwald.eu/silberengel/gc-parser.git"
},
"devDependencies": {
"highlight.js": "^11.11.1"
}
}

45
scripts/process-content.js

@@ -0,0 +1,45 @@
#!/usr/bin/env node
/**
* Wrapper script to process content using gc-parser
* Called from Go code via exec
*/
const { Parser } = require('gc-parser');
// Read content from stdin
let content = '';
process.stdin.setEncoding('utf8');
process.stdin.on('data', (chunk) => {
content += chunk;
});
process.stdin.on('end', async () => {
try {
// Parse options from environment or command line args
const linkBaseURL = process.env.LINK_BASE_URL || process.argv[2] || '';
// Create parser with options
const parser = new Parser({
linkBaseURL: linkBaseURL,
enableAsciiDoc: true,
enableMarkdown: true,
enableCodeHighlighting: true,
enableLaTeX: true,
enableMusicalNotation: true,
enableNostrAddresses: true,
});
// Process content
const result = await parser.process(content);
// Output as JSON
console.log(JSON.stringify(result));
} catch (error) {
console.error(JSON.stringify({
error: error.message,
stack: error.stack,
}));
process.exit(1);
}
});
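The wrapper above reads raw content on stdin, takes the link base URL from `LINK_BASE_URL` or the first CLI argument, and prints a single JSON object on stdout (or a JSON error object on stderr with a non-zero exit). A minimal sketch of the Go side of that exec call — the `ParseResult` field names and the `processContent` helper are assumptions for illustration, not the repo's actual API:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"os/exec"
)

// ParseResult mirrors the JSON the wrapper prints; field names are assumed.
type ParseResult struct {
	HTML  string `json:"html"`
	Error string `json:"error,omitempty"`
}

// buildWrapperCmd constructs the node invocation. The link base URL is passed
// as argv[2] of the script, matching the wrapper's process.argv[2] fallback.
func buildWrapperCmd(linkBaseURL string) *exec.Cmd {
	return exec.Command("node", "scripts/process-content.js", linkBaseURL)
}

// processContent pipes content to the wrapper and decodes its JSON output.
func processContent(content, linkBaseURL string) (*ParseResult, error) {
	cmd := buildWrapperCmd(linkBaseURL)
	cmd.Stdin = bytes.NewBufferString(content)
	out, err := cmd.Output()
	if err != nil {
		return nil, fmt.Errorf("gc-parser wrapper failed: %w", err)
	}
	var res ParseResult
	if err := json.Unmarshal(out, &res); err != nil {
		return nil, fmt.Errorf("decoding wrapper output: %w", err)
	}
	return &res, nil
}

func main() {
	// Requires node and gc-parser to be installed; errors are tolerated here.
	res, err := processContent("= Title\n\nSome *AsciiDoc* text.", "https://example.com")
	if err != nil {
		fmt.Println("wrapper unavailable:", err)
		return
	}
	fmt.Println(res.HTML)
}
```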

10
static/css/highlight.js.css

@@ -0,0 +1,10 @@
pre code.hljs{display:block;overflow-x:auto;padding:1em}code.hljs{padding:3px 5px}/*!
Theme: GitHub Dark
Description: Dark theme as seen on github.com
Author: github.com
Maintainer: @Hirse
Updated: 2021-05-15
Outdated base version: https://github.com/primer/github-syntax-dark
Current colors taken from GitHub's CSS
*/.hljs{color:#c9d1d9;background:#0d1117}.hljs-doctag,.hljs-keyword,.hljs-meta .hljs-keyword,.hljs-template-tag,.hljs-template-variable,.hljs-type,.hljs-variable.language_{color:#ff7b72}.hljs-title,.hljs-title.class_,.hljs-title.class_.inherited__,.hljs-title.function_{color:#d2a8ff}.hljs-attr,.hljs-attribute,.hljs-literal,.hljs-meta,.hljs-number,.hljs-operator,.hljs-selector-attr,.hljs-selector-class,.hljs-selector-id,.hljs-variable{color:#79c0ff}.hljs-meta .hljs-string,.hljs-regexp,.hljs-string{color:#a5d6ff}.hljs-built_in,.hljs-symbol{color:#ffa657}.hljs-code,.hljs-comment,.hljs-formula{color:#8b949e}.hljs-name,.hljs-quote,.hljs-selector-pseudo,.hljs-selector-tag{color:#7ee787}.hljs-subst{color:#c9d1d9}.hljs-section{color:#1f6feb;font-weight:700}.hljs-bullet{color:#f2cc60}.hljs-emphasis{color:#c9d1d9;font-style:italic}.hljs-strong{color:#c9d1d9;font-weight:700}.hljs-addition{color:#aff5b4;background-color:#033a16}.hljs-deletion{color:#ffdcd7;background-color:#67060c}

113
static/css/main.css

@@ -2169,3 +2169,116 @@ p.icon-inline, h1.icon-inline, h2.icon-inline, h3.icon-inline, h4.icon-inline, h
font-weight: normal;
font-size: 0.95rem;
}
/* Events Page */
.events-page {
padding: 2rem 0;
}
.events-grid {
margin-top: 2rem;
}
.events-container {
display: grid;
grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));
gap: 1.5rem;
margin-top: 2rem;
}
.event-card {
background: var(--bg-secondary);
border-radius: 8px;
border: 1px solid var(--border-color);
overflow: hidden;
transition: transform 0.2s, box-shadow 0.2s;
}
.event-card:hover {
transform: translateY(-2px);
box-shadow: 0 4px 12px rgba(0, 0, 0, 0.3);
}
.event-card-link {
display: block;
text-decoration: none;
color: inherit;
}
.event-card-image {
width: 100%;
aspect-ratio: 16 / 9;
overflow: hidden;
background: var(--bg-primary);
}
.event-card-image img {
width: 100%;
height: 100%;
object-fit: cover;
}
.event-card-content {
padding: 1.5rem;
}
.event-card-header {
display: flex;
justify-content: space-between;
align-items: flex-start;
gap: 1rem;
margin-bottom: 0.75rem;
}
.event-card-title {
font-size: 1.25rem;
font-weight: 600;
color: var(--text-primary);
margin: 0;
flex: 1;
line-height: 1.3;
}
.event-card-kind {
display: flex;
align-items: center;
gap: 0.25rem;
font-size: 0.875rem;
color: var(--text-secondary);
white-space: nowrap;
}
.event-card-summary {
color: var(--text-secondary);
margin: 0 0 1rem 0;
line-height: 1.5;
}
.event-card-meta {
display: flex;
flex-wrap: wrap;
gap: 1rem;
font-size: 0.875rem;
color: var(--text-secondary);
}
.event-card-author,
.event-card-date {
display: flex;
align-items: center;
gap: 0.5rem;
}
.empty-state {
text-align: center;
padding: 3rem 2rem;
color: var(--text-secondary);
}
.empty-state p {
font-size: 1.1rem;
display: flex;
align-items: center;
justify-content: center;
gap: 0.5rem;
}

1213
static/js/highlight.min.js vendored

File diff suppressed because one or more lines are too long

8
templates/base.html

@@ -27,6 +27,9 @@
<link rel="stylesheet" href="/static/css/main.css">
<link rel="stylesheet" href="/static/css/responsive.css">
<link rel="stylesheet" href="/static/css/print.css" media="print">
<!-- Highlight.js for code syntax highlighting -->
<link rel="stylesheet" href="/static/css/highlight.js.css">
<script src="/static/js/highlight.min.js"></script>
{{if .StructuredData}}<script type="application/ld+json">{{.StructuredData}}</script>{{end}}
</head>
@@ -124,6 +127,11 @@
window.location.href = this.value;
});
}
// Initialize highlight.js for code blocks
if (typeof hljs !== 'undefined') {
hljs.highlightAll();
}
</script>
</body>
</html>

68
templates/events.html

@@ -0,0 +1,68 @@
{{define "content"}}
<article class="events-page">
<section class="hero">
<div class="hero-content">
<div class="hero-text">
<h1>Events: {{.DTag}}</h1>
<p class="lead">Found {{len .EventCards}} event{{if ne (len .EventCards) 1}}s{{end}} with this d-tag</p>
</div>
</div>
</section>
<section class="events-grid">
{{if .EventCards}}
<div class="events-container">
{{range .EventCards}}
<div class="event-card">
<a href="{{.URL}}" class="event-card-link">
{{if and .Image (ne .Image "")}}
<div class="event-card-image">
<img src="{{.Image}}" alt="{{.Title}}" />
</div>
{{end}}
<div class="event-card-content">
<div class="event-card-header">
<h3 class="event-card-title">{{.Title}}</h3>
<span class="event-card-kind">
{{if eq .Kind 30818}}
<span class="icon-inline">{{icon "book-open"}}</span> Wiki
{{else if eq .Kind 30041}}
<span class="icon-inline">{{icon "file-text"}}</span> Blog
{{else if eq .Kind 30023}}
<span class="icon-inline">{{icon "file-text"}}</span> Article
{{else}}
<span class="icon-inline">{{icon "file"}}</span> Kind {{.Kind}}
{{end}}
</span>
</div>
{{if .Summary}}
<p class="event-card-summary">{{truncate .Summary 200}}</p>
{{end}}
<div class="event-card-meta">
{{if .Author}}
<span class="event-card-author">
<span class="icon-inline">{{icon "user"}}</span>
{{template "user-badge-simple" (dict "Pubkey" .Author "Profiles" $.Profiles)}}
</span>
{{end}}
{{if .Time}}
<span class="event-card-date">
<span class="icon-inline">{{icon "clock"}}</span> {{.Time}}
</span>
{{end}}
</div>
</div>
</a>
</div>
{{end}}
</div>
{{else}}
<div class="empty-state">
<p><span class="icon-inline">{{icon "inbox"}}</span> No events found with d-tag: {{.DTag}}</p>
</div>
{{end}}
</section>
</article>
{{end}}
{{/* Feed is defined in components.html */}}

23
templates/page.html

@@ -5,17 +5,32 @@
{{if .Summary}}<p class="page-summary">{{.Summary}}</p>{{end}}
</header>
<div class="page-content">
{{.Content}}
</div>
{{if .TableOfContents}}
<aside class="table-of-contents">
<h2><span class="icon-inline">{{icon "list"}}</span> Table of Contents</h2>
{{.TableOfContents}}
</aside>
{{end}}
<div class="page-content">
{{.Content}}
</div>
</article>
<script>
// Handle hash-based wiki URLs: convert /wiki#kind:pubkey:dtag to /wiki?ref=kind:pubkey:dtag
(function() {
if (window.location.pathname === '/wiki' || window.location.pathname === '/wiki/') {
const hash = window.location.hash.substring(1); // Remove the '#'
if (hash && /^\d+:[a-fA-F0-9]+:[^:]+$/.test(hash)) {
// Hash matches format: kind:pubkey:dtag
// Convert to query parameter and reload
const newUrl = window.location.pathname + '?ref=' + encodeURIComponent(hash);
window.location.replace(newUrl);
}
}
})();
</script>
{{end}}
{{/* Feed is defined in components.html */}}
