Compare commits
No commits in common. '2c56e353c1b6a5cd8d0e21ba67f28de4ba52b142' and 'e472bfd10826ac2205fcd90285c768e45afa0914' have entirely different histories.
22 changed files with 725 additions and 2754 deletions
@@ -0,0 +1,235 @@

# Docker Setup for GitCitadel Online

This guide explains how to run GitCitadel Online using Docker.

## Prerequisites

- Docker (version 20.10 or later)
- Docker Compose (version 2.0 or later, optional but recommended)
- Network access (for downloading dependencies and connecting to Nostr relays)

## Image Details

The Docker image uses **Alpine Linux** for a smaller footprint (~50 MB base image vs. ~200 MB+ for Debian). This works well because:

- The Go binary is statically compiled (`CGO_ENABLED=0`), so it has no C library dependencies
- The Node.js packages (`@asciidoctor/core`, `marked`) are pure JavaScript with no native bindings
- Alpine's musl libc is sufficient for our use case

If you encounter compatibility issues, you can switch the Dockerfile to Debian-based images (`golang:1.22` and `node:20-slim`), though this increases the image size.
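As a rough sketch, a multi-stage build along these lines produces the image described above. The stage names, paths, and install commands here are illustrative, not the project's actual Dockerfile:

```dockerfile
# Build stage: compile a static Go binary (no C library dependencies,
# so it runs unchanged on Alpine's musl libc)
FROM golang:1.22-alpine AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /gitcitadel-online .

# Runtime stage: Node.js is needed at runtime for the pure-JS renderers
FROM node:20-alpine
WORKDIR /app
RUN npm install @asciidoctor/core marked
COPY --from=build /gitcitadel-online /app/gitcitadel-online
EXPOSE 8080
ENTRYPOINT ["/app/gitcitadel-online"]
```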
## Quick Start

### Using Docker Compose (Recommended)

1. **Create your configuration file:**
   ```bash
   cp config.yaml.example config.yaml
   # Edit config.yaml with your Nostr indices, relay URLs, and settings
   ```

2. **Build and run:**
   ```bash
   docker-compose up -d
   ```

3. **View logs:**
   ```bash
   docker-compose logs -f
   ```

4. **Stop the container:**
   ```bash
   docker-compose down
   ```

### Using Docker directly

1. **Build the image:**
   ```bash
   docker build -t gitcitadel-online .
   ```

2. **Create the config file:**
   ```bash
   cp config.yaml.example config.yaml
   # Edit config.yaml with your settings
   ```

3. **Run the container:**
   ```bash
   docker run -d \
     --name gitcitadel-online \
     -p 8080:8080 \
     -v $(pwd)/config.yaml:/app/config.yaml:ro \
     -v $(pwd)/cache:/app/cache \
     --restart unless-stopped \
     gitcitadel-online
   ```

4. **View logs:**
   ```bash
   docker logs -f gitcitadel-online
   ```

5. **Stop the container:**
   ```bash
   docker stop gitcitadel-online
   docker rm gitcitadel-online
   ```

## Configuration

### Config File

The `config.yaml` file must be mounted into the container. The default path is `/app/config.yaml`.

You can override the config path with the `--config` flag:

```bash
docker run ... gitcitadel-online --config /path/to/config.yaml
```

### Port Mapping

By default, the application listens on port 8080. You can change the host port mapping:

```bash
# Map to a different host port
docker run -p 3000:8080 ...
```

Or update `docker-compose.yml`:

```yaml
ports:
  - "3000:8080"
```

### Cache Persistence

The cache directory (`cache/`) is persisted as a volume so that cached pages and media survive container restarts.
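In `docker-compose.yml`, the same persistence can be expressed with bind mounts that mirror the `docker run -v` flags shown earlier. The service name below is illustrative:

```yaml
services:
  gitcitadel-online:
    # Mirrors the -v flags from the docker run example above
    volumes:
      - ./config.yaml:/app/config.yaml:ro
      - ./cache:/app/cache
```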
### Environment Variables

You can pass environment variables, though most configuration should live in `config.yaml`:

```bash
docker run -e LOG_LEVEL=debug ...
```

## Development Mode

To run in development mode with verbose logging:

```bash
docker run ... gitcitadel-online --dev
```

Or, with docker-compose, override the command:

```yaml
command: ["--config", "/app/config.yaml", "--dev"]
```

## Health Check

The container includes a health check that monitors the `/health` endpoint. To check the health status:

```bash
docker ps
# Look for "healthy" status

# Or inspect directly
docker inspect --format='{{.State.Health.Status}}' gitcitadel-online
```
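The check itself is declared in the Dockerfile. A typical declaration for the `/health` endpoint looks like this sketch; the interval, timeout, and retry values are illustrative, not necessarily what this image uses:

```dockerfile
# Assumes wget is available in the image (busybox wget ships with Alpine)
HEALTHCHECK --interval=30s --timeout=5s --retries=3 \
  CMD wget -qO- http://127.0.0.1:8080/health || exit 1
```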
## Troubleshooting

### Container won't start

1. **Check the logs:**
   ```bash
   docker logs gitcitadel-online
   ```

2. **Verify the config file:**
   ```bash
   docker exec gitcitadel-online cat /app/config.yaml
   ```

3. **Check file permissions:**
   The container runs as a non-root user (UID 1000). Ensure the cache directory is writable:
   ```bash
   chmod -R 777 cache/
   ```

### Can't connect to Nostr relays

- Ensure the container has network access
- Check firewall rules if running on a remote server
- Verify that the relay URLs in `config.yaml` are correct

### Cache not persisting

- Ensure the cache volume is properly mounted
- Check volume permissions
- Verify that the cache directory exists and is writable

## Building for Different Architectures

The Dockerfile builds for `linux/amd64` by default. To build for other architectures:

```bash
# For ARM64 (e.g., Raspberry Pi, Apple Silicon)
docker buildx build --platform linux/arm64 -t gitcitadel-online .

# For multiple architectures
docker buildx build --platform linux/amd64,linux/arm64 -t gitcitadel-online .
```

## Production Deployment

For production deployment:

1. **Use a reverse proxy** (nginx, Traefik, Apache, etc.) in front of the container
2. **Set up SSL/TLS** certificates
3. **Configure proper logging** and monitoring
4. **Use secrets management** for sensitive configuration
5. **Set resource limits** in `docker-compose.yml`:
   ```yaml
   deploy:
     resources:
       limits:
         cpus: '1'
         memory: 512M
   ```

## Apache Reverse Proxy Setup

The `docker-compose.yml` is configured to expose the container on port **2323** for Apache reverse proxy integration.

### Port Configuration

The Docker container exposes port 2323 on the host, which maps to port 8080 inside the container. This matches Apache configurations that proxy to `127.0.0.1:2323`.

If you need a different port, change `"2323:8080"` in `docker-compose.yml` to your desired mapping.

**Note:** For Plesk-managed Apache servers, configure the reverse proxy settings through the Plesk control panel. The Docker container is ready to accept connections on port 2323.
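For a hand-managed Apache, a matching vhost might look like the following sketch. The `ServerName` is a placeholder, and `mod_proxy` plus `mod_proxy_http` must be enabled:

```apache
<VirtualHost *:443>
    ServerName example.org
    ProxyPreserveHost On
    # Forward everything to the container exposed on 127.0.0.1:2323
    ProxyPass        / http://127.0.0.1:2323/
    ProxyPassReverse / http://127.0.0.1:2323/
</VirtualHost>
```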
## Updating

To update to a new version:

```bash
# Pull the latest code
git pull

# Rebuild and restart
docker-compose build
docker-compose up -d
```

Or with Docker directly:

```bash
docker build -t gitcitadel-online .
docker stop gitcitadel-online
docker rm gitcitadel-online
docker run ... # (same command as before)
```
@@ -1,90 +1,392 @@
 package asciidoc

 import (
-	"encoding/json"
+	"bytes"
 	"fmt"
 	"os/exec"
-	"path/filepath"
+	"regexp"
 	"strings"
 )

-// Processor handles content processing using gc-parser
+// Processor handles AsciiDoc to HTML conversion
 type Processor struct {
 	linkBaseURL string
-	scriptPath  string
 }

 // ProcessResult contains the processed HTML content and extracted table of contents
 type ProcessResult struct {
 	Content         string
 	TableOfContents string
-	HasLaTeX           bool
-	HasMusicalNotation bool
 }

-// gcParserResult matches the JSON output from gc-parser
-type gcParserResult struct {
-	Content            string        `json:"content"`
-	TableOfContents    string        `json:"tableOfContents"`
-	HasLaTeX           bool          `json:"hasLaTeX"`
-	HasMusicalNotation bool          `json:"hasMusicalNotation"`
-	NostrLinks         []interface{} `json:"nostrLinks"`
-	Wikilinks          []interface{} `json:"wikilinks"`
-	Hashtags           []string      `json:"hashtags"`
-	Links              []interface{} `json:"links"`
-	Media              []string      `json:"media"`
-	Error              string        `json:"error,omitempty"`
-}
-
-// NewProcessor creates a new content processor using gc-parser
+// NewProcessor creates a new AsciiDoc processor
 func NewProcessor(linkBaseURL string) *Processor {
-	// Determine script path relative to the executable
-	// In production, the script should be in the same directory as the binary
-	scriptPath := filepath.Join("scripts", "process-content.js")
-
 	return &Processor{
 		linkBaseURL: linkBaseURL,
-		scriptPath:  scriptPath,
 	}
 }

-// Process converts content (AsciiDoc, Markdown, etc.) to HTML using gc-parser
+// Process converts AsciiDoc content to HTML with link rewriting
 // Returns both the content HTML and the extracted table of contents
-func (p *Processor) Process(content string) (*ProcessResult, error) {
+func (p *Processor) Process(asciidocContent string) (*ProcessResult, error) {
+	// First, rewrite links in the AsciiDoc content
+	processedContent := p.rewriteLinks(asciidocContent)
+
+	// Convert AsciiDoc to HTML using asciidoctor.js
+	html, err := p.convertToHTML(processedContent)
+	if err != nil {
+		return nil, fmt.Errorf("failed to convert AsciiDoc to HTML: %w", err)
+	}
+
+	// Extract table of contents from HTML
+	toc, contentWithoutTOC := p.extractTOC(html)
+
+	// Sanitize HTML to prevent XSS
+	sanitized := p.sanitizeHTML(contentWithoutTOC)
+
+	// Process links: make external links open in new tab, local links in same tab
+	processed := p.processLinks(sanitized)
+
+	// Also sanitize and process links in the TOC
+	tocSanitized := p.sanitizeHTML(toc)
+	tocProcessed := p.processLinks(tocSanitized)
+
+	return &ProcessResult{
+		Content:         processed,
+		TableOfContents: tocProcessed,
+	}, nil
+}
+
+// rewriteLinks rewrites wikilinks and nostr: links in AsciiDoc content
+func (p *Processor) rewriteLinks(content string) string {
+	// Rewrite wikilinks: [[target]] or [[target|display text]]
+	// Format: [[target]] -> https://alexandria.gitcitadel.eu/events?d=<normalized-d-tag>
+	wikilinkRegex := regexp.MustCompile(`\[\[([^\]]+)\]\]`)
+	content = wikilinkRegex.ReplaceAllStringFunc(content, func(match string) string {
+		// Extract the content inside [[ ]]
+		inner := match[2 : len(match)-2]
+
+		var target, display string
+		if strings.Contains(inner, "|") {
+			parts := strings.SplitN(inner, "|", 2)
+			target = strings.TrimSpace(parts[0])
+			display = strings.TrimSpace(parts[1])
+		} else {
+			target = strings.TrimSpace(inner)
+			display = target
+		}
+
+		// Normalize the d tag (convert to lowercase, replace spaces with hyphens, etc.)
+		normalized := normalizeDTag(target)
+
+		// Create the link
+		url := fmt.Sprintf("%s/events?d=%s", p.linkBaseURL, normalized)
+		return fmt.Sprintf("link:%s[%s]", url, display)
+	})
+
+	// Rewrite nostr: links: nostr:naddr1... or nostr:nevent1...
+	// Format: nostr:naddr1... -> https://alexandria.gitcitadel.eu/events?id=naddr1...
+	nostrLinkRegex := regexp.MustCompile(`nostr:(naddr1[^\s\]]+|nevent1[^\s\]]+)`)
+	content = nostrLinkRegex.ReplaceAllStringFunc(content, func(match string) string {
+		nostrID := strings.TrimPrefix(match, "nostr:")
+		url := fmt.Sprintf("%s/events?id=%s", p.linkBaseURL, nostrID)
+		return url
+	})
+
+	return content
+}
+
+// normalizeDTag normalizes a d tag according to NIP-54 rules
+func normalizeDTag(dTag string) string {
+	// Convert to lowercase
+	dTag = strings.ToLower(dTag)
+
+	// Convert whitespace to hyphens
+	dTag = strings.ReplaceAll(dTag, " ", "-")
+	dTag = strings.ReplaceAll(dTag, "\t", "-")
+	dTag = strings.ReplaceAll(dTag, "\n", "-")
+
+	// Remove punctuation and symbols (keep alphanumeric, hyphens, and non-ASCII)
+	var result strings.Builder
+	for _, r := range dTag {
+		if (r >= 'a' && r <= 'z') || (r >= '0' && r <= '9') || r == '-' || r > 127 {
+			result.WriteRune(r)
+		}
+	}
+	dTag = result.String()
+
+	// Collapse multiple consecutive hyphens
+	for strings.Contains(dTag, "--") {
+		dTag = strings.ReplaceAll(dTag, "--", "-")
+	}
+
+	// Remove leading and trailing hyphens
+	dTag = strings.Trim(dTag, "-")
+
+	return dTag
+}
+
+// convertToHTML converts AsciiDoc to HTML using asciidoctor.js via Node.js
+func (p *Processor) convertToHTML(asciidocContent string) (string, error) {
 	// Check if node is available
 	cmd := exec.Command("node", "--version")
 	if err := cmd.Run(); err != nil {
-		return nil, fmt.Errorf("node.js not found: %w", err)
+		return "", fmt.Errorf("node.js not found: %w", err)
 	}

-	// Run gc-parser script
-	cmd = exec.Command("node", p.scriptPath, p.linkBaseURL)
-	cmd.Stdin = strings.NewReader(content)
+	// JavaScript code to run asciidoctor.js
+	// Read content from stdin to handle special characters properly
+	jsCode := `
+const asciidoctor = require('@asciidoctor/core')();
+
+let content = '';
+process.stdin.setEncoding('utf8');
+
+process.stdin.on('data', (chunk) => {
+	content += chunk;
+});
+
+process.stdin.on('end', () => {
+	try {
+		const html = asciidoctor.convert(content, {
+			safe: 'safe',
+			backend: 'html5',
+			doctype: 'article',
+			attributes: {
+				'showtitle': true,
+				'icons': 'font',
+				'sectanchors': true,
+				'sectlinks': true,
+				'toc': 'left',
+				'toclevels': 3
+			}
+		});
+		process.stdout.write(html);
+	} catch (error) {
+		console.error('Error converting AsciiDoc:', error.message);
+		process.exit(1);
+	}
+});
+`
+
+	// Run node with the JavaScript code, passing content via stdin
+	cmd = exec.Command("node", "-e", jsCode)
+	cmd.Stdin = strings.NewReader(asciidocContent)

-	var stdout, stderr strings.Builder
+	var stdout, stderr bytes.Buffer
 	cmd.Stdout = &stdout
 	cmd.Stderr = &stderr

 	if err := cmd.Run(); err != nil {
-		return nil, fmt.Errorf("gc-parser failed: %w, stderr: %s", err, stderr.String())
+		return "", fmt.Errorf("asciidoctor.js conversion failed: %w, stderr: %s", err, stderr.String())
 	}

-	// Parse JSON output
-	var result gcParserResult
-	output := stdout.String()
-	if err := json.Unmarshal([]byte(output), &result); err != nil {
-		return nil, fmt.Errorf("failed to parse gc-parser output: %w, output: %s", err, output)
-	}
-
-	// Check for error in result
-	if result.Error != "" {
-		return nil, fmt.Errorf("gc-parser error: %s", result.Error)
-	}
-
-	return &ProcessResult{
-		Content:            result.Content,
-		TableOfContents:    result.TableOfContents,
-		HasLaTeX:           result.HasLaTeX,
-		HasMusicalNotation: result.HasMusicalNotation,
-	}, nil
-}
+	return stdout.String(), nil
+}
+
+// sanitizeHTML performs basic HTML sanitization to prevent XSS
+// Note: This is a basic implementation. For production, consider using a proper HTML sanitizer library
+func (p *Processor) sanitizeHTML(html string) string {
+	// Remove script tags and their content
+	scriptRegex := regexp.MustCompile(`(?i)<script[^>]*>.*?</script>`)
+	html = scriptRegex.ReplaceAllString(html, "")
+
+	// Remove event handlers (onclick, onerror, etc.)
+	eventHandlerRegex := regexp.MustCompile(`(?i)\s*on\w+\s*=\s*["'][^"']*["']`)
+	html = eventHandlerRegex.ReplaceAllString(html, "")
+
+	// Remove javascript: protocol in links
+	javascriptRegex := regexp.MustCompile(`(?i)javascript:`)
+	html = javascriptRegex.ReplaceAllString(html, "")
+
+	// Remove data: URLs that could be dangerous
+	dataURLRegex := regexp.MustCompile(`(?i)data:\s*text/html`)
+	html = dataURLRegex.ReplaceAllString(html, "")
+
+	return html
+}
+
+// extractTOC extracts the table of contents from AsciiDoc HTML output
+// Returns the TOC HTML and the content HTML without the TOC
+func (p *Processor) extractTOC(html string) (string, string) {
+	// AsciiDoc with toc: 'left' generates a TOC in a div with id="toc" or class="toc"
+	// We need to match the entire TOC div including nested content
+	// Since divs can be nested, we need to count opening/closing tags
+
+	var tocContent string
+	contentWithoutTOC := html
+
+	// Find the start of the TOC div - try multiple patterns
+	tocStartPatterns := []*regexp.Regexp{
+		// Pattern 1: <div id="toc" class="toc">
+		regexp.MustCompile(`(?i)<div\s+id=["']toc["']\s+class=["']toc["'][^>]*>`),
+		// Pattern 2: <div id="toc">
+		regexp.MustCompile(`(?i)<div\s+id=["']toc["'][^>]*>`),
+		// Pattern 3: <div class="toc">
+		regexp.MustCompile(`(?i)<div\s+class=["']toc["'][^>]*>`),
+		// Pattern 4: <nav id="toc">
+		regexp.MustCompile(`(?i)<nav\s+id=["']toc["'][^>]*>`),
+	}
+
+	var tocStartIdx int = -1
+	var tocStartTag string
+
+	for _, pattern := range tocStartPatterns {
+		loc := pattern.FindStringIndex(html)
+		if loc != nil {
+			tocStartIdx = loc[0]
+			tocStartTag = html[loc[0]:loc[1]]
+			break
+		}
+	}
+
+	if tocStartIdx == -1 {
+		// No TOC found
+		return "", html
+	}
+
+	// Find the matching closing tag by counting div tags
+	// Start after the opening tag
+	searchStart := tocStartIdx + len(tocStartTag)
+	depth := 1
+	i := searchStart
+
+	for i < len(html) && depth > 0 {
+		// Look for opening or closing div/nav tags
+		if i+4 < len(html) && html[i:i+4] == "<div" {
+			// Check if it's a closing tag
+			if i+5 < len(html) && html[i+4] == '/' {
+				depth--
+				// Find the end of this closing tag
+				closeIdx := strings.Index(html[i:], ">")
+				if closeIdx == -1 {
+					break
+				}
+				i += closeIdx + 1
+			} else {
+				// Opening tag - find the end
+				closeIdx := strings.Index(html[i:], ">")
+				if closeIdx == -1 {
+					break
+				}
+				// Check if it's self-closing
+				if html[i+closeIdx-1] != '/' {
+					depth++
+				}
+				i += closeIdx + 1
+			}
+		} else if i+5 < len(html) && html[i:i+5] == "</div" {
+			depth--
+			closeIdx := strings.Index(html[i:], ">")
+			if closeIdx == -1 {
+				break
+			}
+			i += closeIdx + 1
+		} else if i+5 < len(html) && html[i:i+5] == "</nav" {
+			depth--
+			closeIdx := strings.Index(html[i:], ">")
+			if closeIdx == -1 {
+				break
+			}
+			i += closeIdx + 1
+		} else {
+			i++
+		}
+	}
+
+	if depth == 0 {
+		// Found the matching closing tag
+		tocEndIdx := i
+		// Extract the TOC content (inner HTML)
+		tocFullHTML := html[tocStartIdx:tocEndIdx]
+		// Extract just the inner content (without the outer div tags)
+		innerStart := len(tocStartTag)
+		innerEnd := len(tocFullHTML)
+		// Strip the trailing closing tag; both "</div>" and "</nav>" are six characters
+		if strings.HasSuffix(tocFullHTML, "</div>") {
+			innerEnd -= 6
+		} else if strings.HasSuffix(tocFullHTML, "</nav>") {
+			innerEnd -= 6
+		}
+		tocContent = strings.TrimSpace(tocFullHTML[innerStart:innerEnd])
+
+		// Remove the toctitle div if present (AsciiDoc adds "Table of Contents" title)
+		toctitlePattern := regexp.MustCompile(`(?s)<div\s+id=["']toctitle["'][^>]*>.*?</div>\s*`)
+		tocContent = toctitlePattern.ReplaceAllString(tocContent, "")
+		tocContent = strings.TrimSpace(tocContent)
+
+		// Remove the TOC from the content
+		contentWithoutTOC = html[:tocStartIdx] + html[tocEndIdx:]
+	}
+
+	return tocContent, contentWithoutTOC
+}
+
+// processLinks processes HTML links to add target="_blank" to external links
+// External links are those that start with http:// or https:// and don't point to the linkBaseURL domain
+// Local links (including relative links and links to linkBaseURL) open in the same tab
+func (p *Processor) processLinks(html string) string {
+	// Extract domain from linkBaseURL for comparison
+	linkBaseDomain := ""
+	if strings.HasPrefix(p.linkBaseURL, "http://") || strings.HasPrefix(p.linkBaseURL, "https://") {
+		// Extract domain (e.g., "alexandria.gitcitadel.eu" from "https://alexandria.gitcitadel.eu")
+		parts := strings.Split(strings.TrimPrefix(strings.TrimPrefix(p.linkBaseURL, "https://"), "http://"), "/")
+		if len(parts) > 0 {
+			linkBaseDomain = parts[0]
+		}
+	}
+
+	// Regex to match <a> tags with href attributes (more flexible pattern)
+	linkRegex := regexp.MustCompile(`<a\s+([^>]*?)href\s*=\s*["']([^"']+)["']([^>]*?)>`)
+
+	html = linkRegex.ReplaceAllStringFunc(html, func(match string) string {
+		// Extract href value
+		hrefMatch := regexp.MustCompile(`href\s*=\s*["']([^"']+)["']`)
+		hrefSubmatch := hrefMatch.FindStringSubmatch(match)
+		if len(hrefSubmatch) < 2 {
+			return match // No href found, return as-is
+		}
+		href := hrefSubmatch[1]
+
+		// Check if it's an external link (starts with http:// or https://)
+		isExternal := strings.HasPrefix(href, "http://") || strings.HasPrefix(href, "https://")
+
+		if isExternal {
+			// Check if it's pointing to our own domain
+			if linkBaseDomain != "" && strings.Contains(href, linkBaseDomain) {
+				// Same domain - open in same tab (remove any existing target attribute)
+				targetRegex := regexp.MustCompile(`\s*target\s*=\s*["'][^"']*["']`)
+				match = targetRegex.ReplaceAllString(match, "")
+				return match
+			}
+
+			// External link - add target="_blank" and rel="noopener noreferrer" if not already present
+			if !strings.Contains(match, `target=`) {
+				// Insert before the closing >
+				match = strings.TrimSuffix(match, ">")
+				if !strings.Contains(match, `rel=`) {
+					match += ` target="_blank" rel="noopener noreferrer">`
+				} else {
+					// Update existing rel attribute to include noopener if not present
+					relRegex := regexp.MustCompile(`rel\s*=\s*["']([^"']*)["']`)
+					match = relRegex.ReplaceAllStringFunc(match, func(relMatch string) string {
+						relValue := relRegex.FindStringSubmatch(relMatch)[1]
+						if !strings.Contains(relValue, "noopener") {
+							relValue += " noopener noreferrer"
+						}
+						return `rel="` + strings.TrimSpace(relValue) + `"`
+					})
+					match += ` target="_blank">`
+				}
+			}
+		} else {
+			// Local/relative link - ensure it opens in same tab (remove target if present)
+			targetRegex := regexp.MustCompile(`\s*target\s*=\s*["'][^"']*["']`)
+			match = targetRegex.ReplaceAllString(match, "")
+		}

+		return match
+	})
+
+	return html
+}
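The NIP-54 d-tag normalization above is a pure function, so it can be tried in isolation. This sketch copies `normalizeDTag` into a small standalone program; the sample inputs are illustrative:

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeDTag is copied from the processor so the NIP-54 normalization
// can be exercised without the rest of the package.
func normalizeDTag(dTag string) string {
	dTag = strings.ToLower(dTag)
	// Whitespace becomes hyphens
	dTag = strings.ReplaceAll(dTag, " ", "-")
	dTag = strings.ReplaceAll(dTag, "\t", "-")
	dTag = strings.ReplaceAll(dTag, "\n", "-")
	// Keep alphanumerics, hyphens, and non-ASCII; drop punctuation
	var result strings.Builder
	for _, r := range dTag {
		if (r >= 'a' && r <= 'z') || (r >= '0' && r <= '9') || r == '-' || r > 127 {
			result.WriteRune(r)
		}
	}
	dTag = result.String()
	// Collapse runs of hyphens, then trim
	for strings.Contains(dTag, "--") {
		dTag = strings.ReplaceAll(dTag, "--", "-")
	}
	return strings.Trim(dTag, "-")
}

func main() {
	fmt.Println(normalizeDTag("Hello,  World!")) // prints "hello-world"
}
```

So a wikilink such as `[[Hello,  World!]]` would be rewritten to `<linkBaseURL>/events?d=hello-world`.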
@@ -1,89 +0,0 @@
package cache

import (
	"bytes"
	"compress/gzip"
	"sync"
	"time"
)

// Cache stores generated HTML pages
type Cache struct {
	pages map[string]*CachedPage
	mu    sync.RWMutex
}

// NewCache creates a new cache
func NewCache() *Cache {
	return &Cache{
		pages: make(map[string]*CachedPage),
	}
}

// Get retrieves a page from cache
func (c *Cache) Get(path string) (*CachedPage, bool) {
	c.mu.RLock()
	defer c.mu.RUnlock()
	page, exists := c.pages[path]
	return page, exists
}

// Set stores a page in cache
func (c *Cache) Set(path string, content string) error {
	c.mu.Lock()
	defer c.mu.Unlock()

	// Generate ETag
	etag := GenerateETag(content)

	// Pre-compress content
	var compressed bytes.Buffer
	writer := gzip.NewWriter(&compressed)
	if _, err := writer.Write([]byte(content)); err != nil {
		return err
	}
	if err := writer.Close(); err != nil {
		return err
	}

	c.pages[path] = &CachedPage{
		Content:     content,
		ETag:        etag,
		LastUpdated: time.Now(),
		Compressed:  compressed.Bytes(),
	}

	return nil
}

// Delete removes a page from cache
func (c *Cache) Delete(path string) {
	c.mu.Lock()
	defer c.mu.Unlock()
	delete(c.pages, path)
}

// Clear clears all cached pages
func (c *Cache) Clear() {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.pages = make(map[string]*CachedPage)
}

// Size returns the number of cached pages
func (c *Cache) Size() int {
	c.mu.RLock()
	defer c.mu.RUnlock()
	return len(c.pages)
}

// GetAllPaths returns all cached page paths
func (c *Cache) GetAllPaths() []string {
	c.mu.RLock()
	defer c.mu.RUnlock()
	paths := make([]string, 0, len(c.pages))
	for path := range c.pages {
		paths = append(paths, path)
	}
	return paths
}
@@ -1,54 +0,0 @@
package cache

import (
	"sync"
	"time"
)

// FeedItem represents a cached feed item
type FeedItem struct {
	EventID string
	Author  string
	Content string
	Time    time.Time
	Link    string
	Title   string
	Summary string
	Image   string
}

// FeedCache stores the kind 1 feed
type FeedCache struct {
	items       []FeedItem
	mu          sync.RWMutex
	lastUpdated time.Time
}

// NewFeedCache creates a new feed cache
func NewFeedCache() *FeedCache {
	return &FeedCache{
		items: make([]FeedItem, 0),
	}
}

// Set updates the feed cache
func (fc *FeedCache) Set(items []FeedItem) {
	fc.mu.Lock()
	defer fc.mu.Unlock()
	fc.items = items
	fc.lastUpdated = time.Now()
}

// Get retrieves feed items
func (fc *FeedCache) Get() []FeedItem {
	fc.mu.RLock()
	defer fc.mu.RUnlock()
	return fc.items
}

// GetLastUpdated returns when the feed was last updated
func (fc *FeedCache) GetLastUpdated() time.Time {
	fc.mu.RLock()
	defer fc.mu.RUnlock()
	return fc.lastUpdated
}
@ -1,198 +0,0 @@
package cache

import (
	"context"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
	"io"
	"net/http"
	"os"
	"path/filepath"
	"sync"
	"time"

	"gitcitadel-online/internal/logger"
)

// MediaCache handles caching of images and other media from events
type MediaCache struct {
	cacheDir     string
	activeEvents map[string]time.Time // eventID -> last seen time
	mu           sync.RWMutex
	httpClient   *http.Client
}

// NewMediaCache creates a new media cache
func NewMediaCache(cacheDir string) (*MediaCache, error) {
	// Create cache directory if it doesn't exist
	if err := os.MkdirAll(cacheDir, 0755); err != nil {
		return nil, fmt.Errorf("failed to create media cache directory: %w", err)
	}

	mc := &MediaCache{
		cacheDir:     cacheDir,
		activeEvents: make(map[string]time.Time),
		httpClient: &http.Client{
			Timeout: 30 * time.Second,
		},
	}

	// Start cleanup goroutine
	go mc.cleanupLoop(context.Background())

	return mc, nil
}

// CacheMedia downloads and caches a media file from a URL.
// Returns the local path to the cached file, or the original URL if caching fails.
func (mc *MediaCache) CacheMedia(ctx context.Context, url string, eventID string) (string, error) {
	if url == "" {
		return "", fmt.Errorf("empty URL")
	}

	// Mark event as active
	mc.mu.Lock()
	mc.activeEvents[eventID] = time.Now()
	mc.mu.Unlock()

	// Generate cache filename from URL hash
	hash := sha256.Sum256([]byte(url))
	filename := hex.EncodeToString(hash[:]) + filepath.Ext(url)
	cachePath := filepath.Join(mc.cacheDir, filename)

	// Check if already cached
	if _, err := os.Stat(cachePath); err == nil {
		return "/cache/media/" + filename, nil
	}

	// Download the media
	req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
	if err != nil {
		return url, fmt.Errorf("failed to create request: %w", err)
	}

	// Set user agent
	req.Header.Set("User-Agent", "GitCitadel-Online/1.0")

	resp, err := mc.httpClient.Do(req)
	if err != nil {
		logger.WithFields(map[string]interface{}{
			"url":     url,
			"eventID": eventID,
			"error":   err,
		}).Warn("Failed to download media")
		return url, fmt.Errorf("failed to download: %w", err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusOK {
		return url, fmt.Errorf("unexpected status code: %d", resp.StatusCode)
	}

	// Check content type - only cache images
	contentType := resp.Header.Get("Content-Type")
	if !isImageContentType(contentType) {
		logger.WithFields(map[string]interface{}{
			"url":         url,
			"contentType": contentType,
		}).Debug("Skipping non-image media")
		return url, nil
	}

	// Create cache file
	file, err := os.Create(cachePath)
	if err != nil {
		return url, fmt.Errorf("failed to create cache file: %w", err)
	}
	defer file.Close()

	// Copy response to file
	_, err = io.Copy(file, resp.Body)
	if err != nil {
		os.Remove(cachePath) // Clean up on error
		return url, fmt.Errorf("failed to write cache file: %w", err)
	}

	logger.WithFields(map[string]interface{}{
		"url":       url,
		"eventID":   eventID,
		"cachePath": cachePath,
	}).Debug("Cached media file")

	return "/cache/media/" + filename, nil
}

// GetCacheDir returns the cache directory path
func (mc *MediaCache) GetCacheDir() string {
	return mc.cacheDir
}

// isImageContentType checks if a content type is an image
func isImageContentType(contentType string) bool {
	imageTypes := []string{
		"image/jpeg",
		"image/jpg",
		"image/png",
		"image/gif",
		"image/webp",
		"image/svg+xml",
		"image/bmp",
		"image/x-icon",
	}
	for _, imgType := range imageTypes {
		if contentType == imgType {
			return true
		}
	}
	return false
}

// MarkEventActive marks an event as currently active (displayed)
func (mc *MediaCache) MarkEventActive(eventID string) {
	mc.mu.Lock()
	defer mc.mu.Unlock()
	mc.activeEvents[eventID] = time.Now()
}

// cleanupLoop periodically prunes tracking for events that are no longer active
func (mc *MediaCache) cleanupLoop(ctx context.Context) {
	ticker := time.NewTicker(1 * time.Hour) // Run cleanup every hour
	defer ticker.Stop()

	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			mc.cleanup()
		}
	}
}

// cleanup removes tracking entries for events that haven't been seen in 24 hours
func (mc *MediaCache) cleanup() {
	mc.mu.Lock()
	defer mc.mu.Unlock()

	cutoff := time.Now().Add(-24 * time.Hour)
	var toRemove []string

	// Find events that are no longer active
	for eventID, lastSeen := range mc.activeEvents {
		if lastSeen.Before(cutoff) {
			toRemove = append(toRemove, eventID)
		}
	}

	// Remove inactive events from tracking
	for _, eventID := range toRemove {
		delete(mc.activeEvents, eventID)
	}

	// Note: We don't delete the actual files here because multiple events might use the same image.
	// Instead, we rely on the fact that if an event is no longer displayed, its media won't be accessed.
	// A more sophisticated cleanup would track which files are used by which events.

	logger.WithField("removed_events", len(toRemove)).Debug("Cleaned up inactive events from media cache")
}
@ -1,29 +0,0 @@
package cache

import (
	"fmt"
	"time"
)

// CachedPage represents a cached HTML page
type CachedPage struct {
	Content     string
	ETag        string
	LastUpdated time.Time
	Compressed  []byte // Pre-compressed gzip content
}

// IsStale checks if the cached page is stale based on maxAge
func (cp *CachedPage) IsStale(maxAge time.Duration) bool {
	return time.Since(cp.LastUpdated) > maxAge
}

// GenerateETag generates an ETag for the content
func GenerateETag(content string) string {
	// Simple ETag based on content hash
	hash := 0
	for _, b := range []byte(content) {
		hash = hash*31 + int(b)
	}
	return fmt.Sprintf(`"%x"`, hash)
}
@ -1,411 +0,0 @@
package cache

import (
	"context"
	"html/template"
	"time"

	"gitcitadel-online/internal/generator"
	"gitcitadel-online/internal/logger"
	"gitcitadel-online/internal/nostr"
)

// Rewarmer handles cache rewarming
type Rewarmer struct {
	cache         *Cache
	feedCache     *FeedCache
	wikiService   *nostr.WikiService
	feedService   *nostr.FeedService
	ebooksService *nostr.EBooksService
	htmlGenerator *generator.HTMLGenerator
	wikiIndex     string
	blogIndex     string
	feedRelay     string
	maxFeedEvents int
	interval      time.Duration
	feedInterval  time.Duration
}

// NewRewarmer creates a new cache rewarming service
func NewRewarmer(
	cache *Cache,
	feedCache *FeedCache,
	wikiService *nostr.WikiService,
	feedService *nostr.FeedService,
	ebooksService *nostr.EBooksService,
	htmlGenerator *generator.HTMLGenerator,
	wikiIndex, blogIndex, feedRelay string,
	maxFeedEvents int,
	interval, feedInterval time.Duration,
) *Rewarmer {
	return &Rewarmer{
		cache:         cache,
		feedCache:     feedCache,
		wikiService:   wikiService,
		feedService:   feedService,
		ebooksService: ebooksService,
		htmlGenerator: htmlGenerator,
		wikiIndex:     wikiIndex,
		blogIndex:     blogIndex,
		feedRelay:     feedRelay,
		maxFeedEvents: maxFeedEvents,
		interval:      interval,
		feedInterval:  feedInterval,
	}
}

// Start starts the rewarming goroutines
func (r *Rewarmer) Start(ctx context.Context) {
	// Initial population
	go r.rewarmPages(ctx)
	go r.rewarmFeed(ctx)

	// Periodic rewarming
	go r.periodicRewarmPages(ctx)
	go r.periodicRewarmFeed(ctx)
}

// rewarmPages rewarms the page cache
func (r *Rewarmer) rewarmPages(ctx context.Context) {
	logger.Info("Starting page cache rewarming...")

	// Initialize wikiPages as empty - will be populated if wiki fetch succeeds
	wikiPages := make([]generator.WikiPageInfo, 0)

	// Fetch wiki index (non-blocking - landing page can still be generated).
	// If theforest fails, leave pages as-is (don't remove existing events).
	wikiIndex, err := r.wikiService.FetchWikiIndex(ctx, r.wikiIndex)
	if err != nil {
		logger.Warnf("Error fetching wiki index from theforest: %v - keeping existing pages", err)
		// Don't update cache - leave existing pages as-is.
		// Continue to generate landing page even if wiki fetch fails.
	} else {
		// Fetch wiki events.
		// If theforest fails, leave pages as-is (don't remove existing events).
		wikiEvents, err := r.wikiService.FetchWikiEvents(ctx, wikiIndex)
		if err != nil {
			logger.Warnf("Error fetching wiki events from theforest: %v - keeping existing pages", err)
			// Don't update cache - leave existing pages as-is
		} else {
			// Build wiki page info for navigation
			wikiPages = make([]generator.WikiPageInfo, 0, len(wikiEvents))
			for _, event := range wikiEvents {
				wikiPages = append(wikiPages, generator.WikiPageInfo{
					DTag:  event.DTag,
					Title: event.Title,
				})
			}

			// Generate and cache wiki index page
			wikiIndexHTML, err := r.htmlGenerator.GenerateWikiIndexPage(wikiIndex, wikiPages, []generator.FeedItemInfo{})
			if err != nil {
				logger.Errorf("Error generating wiki index page: %v", err)
			} else {
				if err := r.cache.Set("/wiki", wikiIndexHTML); err != nil {
					logger.Errorf("Error caching wiki index page: %v", err)
				} else {
					logger.WithField("pages", len(wikiPages)).Info("Wiki index page cached successfully")
				}
			}

			// Generate and cache wiki pages
			for _, event := range wikiEvents {
				html, err := r.htmlGenerator.GenerateWikiPage(event, wikiPages, []generator.FeedItemInfo{})
				if err != nil {
					logger.WithField("dtag", event.DTag).Errorf("Error generating wiki page: %v", err)
					continue
				}
				if err := r.cache.Set("/wiki/"+event.DTag, html); err != nil {
					logger.WithField("dtag", event.DTag).Errorf("Error caching wiki page: %v", err)
				}
			}
		}
	}

	// Fetch blog index if configured (needed for landing page).
	// If theforest fails, leave pages as-is (don't remove existing events).
	var newestBlogItem *generator.BlogItemInfo
	if r.blogIndex != "" {
		blogIndex, err := r.wikiService.FetchWikiIndex(ctx, r.blogIndex)
		if err != nil {
			logger.Warnf("Error fetching blog index from theforest: %v - keeping existing pages", err)
			// Don't update cache - leave existing pages as-is
		} else {
			// Fetch blog events using the generic FetchIndexEvents function.
			// If theforest fails, leave pages as-is (don't remove existing events).
			blogKind := r.wikiService.GetBlogKind()
			blogEventList, err := r.wikiService.FetchIndexEvents(ctx, blogIndex, blogKind)
			if err != nil {
				logger.Warnf("Error fetching blog events from theforest: %v - keeping existing pages", err)
				// Don't update cache - leave existing pages as-is
			} else {
				logger.WithFields(map[string]interface{}{
					"events": len(blogEventList),
					"kind":   blogKind,
				}).Debug("Fetched blog events")
				blogItems := make([]generator.BlogItemInfo, 0, len(blogEventList))
				for _, event := range blogEventList {
					// Parse the blog event
					blog, err := nostr.ParseBlogEvent(event, blogKind)
					if err != nil {
						logger.WithField("event_id", event.ID).Warnf("Error parsing blog event: %v", err)
						continue
					}

					html, err := r.htmlGenerator.ProcessAsciiDoc(blog.Content)
					if err != nil {
						logger.WithField("dtag", blog.DTag).Warnf("Error processing blog content: %v", err)
						html = blog.Content // Fallback to raw content
					}
					blogItems = append(blogItems, generator.BlogItemInfo{
						DTag:      blog.DTag,
						Title:     blog.Title,
						Summary:   blog.Summary,
						Content:   template.HTML(html),
						Author:    event.PubKey,
						Image:     blog.Image,
						CreatedAt: int64(event.CreatedAt),
					})
				}
				logger.WithField("items", len(blogItems)).Debug("Generated blog items")

				// Get newest blog item for landing page
				if len(blogItems) > 0 {
					newestBlogItem = &blogItems[0]
				}

				// Generate blog page without feed items (feed only on landing page)
				blogHTML, err := r.htmlGenerator.GenerateBlogPage(blogIndex, blogItems, []generator.FeedItemInfo{})
				if err != nil {
					logger.Errorf("Error generating blog page: %v", err)
				} else {
					if err := r.cache.Set("/blog", blogHTML); err != nil {
						logger.Errorf("Error caching blog page: %v", err)
					} else {
						logger.WithField("items", len(blogItems)).Info("Blog page cached successfully")
					}
				}
			}
		}
	}

	// Fetch and cache articles page (longform articles) - needed for landing page.
	// If theforest fails, leave pages as-is (don't remove existing events).
	var allArticleItems []generator.ArticleItemInfo
	var newestArticleItem *generator.ArticleItemInfo
	longformKind := r.wikiService.GetLongformKind()
	if longformKind > 0 {
		articleEvents, err := r.wikiService.FetchLongformArticles(ctx, "wss://theforest.nostr1.com", longformKind, 50)
		if err != nil {
			logger.Warnf("Error fetching longform articles from theforest: %v - keeping existing pages", err)
			// Don't update cache - leave existing pages as-is
		} else {
			articleItems := make([]generator.ArticleItemInfo, 0, len(articleEvents))
			for _, event := range articleEvents {
				// Parse the longform article
				article, err := nostr.ParseLongformEvent(event, longformKind)
				if err != nil {
					logger.WithField("event_id", event.ID).Warnf("Error parsing longform article: %v", err)
					continue
				}

				// Process content using gc-parser (handles Markdown, AsciiDoc, etc.)
				result, err := r.htmlGenerator.ProcessAsciiDoc(article.Content)
				var html string
				if err != nil {
					logger.WithField("dtag", article.DTag).Warnf("Error processing content: %v", err)
					html = article.Content // Fallback to raw content
				} else {
					html = result
				}
				articleItems = append(articleItems, generator.ArticleItemInfo{
					DTag:      article.DTag,
					Title:     article.Title,
					Summary:   article.Summary,
					Content:   template.HTML(html),
					Author:    event.PubKey,
					Image:     article.Image,
					CreatedAt: int64(event.CreatedAt),
				})
			}
			logger.WithField("items", len(articleItems)).Debug("Generated article items")

			// Store all article items for landing page
			allArticleItems = articleItems

			// Get newest article item for landing page
			if len(articleItems) > 0 {
				newestArticleItem = &articleItems[0]
			}

			// Generate articles page
			articlesHTML, err := r.htmlGenerator.GenerateArticlesPage(articleItems, []generator.FeedItemInfo{})
			if err != nil {
				logger.Errorf("Error generating articles page: %v", err)
			} else {
				if err := r.cache.Set("/articles", articlesHTML); err != nil {
					logger.Errorf("Error caching articles page: %v", err)
				} else {
					logger.WithField("items", len(articleItems)).Info("Articles page cached successfully")
				}
			}
		}
	}

	// Fetch and cache e-books page (needed for landing page).
	// If theforest fails, leave pages as-is (don't remove existing events).
	var allEBooks []generator.EBookInfo
	if r.ebooksService != nil {
		ebooks, err := r.ebooksService.FetchTopLevelIndexEvents(ctx)
		if err != nil {
			logger.Warnf("Error fetching e-books from theforest: %v - keeping existing pages", err)
			// Don't update cache - leave existing pages as-is
		} else {
			// Convert to generator.EBookInfo
			generatorEBooks := make([]generator.EBookInfo, 0, len(ebooks))
			for _, ebook := range ebooks {
				generatorEBooks = append(generatorEBooks, generator.EBookInfo{
					EventID:   ebook.EventID,
					Title:     ebook.Title,
					DTag:      ebook.DTag,
					Author:    ebook.Author,
					Summary:   ebook.Summary,
					Image:     ebook.Image,
					Type:      ebook.Type,
					CreatedAt: ebook.CreatedAt,
					Naddr:     ebook.Naddr,
				})
			}

			// Store all e-books for landing page
			allEBooks = generatorEBooks

			ebooksHTML, err := r.htmlGenerator.GenerateEBooksPage(generatorEBooks, []generator.FeedItemInfo{})
			if err != nil {
				logger.Errorf("Error generating e-books page: %v", err)
			} else {
				if err := r.cache.Set("/ebooks", ebooksHTML); err != nil {
					logger.Errorf("Error caching e-books page: %v", err)
				} else {
					logger.WithField("ebooks", len(generatorEBooks)).Info("E-books page cached successfully")
				}
			}
		}
	}

	// Always generate landing page AFTER blog, articles, and e-books are fetched and cached.
	// Now we have all the data needed for the landing page.
	landingHTML, err := r.htmlGenerator.GenerateLandingPage(wikiPages, newestBlogItem, newestArticleItem, allArticleItems, allEBooks)
	if err != nil {
		logger.Errorf("Error generating landing page: %v", err)
	} else {
		if err := r.cache.Set("/", landingHTML); err != nil {
			logger.Errorf("Error caching landing page: %v", err)
		} else {
			logger.WithField("pages", len(wikiPages)).Info("Landing page cached successfully")
		}
	}

	// Generate and cache Feed page (using feed items from cache)
	feedItems := r.convertFeedItemsToInfo(r.feedCache.Get())
	feedHTML, err := r.htmlGenerator.GenerateFeedPage(feedItems)
	if err != nil {
		logger.Errorf("Error generating feed page: %v", err)
	} else {
		if err := r.cache.Set("/feed", feedHTML); err != nil {
			logger.Errorf("Error caching feed page: %v", err)
		} else {
			logger.WithField("items", len(feedItems)).Info("Feed page cached successfully")
		}
	}

	logger.Info("Page cache rewarming completed")
}

// rewarmFeed rewarms the feed cache
func (r *Rewarmer) rewarmFeed(ctx context.Context) {
	logger.WithFields(map[string]interface{}{
		"relay":      r.feedRelay,
		"max_events": r.maxFeedEvents,
	}).Info("Starting feed cache rewarming")

	nostrItems, err := r.feedService.FetchFeedItems(ctx, r.feedRelay, r.maxFeedEvents)
	if err != nil {
		logger.WithField("relay", r.feedRelay).Warnf("Error fetching feed: %v", err)
		// Don't clear the cache on error - keep old items
		return
	}

	if len(nostrItems) == 0 {
		logger.WithField("relay", r.feedRelay).Warn("No feed items fetched")
		// Don't clear the cache - keep old items
		return
	}

	// Convert nostr.FeedItem to cache.FeedItem
	items := make([]FeedItem, 0, len(nostrItems))
	for _, item := range nostrItems {
		items = append(items, FeedItem{
			EventID: item.EventID,
			Author:  item.Author,
			Content: item.Content,
			Time:    item.Time,
			Link:    item.Link,
			Title:   item.Title,
			Summary: item.Summary,
			Image:   item.Image,
		})
	}

	r.feedCache.Set(items)
	logger.WithFields(map[string]interface{}{
		"items": len(items),
		"relay": r.feedRelay,
	}).Info("Feed cache rewarmed successfully")
}

// periodicRewarmPages periodically rewarms pages
func (r *Rewarmer) periodicRewarmPages(ctx context.Context) {
	ticker := time.NewTicker(r.interval)
	defer ticker.Stop()

	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			r.rewarmPages(ctx)
		}
	}
}

// convertFeedItemsToInfo converts cache.FeedItem to generator.FeedItemInfo
func (r *Rewarmer) convertFeedItemsToInfo(items []FeedItem) []generator.FeedItemInfo {
	feedItems := make([]generator.FeedItemInfo, 0, len(items))
	for _, item := range items {
		feedItems = append(feedItems, generator.FeedItemInfo{
			EventID: item.EventID,
			Author:  item.Author,
			Content: item.Content,
			Time:    item.Time.Format("2006-01-02 15:04:05"),
			TimeISO: item.Time.Format(time.RFC3339),
			Link:    item.Link,
		})
	}
	return feedItems
}

// periodicRewarmFeed periodically rewarms feed
func (r *Rewarmer) periodicRewarmFeed(ctx context.Context) {
	ticker := time.NewTicker(r.feedInterval)
	defer ticker.Stop()

	for {
		select {
		case <-ctx.Done():
			return
		case <-ticker.C:
			r.rewarmFeed(ctx)
		}
	}
}
@ -1,8 +1,6 @@
 {
   "dependencies": {
-    "gc-parser": "git+https://git.imwald.eu/silberengel/gc-parser.git"
-  },
-  "devDependencies": {
-    "highlight.js": "^11.11.1"
+    "@asciidoctor/core": "^3.0.4",
+    "marked": "^12.0.0"
   }
 }
@ -1,45 +0,0 @@
#!/usr/bin/env node
/**
 * Wrapper script to process content using gc-parser
 * Called from Go code via exec
 */

const { Parser } = require('gc-parser');

// Read content from stdin
let content = '';
process.stdin.setEncoding('utf8');

process.stdin.on('data', (chunk) => {
  content += chunk;
});

process.stdin.on('end', async () => {
  try {
    // Parse options from environment or command line args
    const linkBaseURL = process.env.LINK_BASE_URL || process.argv[2] || '';

    // Create parser with options
    const parser = new Parser({
      linkBaseURL: linkBaseURL,
      enableAsciiDoc: true,
      enableMarkdown: true,
      enableCodeHighlighting: true,
      enableLaTeX: true,
      enableMusicalNotation: true,
      enableNostrAddresses: true,
    });

    // Process content
    const result = await parser.process(content);

    // Output as JSON
    console.log(JSON.stringify(result));
  } catch (error) {
    console.error(JSON.stringify({
      error: error.message,
      stack: error.stack,
    }));
    process.exit(1);
  }
});
@ -1,10 +0,0 @@
pre code.hljs{display:block;overflow-x:auto;padding:1em}code.hljs{padding:3px 5px}/*!
Theme: GitHub Dark
Description: Dark theme as seen on github.com
Author: github.com
Maintainer: @Hirse
Updated: 2021-05-15

Outdated base version: https://github.com/primer/github-syntax-dark
Current colors taken from GitHub's CSS
*/.hljs{color:#c9d1d9;background:#0d1117}.hljs-doctag,.hljs-keyword,.hljs-meta .hljs-keyword,.hljs-template-tag,.hljs-template-variable,.hljs-type,.hljs-variable.language_{color:#ff7b72}.hljs-title,.hljs-title.class_,.hljs-title.class_.inherited__,.hljs-title.function_{color:#d2a8ff}.hljs-attr,.hljs-attribute,.hljs-literal,.hljs-meta,.hljs-number,.hljs-operator,.hljs-selector-attr,.hljs-selector-class,.hljs-selector-id,.hljs-variable{color:#79c0ff}.hljs-meta .hljs-string,.hljs-regexp,.hljs-string{color:#a5d6ff}.hljs-built_in,.hljs-symbol{color:#ffa657}.hljs-code,.hljs-comment,.hljs-formula{color:#8b949e}.hljs-name,.hljs-quote,.hljs-selector-pseudo,.hljs-selector-tag{color:#7ee787}.hljs-subst{color:#c9d1d9}.hljs-section{color:#1f6feb;font-weight:700}.hljs-bullet{color:#f2cc60}.hljs-emphasis{color:#c9d1d9;font-style:italic}.hljs-strong{color:#c9d1d9;font-weight:700}.hljs-addition{color:#aff5b4;background-color:#033a16}.hljs-deletion{color:#ffdcd7;background-color:#67060c}
File diff suppressed because one or more lines are too long
@ -1,68 +0,0 @@
{{define "content"}}
<article class="events-page">
  <section class="hero">
    <div class="hero-content">
      <div class="hero-text">
        <h1>Events: {{.DTag}}</h1>
        <p class="lead">Found {{len .EventCards}} event{{if ne (len .EventCards) 1}}s{{end}} with this d-tag</p>
      </div>
    </div>
  </section>

  <section class="events-grid">
    {{if .EventCards}}
    <div class="events-container">
      {{range .EventCards}}
      <div class="event-card">
        <a href="{{.URL}}" class="event-card-link">
          {{if and .Image (ne .Image "")}}
          <div class="event-card-image">
            <img src="{{.Image}}" alt="{{.Title}}" />
          </div>
          {{end}}
          <div class="event-card-content">
            <div class="event-card-header">
              <h3 class="event-card-title">{{.Title}}</h3>
              <span class="event-card-kind">
                {{if eq .Kind 30818}}
                <span class="icon-inline">{{icon "book-open"}}</span> Wiki
                {{else if eq .Kind 30041}}
                <span class="icon-inline">{{icon "file-text"}}</span> Blog
                {{else if eq .Kind 30023}}
                <span class="icon-inline">{{icon "file-text"}}</span> Article
                {{else}}
                <span class="icon-inline">{{icon "file"}}</span> Kind {{.Kind}}
                {{end}}
              </span>
            </div>
            {{if .Summary}}
            <p class="event-card-summary">{{truncate .Summary 200}}</p>
            {{end}}
            <div class="event-card-meta">
              {{if .Author}}
              <span class="event-card-author">
                <span class="icon-inline">{{icon "user"}}</span>
                {{template "user-badge-simple" (dict "Pubkey" .Author "Profiles" $.Profiles)}}
              </span>
              {{end}}
              {{if .Time}}
              <span class="event-card-date">
                <span class="icon-inline">{{icon "clock"}}</span> {{.Time}}
              </span>
              {{end}}
            </div>
          </div>
        </a>
      </div>
      {{end}}
    </div>
    {{else}}
    <div class="empty-state">
      <p><span class="icon-inline">{{icon "inbox"}}</span> No events found with d-tag: {{.DTag}}</p>
    </div>
    {{end}}
  </section>
</article>
{{end}}

{{/* Feed is defined in components.html */}}