The Great Astro Migration

This commit is contained in:
Ben Edgington
2025-05-23 19:47:23 +01:00
parent 95713da808
commit 27542ec898
75 changed files with 5059 additions and 18932 deletions

.gitignore vendored

@@ -13,12 +13,9 @@ src/charts/font/*
# Files generated during the build process
src/md/pages/
src/md/annotated.md
index.json
public/
src/cache/
# Junk
node_modules/
.cache/
tmp/
test*
*.pdf
@@ -26,3 +23,25 @@ test*
# IntelliJ configs
.idea/
*.iml
# build output
dist/
# generated types
.astro/
# dependencies
node_modules/
# logs
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
# environment variables
.env
.env.production
# macOS-specific files
.DS_Store


@@ -20,23 +20,17 @@ Kindly note that [British spelling](https://www.oxfordinternationalenglish.com/d
## Installing
As of May 2025, I have migrated the entire build from Gatsby to [Astro](https://astro.build/). Please let me know if you spot any issues!
### Pre-requisites
Install `node`, `npm`, and `gatsby-cli`. These are my versions:
Install `node` and `npm`. These are my versions:
```
> node --version
v22.14.0
> npm --version
11.1.0
> gatsby --version
Gatsby CLI version: 5.14.0
```
`gatsby-cli` can be installed with,
```
npm install -g gatsby-cli
11.4.1
```
You'll also need a working `perl` installed at _/usr/bin/perl_ so that the build can preprocess the book document.
@@ -48,10 +42,10 @@ I've implemented a heap of pre-build checks for linting and spelling issues. You
To cause Git commits to halt when these checks fail, add the following symlink:
```
ln -s bin/util/git-pre-commit-hook.sh .git/hooks/pre-commit
(cd .git/hooks; ln -s ../../bin/util/git-pre-commit-hook.sh pre-commit)
```
The controlling script for the checks is _bin/build/prebuild.mjs_. You can enable and disable specific checks there.
The controlling script for the checks is _bin/build/prebuild.js_. You can enable and disable specific checks there.
If the $\LaTeX$ linting fails, you might need to install the following, or just disable that check.
@@ -65,7 +59,7 @@ Clone this repo. `cd` into it, then:
```
npm install
gatsby build --prefix-paths
npm run build
```
### Viewing
@@ -73,28 +67,28 @@ gatsby build --prefix-paths
After building as above, do
```
gatsby serve --prefix-paths
npm run serve
```
and visit http://localhost:9000/main in a web browser.
Astro will tell you where it is serving the content (somewhere like http://localhost:4321/capella).
Instead of building and serving, you can run `gatsby develop` and point your browser at port 8000. This will not pick up real-time changes to _src/book.md_ and will need to be restarted to pick up changes. It is useful, though, for checking CSS and React changes interactively.
Instead of building and serving, you can run `npm run devel` and visit the link Astro shows. This will not pick up real-time changes to _src/book.md_ and will need to be restarted to see them. It is useful, though, for checking CSS and other component changes interactively.
## Workflow
The entire text for the book is in the _src/book.md_ file. Everything under _src/md/pages_ is auto-generated and any changes there will be lost.
There are various npm script commands to help with building and testing:
There are various npm script commands to help with building and testing. See `package.json` for the full list.
- `npm run clean` runs `gatsby clean`.
- Do this after adding new graphics or if anything weird happens.
- `npm run clean` deletes the output directory (`dist/`) and the Astro cache.
- I recommend doing this often. Astro caches aggressively and will often skip things like rebuilding the search index.
- `npm run check` runs a bunch of custom linting and checking, controlled by the _bin/build/prebuild.js_ script.
- Check all links to internal anchors, image files, and footnotes.
- Spell check. Add any exceptions to _src/spellings.txt_
- Spell check. Add any exceptions to _src/spellings.en.pws_ (or use `npm run spfix`).
- Markdown linting on both the original source and the generated pages.
- `npm run build` runs `gatsby build --prefix-paths`.
- `npm run serve` runs `gatsby serve --prefix-paths`.
- Visit http://localhost:9000/main/ to see the result.
- HTML checks and LaTeX expression linting.
- `npm run build` runs `astro build`.
- `npm run serve` runs `astro preview`.
- `npm run links` checks external links.
- Checking links to GitHub will fail due to rate-limiting unless you supply GitHub credentials.
- `npm run spell` runs a spell check.
@@ -102,6 +96,8 @@ There are various npm script commands to help with building and testing:
- `npm run valid` submits a page to the [W3C markup validation service](https://validator.w3.org/) and lists any issues above `info` level.
- `npm run pdfit` creates a PDF of the whole thing. See the [README](bin/pdf/README.md).
- `npm run stats` shows some stats about the book. Build the PDF first to get the full set.
- `npm run debug` builds with debugging output for my integrations.
- `npm run minim` does a minimal build with only a couple of pages. See `src/content.config.js`.
## How to

astro.config.mjs Normal file

@@ -0,0 +1,46 @@
// @ts-check
import { defineConfig } from 'astro/config';
import { Metadata, SearchOptions } from './src/include/SiteConfig.js';
import remarkMath from 'remark-math';
import rehypeKatex from 'rehype-katex';
import rehypeRaw from 'rehype-raw';
import Prism from 'prismjs';
// Custom integrations
import myBuildChecks from './integrations/my_build_checks';
import myAutolinkHeadings from './integrations/my_autolink_headings';
import mySvgInline from './integrations/my_svg_inline';
import mySearchIndex from './integrations/my_search_index';
import myAddTooltips from './integrations/my_add_tooltips';
import myFixupLinks from './integrations/my_fixup_links';
import myCleanupHtml from './integrations/my_cleanup_html';
import myHtaccess from './integrations/my_htaccess';
Prism.languages.none = Prism.languages.text;
Prism.languages.code = Prism.languages.text;
export default defineConfig({
base: `/` + Metadata.version,
integrations: [
myBuildChecks(),
myAutolinkHeadings(),
mySvgInline({ filePath: 'src/', cachePath: 'src/cache/' }),
mySearchIndex(SearchOptions),
myAddTooltips({ constantsFile: 'src/include/constants.json' }),
myFixupLinks(),
myCleanupHtml(),
myHtaccess(`/` + Metadata.version),
],
markdown: {
syntaxHighlight: 'prism',
smartypants: false,
remarkRehype: { clobberPrefix: '' }, // This is ok as we trust the markdown
remarkPlugins: [
remarkMath,
],
rehypePlugins: [
rehypeRaw, // Insert HTML embedded in MD files into the AST rather than as raw strings
[rehypeKatex, {}],
],
}
});


@@ -9,7 +9,7 @@
#
# Relative page paths are not supported.
#
# Anchors generated from headings have some rules (imposed by Gatsby):
# Anchors generated from headings have some rules. See integrations/my_autolink_headings.js
# - Converted to lower case
# - Spaces become "-"
# - Special characters are omitted: ".,?:'`/[]()" and probably others
@@ -57,10 +57,10 @@ while(<$fh>) {
# Add headings
if (/^#+ (.*)$/) {
my $name = $1 =~ s/ <!-- .* -->$//r;
my $name = $1 =~ s/\s+<!-- .* -->$//r;
$name = lc $name;
$name =~ tr/ /-/;
$name =~ tr/a-z0-9_-//cd;
$name =~ s/\s+/-/g;
$name =~ s/[^a-z0-9_-]//g;
$anchors{$pagePath . '#' . $name} = 1;
}


@@ -15,7 +15,7 @@ my $inPart3 = 0;
print
"---",
"path: /annotated-spec/",
"titles: [\"One Page Annotated Spec\",\"\",\"\"]",
"titles: [\"One Page Annotated Spec\"]",
"index: [999]",
"sequence: 990",
"---";


@@ -1,7 +1,7 @@
import { execSync } from "child_process";
import glob from "glob";
import { lintSourceMarkdown } from "./checks/lint_source_md.mjs"
import { lintSplitMarkdown } from "./checks/lint_split_md.mjs"
import { execSync } from 'child_process';
import { glob } from 'glob';
import { lintSourceMarkdown } from './checks/lint_source_md.mjs';
import { lintSplitMarkdown } from './checks/lint_split_md.mjs';
// Performs the following prebuild tasks:
// - Checks that internal document links look ok
@@ -14,25 +14,25 @@ import { lintSplitMarkdown } from "./checks/lint_split_md.mjs"
// - Splits the source markdown into individual pages
// - Lints the split markdown
const doInternalLinks = true
const doHtmlCheck = true
const doSpellCheck = true
const doRepeatCheck = true
const doWhitespaceCheck = true
const doLatexCheck = true
const doSourceLint = true
const doSplitLint = true
const doInternalLinks = true;
const doHtmlCheck = true;
const doSpellCheck = true;
const doRepeatCheck = true;
const doWhitespaceCheck = true;
const doLatexCheck = true;
const doSourceLint = true;
const doSplitLint = true;
const linkChecker = 'bin/build/checks/links.pl'
const htmlChecker = 'bin/build/checks/html.pl'
const spellChecker = 'bin/build/checks/spellcheck.sh'
const repeatChecker = 'bin/build/checks/repeats.sh'
const whitespaceChecker = 'bin/build/checks/whitespace.pl'
const latexChecker = 'bin/build/checks/latex.pl'
const mdSplitter = 'bin/build/process_markdown.sh'
const linkChecker = 'bin/build/checks/links.pl';
const htmlChecker = 'bin/build/checks/html.pl';
const spellChecker = 'bin/build/checks/spellcheck.sh';
const repeatChecker = 'bin/build/checks/repeats.sh';
const whitespaceChecker = 'bin/build/checks/whitespace.pl';
const latexChecker = 'bin/build/checks/latex.pl';
const mdSplitter = 'bin/build/process_markdown.sh';
const sourceMarkdown = 'src/book.md'
const ourSpellings = 'src/spellings.en.pws'
const sourceMarkdown = 'src/book.md';
const ourSpellings = 'src/spellings.en.pws';
const customReporter = {
// https://tintin.mudhalla.net/info/xterm/
@@ -40,40 +40,40 @@ const customReporter = {
info: function (m) { console.log('\x1b[38;5;19m%s\x1b[0m %s', 'info', m) },
warn: function (m) { console.log('\x1b[38;5;130m%s\x1b[0m %s', 'warn', m) },
error: function (m) { console.log('\x1b[38;5;160m%s\x1b[0m %s', 'error', m) },
}
};
function printLines(s, reporter) {
s.split(/\r?\n/).forEach((line, i) => line && reporter(line))
s.split(/\r?\n/).forEach((line, i) => line && reporter.warn(line));
}
function runCheck(
enabled, checker, infoMessage, failMessage, errorMessage, skipMessage, reporter
) {
let success = true
let success = true;
if (enabled) {
reporter.info(infoMessage)
reporter.info(infoMessage);
try {
const out = checker()
const out = checker();
if (out !== '' && out !== null) {
reporter.warn(failMessage)
printLines(out, reporter.warn)
success = false
reporter.warn(failMessage);
printLines(out, reporter);
success = false;
}
} catch (err) {
reporter.warn(errorMessage)
printLines(err.toString(), reporter.warn)
success = false
reporter.warn(errorMessage);
printLines(err.toString(), reporter);
success = false;
}
} else {
reporter.warn(skipMessage)
reporter.warn(skipMessage);
}
return success
return success;
}
// Set `exitToShell` to false to continue processing after running checks (e.g. while building)
export const runChecks = (reporter = customReporter, exitToShell = true) => {
export default function runChecks(reporter = customReporter, exitToShell = true) {
var allOk = true
var allOk = true;
allOk &= runCheck(
doInternalLinks,
@@ -83,7 +83,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to check internal links:',
'Skipping internal link check',
reporter
)
);
allOk &= runCheck(
doHtmlCheck,
@@ -93,7 +93,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to check HTML:',
'Skipping HTML check',
reporter
)
);
allOk &= runCheck(
doSpellCheck,
@@ -103,7 +103,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to perform spellcheck:',
'Skipping spellcheck',
reporter
)
);
allOk &= runCheck(
doRepeatCheck,
@@ -113,7 +113,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to perform repeat check:',
'Skipping repeat check',
reporter
)
);
allOk &= runCheck(
doWhitespaceCheck,
@@ -123,7 +123,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to perform whitespace check:',
'Skipping whitespace check',
reporter
)
);
allOk &= runCheck(
doLatexCheck,
@@ -133,7 +133,7 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to perform LaTeX check:',
'Skipping LaTeX check',
reporter
)
);
let sourceLintSucceeded = runCheck(
doSourceLint,
@@ -143,15 +143,15 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to lint check source markdown:',
'Skipping source markdown linting',
reporter
)
allOk &= sourceLintSucceeded
);
allOk &= sourceLintSucceeded;
reporter.info('Unpacking book source...')
reporter.info('Unpacking book source...');
try {
execSync(`${mdSplitter} ${sourceMarkdown}`)
execSync(`${mdSplitter} ${sourceMarkdown}`);
} catch (err) {
reporter.error('Failed to unpack book source.')
throw err
reporter.error('Failed to unpack book source.');
throw err;
}
if (sourceLintSucceeded) {
@@ -163,12 +163,12 @@ export const runChecks = (reporter = customReporter, exitToShell = true) => {
'Unable to lint check split markdown:',
'Skipping split markdown linting',
reporter
)
);
} else {
reporter.warn('Skipping split markdown linting due to earlier errors')
reporter.warn('Skipping split markdown linting due to earlier errors');
}
if (exitToShell) {
process.exit(allOk ? 0 : 2)
process.exit(allOk ? 0 : 2);
}
}


@@ -82,11 +82,12 @@ while (<>) {
die "Internal error: can't determine heading level.";
}
my $titles = '"'.join('","', grep($_, ($thisPart, $thisChapter, $thisSection))).'"';
print $ofh
"---",
"hide: $hide",
"path: $path",
"titles: [\"$thisPart\",\"$thisChapter\",\"$thisSection\"]",
"titles: [$titles]",
"index: [$idx]",
"sequence: $sequence",
"---";


@@ -1,10 +1,16 @@
#!/bin/sh
# Run the checks only if file `book.md` is staged
if git diff --exit-code -s --staged src/book.md
then
exit 0
fi
# Run the pre-build checks on the book source
node --input-type=module -e 'import { runChecks } from "./bin/build/prebuild.mjs"; runChecks()'
node --input-type=module -e 'import runChecks from "./bin/build/prebuild.js"; runChecks()'
if [ "$?" != "0" ]
then
echo -e "\nError: Not committing due to failed checks.\n" >&2
echo "\nError: Not committing due to failed checks.\n" >&2
exit 1
fi


@@ -22,32 +22,26 @@ cd $(dirname "$0")/../..
# Set the host variable
source bin/priv/server.sh
echo
echo "*** Patching node modules ***"
npx patch-package --error-on-fail
was_it_ok $? "patch-package"
echo
echo "*** Building site..."
gatsby clean
gatsby build --prefix-paths
was_it_ok $? "gatsby build"
npm run clean
npm run build
was_it_ok $? "npm run build"
echo
echo "*** Building PDF..."
bin/pdf/make_pdf src/book.md
was_it_ok $? "make_pdf"
mv book.pdf public/
mv book.pdf dist/
echo
echo "*** Ready to upload - press [ENTER] to continue"
wait_for_input
tar zcf - public | ssh $host tar zxfC - eth2book
tar zcf - dist | ssh $host tar zxfC - eth2book
echo
echo "*** Ready to install - press [ENTER] to continue"
wait_for_input
ssh $host eth2book/install.sh $version
ssh $host eth2book/install_astro.sh $version


@@ -1,7 +1,7 @@
const axios = require('axios')
const fs=require('fs')
import axios from 'axios';
import fs from 'fs';
module.exports.validateHtml = (fileName) => {
export default function validateHtml(fileName) {
const file = fs.readFileSync(fileName);


@@ -1,128 +0,0 @@
const execSync = require('child_process').execSync;
function getGitHash() {
try {
return execSync('git log -1 --format="%h" 2>/dev/null', {encoding: 'utf8'}).replace(/(\r\n|\n|\r)/, '')
} catch(e) {
return 'unknown'
}
}
function getGitBranch() {
try {
return execSync('git branch --show-current 2>/dev/null', {encoding: 'utf8'}).replace(/(\r\n|\n|\r)/, '')
} catch(e) {
return 'unknown'
}
}
const date = new Date().toISOString().substr(0, 16).replace('T', ' ') + ' UTC';
const version = getGitBranch();
const hostname = 'https://eth2book.info';
const canonical = hostname + '/latest';
module.exports = {
siteMetadata: {
title: 'Upgrading Ethereum',
description: 'A technical handbook on Ethereum\'s move to proof of stake and beyond',
author: 'Ben Edgington',
gitHash: getGitHash(),
gitUrl: 'https://github.com/benjaminion/upgrading-ethereum-book',
date: date,
licenceUrl: 'https://creativecommons.org/licenses/by-sa/4.0/',
licence: 'CC BY-SA 4.0',
hostname: hostname,
version: version,
canonical: canonical,
},
pathPrefix: '/' + version,
trailingSlash: 'always',
plugins: [
{
resolve: 'gatsby-source-filesystem',
options: {
name: 'markdown-pages',
path: `${__dirname}/src/md`,
},
},
{
resolve: 'gatsby-transformer-remark',
options: {
gfm: true,
plugins: [
{
resolve: 'gatsby-remark-autolink-headers',
options: {
elements: ['h3', 'h4', 'h5', 'h6'],
},
},
'gatsby-remark-numbered-footnotes',
'gatsby-remark-katex',
{
resolve: 'my-tooltips',
options: {
file: `${__dirname}/src/constants.json`,
}
},
{
resolve: 'my-svg-embed',
options: {
directory: `${__dirname}/src/`,
}
},
'my-strip-html-comments',
{
resolve: 'gatsby-remark-prismjs',
options: {
noInlineHighlight: true,
aliases: {code: 'text'},
},
},
],
},
},
'gatsby-plugin-catch-links',
{
resolve: 'gatsby-plugin-htaccess',
options: {
ErrorDocument: 'ErrorDocument 404 /' + version + '/404.html',
},
},
{
resolve: 'gatsby-plugin-matomo',
options: {
siteId: '1',
matomoUrl: hostname + '/matomo',
siteUrl: hostname + '/' + version,
matomoPhpScript: 'matomo.php',
matomoJsScript: 'matomo.js',
disableCookies: true,
},
},
{
resolve: 'my-search-index',
options: {
enabled: true,
// Matching elements have their text added to the index. First match wins.
chunkTypes: [
{query: 'figcaption', label: 'Figure caption'},
{query: '[id^="fn-"]', label: 'Footnote'},
{query: 'li', label: 'List item'},
{query: 'pre', label: 'Code'},
{query: 'table', label: 'Table'},
{query: 'h3, h4, h5, h6', label: 'Heading', weight: 5},
{query: 'p', label: 'Paragraph'},
],
exclude: {
// Note, only pages under src/md/pages have a "hide" property.
frontmatter: [{hide: true}, {hide: null}],
// The frontmatter filter takes care of excluding a good set of pages for now.
pages: [],
// Elements matching this query are ignored completely, including their text:
ignore: 'svg, details, mtable, mrow, [aria-hidden="true"], .footnote-ref',
}
},
}
],
flags: {},
};


@@ -1,44 +0,0 @@
const path = require('path')
// Set up a hook to pre-process the source file into split files, and perform various
// checking and linting operations prior to building.
exports.onPreInit = ({ reporter }) => {
try {
(async () => {
const {runChecks: runChecks } = await import("./bin/build/prebuild.mjs")
runChecks(reporter, false)
})()
} catch (err) {
reporter.panicOnBuild('Could not run pre-build tasks,', err)
}
}
exports.createPages = async ({ actions, graphql }) => {
const { createPage } = actions
const pageTemplate = path.resolve(`src/templates/pageTemplate.js`)
const result = await graphql(`
{
allMarkdownRemark {
edges {
node {
frontmatter {
path
}
}
}
}
}
`)
if (result.errors) {
reporter.panicOnBuild(`Error while running GraphQL query.`)
}
result.data.allMarkdownRemark.edges.forEach(({ node }) => {
createPage({
path: node.frontmatter.path,
component: pageTemplate,
})
})
}


@@ -0,0 +1,54 @@
import fs from 'fs';
import { visit } from 'unist-util-visit';
// Add a tooltip to constant values in the text according to the mapping in the
// supplied file.
let constantsMap = {};
function addTooltips() {
return function(tree) {
try {
visit(tree, 'inlineCode', (node, index, parent) => {
// HTML in headings causes problems for the page index, so skip these
if (parent.type !== 'heading') {
const text = node.value;
const value = constantsMap[text];
if (value) {
node.type = 'html';
node.value = `<code title="${text} = ${value}">${text}</code>`;
node.children = undefined;
}
}
})
} catch (err) {
console.error(err);
}
}
}
export default function(options) {
// Read the constants file and store it for later
const constantsFile = options?.constantsFile || '';
try {
constantsMap = JSON.parse(fs.readFileSync(constantsFile, 'utf8'));
} catch (err) {
console.log(err);
}
return {
name: 'myAddTooltips',
hooks: {
'astro:config:setup': ({ updateConfig }) => {
updateConfig({
markdown: {
remarkPlugins: [
addTooltips,
],
},
});
},
},
};
}
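The node rewrite above can be seen in isolation. A hypothetical standalone restatement of the same transform, applied to a single mdast `inlineCode` node (the constants map entry here is illustrative, not taken from `src/include/constants.json`):

```javascript
// Sketch of the tooltip rewrite: given a constants map, an inlineCode node
// becomes a raw HTML node whose <code> element carries a title attribute.
const constants = { MAX_COMMITTEES_PER_SLOT: '64' }; // hypothetical entry

function tooltipify(node) {
  const value = constants[node.value];
  if (value) {
    return {
      type: 'html',
      value: `<code title="${node.value} = ${value}">${node.value}</code>`,
    };
  }
  return node; // unknown constants pass through unchanged
}

const out = tooltipify({ type: 'inlineCode', value: 'MAX_COMMITTEES_PER_SLOT' });
console.log(out.value);
// <code title="MAX_COMMITTEES_PER_SLOT = 64">MAX_COMMITTEES_PER_SLOT</code>
```

Skipping heading parents, as the integration does, avoids injecting raw HTML into the page index.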


@@ -0,0 +1,63 @@
import { CONTINUE, SKIP, visit } from 'unist-util-visit';
import { fromHtmlIsomorphic } from 'hast-util-from-html-isomorphic';
import { toString } from 'hast-util-to-string';
// Add IDs and SVG permalinks to headings h3 to h6
// (rehype-autolink-headings is good, but can't be configured to ignore h1 and h2)
const anchor = fromHtmlIsomorphic('<a class="anchor" ariaHidden="true"><svg aria-hidden="true" focusable="false" height="16" version="1.1" viewBox="0 0 16 16" width="16"><path fill-rule="evenodd" d="M4 9h1v1H4c-1.5 0-3-1.69-3-3.5S2.55 3 4 3h4c1.45 0 3 1.69 3 3.5 0 1.41-.91 2.72-2 3.25V8.59c.58-.45 1-1.27 1-2.09C10 5.22 8.98 4 8 4H4c-.98 0-2 1.22-2 2.5S3 9 4 9zm9-3h-1v1h1c1 0 2 1.22 2 2.5S13.98 12 13 12H9c-.98 0-2-1.22-2-2.5 0-.83.42-1.64 1-2.09V6.25c-1.09.53-2 1.84-2 3.25C6 11.31 7.55 13 9 13h4c1.45 0 3-1.69 3-3.5S14.5 6 13 6z"></path></svg></a>', {fragment: true}).children[0];
// The headings to process
const headings = ['h3', 'h4', 'h5', 'h6'];
// Should match the method in bin/build/checks/links.pl
function slugIt(heading) {
return (
toString(heading)
.trim()
.toLowerCase()
.replace(/\s+/g, '-')
.replace(/[^a-z0-9_-]/g, '')
);
}
function autolinkHeadings() {
return function(tree) {
try {
visit(tree, 'element', node => {
if (headings.indexOf(node.tagName) === -1) {
return CONTINUE;
}
const newAnchor = structuredClone(anchor);
if (node.properties.id) {
newAnchor.properties = { ...newAnchor.properties, href: '#' + node.properties.id };
} else {
const id = slugIt(node);
newAnchor.properties = { ...newAnchor.properties, href: '#' + id };
node.properties.id = id;
}
node.children = [ newAnchor ].concat(node.children);
return SKIP;
})
} catch (err) {
console.error(err);
}
}
}
export default function() {
return {
name: 'myAutolinkHeadings',
hooks: {
'astro:config:setup': ({ updateConfig }) => {
updateConfig({
markdown: {
rehypePlugins: [
autolinkHeadings,
],
},
});
},
},
};
}
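As a worked example of the slug rules, the string pipeline in `slugIt` behaves as below. This is a standalone sketch of the same regex steps, with illustrative heading texts:

```javascript
// Standalone sketch of the slug pipeline: lowercase, collapse whitespace
// runs to "-", then drop any character outside [a-z0-9_-].
function slug(text) {
  return text
    .trim()
    .toLowerCase()
    .replace(/\s+/g, '-')
    .replace(/[^a-z0-9_-]/g, '');
}

console.log(slug('Fork Choice (Gasper)')); // fork-choice-gasper
console.log(slug("What's  an SSZ List?")); // whats-an-ssz-list
```

The same transformation is reimplemented in Perl in `bin/build/checks/links.pl`, so the two must stay in sync for the internal link check to be meaningful.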


@@ -0,0 +1,23 @@
import runChecks from '../bin/build/prebuild.js';
function buildChecks(logger) {
logger.info('Running pre-build checks:');
runChecks(logger, false);
}
export default function() {
let doChecks;
return {
name: 'myBuildChecks',
hooks: {
'astro:config:setup': ({ command }) => {
doChecks = (command === 'build');
},
'astro:config:done': ({ logger }) => {
if (doChecks) {
buildChecks(logger);
}
},
},
};
}


@@ -0,0 +1,44 @@
import { visit, SKIP } from 'unist-util-visit';
// Clean up any weird HTML artefacts, especially those that fail validation
function cleanupHtml() {
return function(tree) {
try {
// Remove `is:raw=""` that's on `code` elements, probably from Prism.
visit(tree, 'element', node => {
if (node.tagName == 'code'
&& node.properties['is:raw'] !== undefined) {
delete(node.properties['is:raw']);
}
});
// Remove any comments
visit(tree, 'comment', (node, index, parent) => {
parent.children.splice(index, 1);
return SKIP;
});
} catch (err) {
console.error(err);
}
}
}
export default function() {
return {
name: 'myCleanupHtml',
hooks: {
'astro:config:setup': ({ updateConfig }) => {
updateConfig({
markdown: {
rehypePlugins: [
cleanupHtml,
],
},
});
},
},
};
}


@@ -0,0 +1,50 @@
import { visit } from 'unist-util-visit';
// Prepend `base` to URLs in the markdown file.
// It seems that [Astro does not do this](https://github.com/withastro/astro/issues/3626)
function fixupLinks(basePath) {
return function(tree) {
try {
visit(tree, 'element', node => {
if (node.tagName == 'a'
&& node.properties.href) {
// Add basePath prefix to local URLs that lack it
if(node.properties.href.startsWith('/')
&& !node.properties.href.startsWith(basePath + '/')) {
node.properties.href = basePath + node.properties.href;
}
// Add rel="external noopener" and target="_blank" attributes to off-site links
if(!node.properties.href.startsWith('/')
&& !node.properties.href.startsWith('#')) {
node.properties.rel = ['external', 'noopener'];
node.properties.target = '_blank';
}
}
})
} catch (err) {
console.error(err);
}
}
}
export default function() {
return {
name: 'myFixupLinks',
hooks: {
'astro:config:setup': ({ config, updateConfig }) => {
updateConfig({
markdown: {
rehypePlugins: [
[fixupLinks, config.base],
],
},
});
},
},
};
}
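The two URL rules can be exercised independently. A hypothetical pure-function restatement of the logic above, assuming a `base` of `/capella` as in this deployment:

```javascript
// Sketch of the link fixup rules: prefix local hrefs that lack the base
// path, and mark off-site links as external with a new tab target.
function fixLink(base, props) {
  const out = { ...props };
  if (out.href.startsWith('/') && !out.href.startsWith(base + '/')) {
    out.href = base + out.href; // local link missing the prefix
  }
  if (!out.href.startsWith('/') && !out.href.startsWith('#')) {
    out.rel = ['external', 'noopener']; // off-site link
    out.target = '_blank';
  }
  return out;
}

console.log(fixLink('/capella', { href: '/part2/building_blocks/ssz/' }).href);
// /capella/part2/building_blocks/ssz/
console.log(fixLink('/capella', { href: 'https://astro.build/' }).target);
// _blank
```

Note the ordering: once a local href has been prefixed it starts with `/`, so it can never match the external-link branch.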


@@ -0,0 +1,19 @@
import fs from 'fs';

// Write a .htaccess file to set the correct 404 page
function writeHtaccess(base, dir, logger) {
const file = dir.pathname + '.htaccess';
const contents = `ErrorDocument 404 ${base}/404.html\n`;
fs.writeFileSync(file, contents);
logger.info(`Wrote .htaccess file to ${file}`);
}
export default function(base) {
return {
name: 'myHtaccess',
hooks: {
'astro:build:done': ({ dir, logger }) => {
writeHtaccess(base, dir, logger);
},
},
};
}


@@ -0,0 +1,149 @@
import * as cheerio from 'cheerio';
import { unified } from 'unified';
import parse from 'rehype-parse';
import { toHtml } from 'hast-util-to-html';
import fs from 'fs';
// File scoped to accumulate the index across calls to mySearchIndex
const searchIndex = [];
function isExcludedFrontmatter (frontmatter, exclude) {
for (let i = 0; i < exclude.frontmatter.length; i++) {
const test = exclude.frontmatter[i];
const [key, ...rest] = Object.keys(test);
if (Object.prototype.hasOwnProperty.call(frontmatter, key)
&& frontmatter[key] == test[key]) {
return true;
}
}
return false;
}
// Concatenate all text in child nodes while respecting exclusions
function getText ($, node, exclude) {
return [...$(node).contents().not(exclude.ignore)]
.map(e => (e.type === 'text') ? e.data : getText($, e, exclude))
.join('');
}
// Recurse until we find an element we want to treat as a chunk, then get all its text content.
function getChunks ($, node, chunkTypes, exclude, counts) {
if (counts === undefined) {
counts = Array(chunkTypes.length).fill(0);
}
for (let idx = 0; idx < chunkTypes.length; idx++) {
const type = chunkTypes[idx];
if ($(node).is(type.query)) {
const text = getText($, node, exclude);
if (text !== '') {
const tagName = $(node).prop('tagName').toLowerCase();
let id = $(node).attr('id');
if ( id === undefined) {
id = tagName + '_' + counts[idx];
$(node).attr('id', id);
++counts[idx];
}
return [{
type: tagName,
label: type.label,
id: id,
text: text,
weight: type.weight === undefined ? 1 : type.weight,
}];
}
}
}
return [...$(node).children().not(exclude.ignore)]
.map(e => getChunks($, e, chunkTypes, exclude, counts))
.flat();
}
function includePage(frontmatter, exclude) {
return (frontmatter !== undefined
&& isExcludedFrontmatter(frontmatter, exclude) === false
&& !exclude.pages?.includes(frontmatter.path));
}
function buildSearchIndex(options) {
const { chunkTypes, exclude } = { ...options };
return function (tree, file) {
const frontmatter = file.data.astro.frontmatter;
if (includePage(frontmatter, exclude)) {
// console.log('Processing ' + frontmatter.path);
// We convert between HAST and Cheerio by going via a HTML string.
// TODO: avoid cheerio and just use unist-visit and related tools.
const $ = cheerio.load(toHtml(tree, {allowDangerousHtml: true}), null, false);
const chunks = getChunks($, $.root(), chunkTypes, exclude);
const pageIndexData = {
frontmatter: {
path: frontmatter.path,
titles: frontmatter.titles,
},
chunks: chunks,
};
searchIndex.push(pageIndexData);
return unified().use(parse, {fragment: true}).parse($.html());
} else {
// console.log('Ignoring ' + frontmatter.path);
}
}
}
function writeSearchIndex(dir, file, logger) {
const fileName = dir.pathname + file;
if (searchIndex.length) {
logger.info('Indexed ' + searchIndex.length + ' pages');
} else {
logger.warn('No pages were indexed');
}
fs.writeFileSync(fileName, JSON.stringify(searchIndex));
logger.info('Wrote search index to ' + fileName);
}
export default function(options) {
if (options.enabled === false) {
return {name: 'my-search-index'};
}
return {
name: 'mySearchIndex',
hooks: {
// We build the search index with rehype
'astro:config:setup': ({ updateConfig }) => {
updateConfig({
markdown: {
rehypePlugins: [
[buildSearchIndex, options],
],
},
});
},
// We write the search index to a file once the build is complete
'astro:build:done': ({ dir, logger }) => {
writeSearchIndex(dir, options.indexFile, logger);
},
},
};
}
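The index written above is a JSON array of per-page records. A hypothetical consumer, not part of this commit, might score matches like this, using the `weight` field (5 for headings, 1 otherwise) to rank chunks; the sample record is illustrative:

```javascript
// Sketch of querying the search index: each page record holds frontmatter
// plus text chunks, and `weight` boosts matches in headings.
const index = [
  {
    frontmatter: { path: '/part2/ssz/', titles: ['Part 2', 'SSZ'] },
    chunks: [
      { type: 'h3', label: 'Heading', id: 'serialisation', text: 'Serialisation', weight: 5 },
      { type: 'p', label: 'Paragraph', id: 'p_0', text: 'SSZ serialisation is deterministic.', weight: 1 },
    ],
  },
];

function search(index, term) {
  const t = term.toLowerCase();
  return index
    .flatMap(page => page.chunks
      .filter(c => c.text.toLowerCase().includes(t))
      .map(c => ({ path: page.frontmatter.path, id: c.id, score: c.weight })))
    .sort((a, b) => b.score - a.score);
}

const hits = search(index, 'serialisation');
console.log(hits[0]); // { path: '/part2/ssz/', id: 'serialisation', score: 5 }
```

Because every chunk is given an `id` during indexing, a hit can deep-link straight to the matching element on the page.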


@@ -0,0 +1,177 @@
import fs from 'fs';
import { visit } from 'unist-util-visit';
import { optimize } from 'svgo';
import { getHashDigest } from 'loader-utils';
import path from 'path';
// Inline SVG files into the Markdown AST
// SVGO doesn't really support adding elements, and the API changes.
// The below is based on code from the "reusePaths" plugin.
const addTitle = {
name: 'addTitle',
type: 'visitor',
active: true,
fn: (ast, params) => {
return {
element: {
exit: (node, parentNode) => {
if (node.name === 'svg' && parentNode.type === 'root') {
const hasTitle = node.children.some(
(child) => child.type === 'element' && child.name === 'title'
)
if (!hasTitle) {
const titleElement = {
type: 'element',
name: 'title',
attributes: {},
children: [],
}
Object.defineProperty(titleElement, 'parentNode', {
writable: true,
value: node,
});
const titleContents = {
type: 'text',
value: params.titleText,
}
Object.defineProperty(titleContents, 'parentNode', {
writable: true,
value: titleElement,
});
titleElement.children.push(titleContents)
node.children.unshift(titleElement);
}
}
},
},
}
},
}
// See https://www.npmjs.com/package/svgo
const plugins = [
'preset-default',
'prefixIds',
'removeDimensions',
'removeXMLNS',
{
name: 'addAttributesToSVGElement',
params: {attribute: {'role': 'img'}},
},
]
const addTitleSettings = {
name: addTitle.name,
type: addTitle.type,
active: addTitle.active,
fn: addTitle.fn,
params: undefined,
}
const addAttributes = {
name: 'addAttributesToSVGElement',
params: undefined,
}
function inlineSvg(options) {
const filePath = options.filePath || '';
const cachePathTmp = options.cachePath;
const cachePath = cachePathTmp.endsWith('/') ? cachePathTmp : cachePathTmp + '/';
const { logger, doCache} = options;
return function (tree) {
try {
visit(tree, 'paragraph', async node => {
if (node.children[0].type == 'image') {
const image = node.children[0];
if (image.url.endsWith('.svg')) {
const originalSvg = fs.readFileSync(filePath + image.url, 'utf8');
const basename = path.basename(image.url, '.svg');
// We need to distinguish multiple SVGs on the same page by using "prefixIds"
const digest = getHashDigest(basename, 'md5', 'base52', 4);
// Configure the SVGO addAttributes plugin to add an ID to SVG element
addAttributes['params'] = {attribute: {id: basename + "-svg"}};
// Configure our custom plugin that adds a title element
addTitleSettings['params'] = {titleText: image.alt};
// If the cachePath option is provided, we load the optimised SVG from there
// when it exists and is newer than the original SVG. If a cached version
// is not available or is older than the original SVG, we rewrite it.
const origMtime = fs.statSync(filePath + image.url).mtime;
const cacheFile = doCache ? cachePath + basename + '.svg' : null;
const goodCache = doCache
&& fs.existsSync(cacheFile)
&& (fs.statSync(cacheFile).mtime > origMtime);
let svg;
if (goodCache) {
svg = fs.readFileSync(cacheFile, 'utf8');
logger.debug(`Using cached ${basename}.svg`);
} else {
svg = optimize(
originalSvg,
{
path: digest,
plugins: plugins.concat([addTitleSettings, addAttributes])
}
).data;
logger.debug(`Optimising ${basename}.svg`);
if (doCache) {
fs.writeFileSync(cacheFile, svg);
logger.debug(`Caching ${basename}.svg`);
} else {
logger.debug(`Not caching ${basename}.svg`);
}
}
// Modify the current node in-place
node.type = 'html';
node.value = svg;
node.children = [];
}
}
})
} catch (err) {
console.error(err);
}
}
}
export default function(options) {
return {
name: 'mySvgInline',
hooks: {
'astro:config:setup': ({ updateConfig, logger }) => {
let doCache = false;
if (options.cachePath) {
try {
if (fs.statSync(options.cachePath).isDirectory()) {
doCache = true;
} else {
logger.warn(`Not caching SVGs: ${options.cachePath} is not a directory`);
}
} catch(e) {
logger.warn(`Not caching SVGs: ${options.cachePath} does not exist`);
}
} else {
logger.info('Not caching SVGs: no cachePath provided');
}
updateConfig({
markdown: {
remarkPlugins: [
[inlineSvg, { ...options, logger: logger, doCache: doCache }],
],
},
});
},
},
};
}
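For reference, an integration like this gets wired up in `astro.config.mjs`. The snippet below is a hypothetical sketch: the import path and the option values are assumptions for illustration, not taken from this commit.

```javascript
// astro.config.mjs (hypothetical wiring; paths are illustrative)
import { defineConfig } from 'astro/config';
import svgInline from './bin/build/svg-inline.js'; // assumed location of this file

export default defineConfig({
  integrations: [
    svgInline({ filePath: 'src/md/', cachePath: 'src/cache/' }),
  ],
});
```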

package-lock.json (generated): diff suppressed because it is too large.


@@ -2,53 +2,37 @@
"name": "upgrading-ethereum-book",
"version": "0.3.0",
"private": true,
"type": "module",
"description": "A technical handbook on Ethereum's move to proof of stake and beyond",
"author": "Ben Edgington",
"keywords": [],
"scripts": {
"devel": "gatsby develop",
"start": "gatsby develop",
"build": "gatsby build --prefix-paths",
"serve": "gatsby serve --prefix-paths",
"clean": "gatsby clean",
"check": "node --input-type=module -e 'import { runChecks } from \"./bin/build/prebuild.mjs\"; runChecks()'",
"devel": "astro dev",
"build": "astro build",
"minim": "rm -rf dist/ node_modules/.astro && UE_MINIMAL=t astro build",
"serve": "astro preview",
"clean": "rm -rf dist/ node_modules/.astro",
"debug": "DEBUG='astro:my*' astro build",
"astro": "astro",
"check": "node --input-type=module -e 'import runChecks from \"./bin/build/prebuild.js\"; runChecks()'",
"links": "bin/util/check_urls.sh src/book.md",
"spell": "bin/util/check_spellings_list.sh",
"spfix": "bin/util/update_spellings_list.sh",
"valid": "node -e 'require(\"./bin/util/validate\").validateHtml(\"public/index.html\")'",
"gramm": "bin/util/check_grammar.sh src/book.md",
"patch": "npx patch-package",
"pdfit": "bin/pdf/make_pdf src/book.md",
"stats": "bin/util/stats.sh",
"postinstall": "patch-package"
"valid": "node --input-type=module -e 'import validateHtml from \"./bin/util/validate.js\"; validateHtml(\"dist/part2/building_blocks/ssz/index.html\")'"
},
"dependencies": {
"cheerio": "^1.0.0-rc.12",
"gatsby": "^5.14.3",
"gatsby-plugin-catch-links": "^5.14.0",
"gatsby-plugin-htaccess": "^1.4.0",
"gatsby-plugin-matomo": "^0.17.0",
"gatsby-remark-autolink-headers": "^6.14.0",
"gatsby-remark-katex": "^7.14.0",
"gatsby-remark-numbered-footnotes": "^1.0.1",
"gatsby-remark-prismjs": "^7.14.0",
"gatsby-source-filesystem": "^5.14.0",
"gatsby-transformer-remark": "^6.14.0",
"lmdb": "^3.3.0",
"astro": "^5.7.13",
"axios": "^1.9.0",
"cheerio": "^1.0.0",
"glob": "^11.0.2",
"hast-util-to-string": "^3.0.1",
"loader-utils": "^3.3.1",
"markdownlint": "^0.38.0",
"msgpackr": "^1.11.2",
"patch-package": "^8.0.0",
"react": "^18.2.0",
"react-dom": "^18.2.0",
"svgo": "^4.0.0-rc.1",
"webpack": "^5.99.8"
},
"overrides": {
"babel-plugin-lodash": {
"@babel/types": "~7.20.0"
},
"react-server-dom-webpack@0.0.0-experimental-c8b778b7f-20220825": {
"react": "$react"
}
"rehype-katex": "^7.0.1",
"remark-math": "^6.0.0",
"svgo": "^4.0.0-rc.4"
}
}


@@ -1,22 +0,0 @@
"overrides": {
// See https://github.com/lodash/babel-plugin-lodash/issues/259#issuecomment-1438592335
//
// Gets rid of:
// warn `isModuleDeclaration` has been deprecated,
// please migrate to `isImportOrExportDeclaration`
"babel-plugin-lodash": {
"@babel/types": "~7.20.0"
},
// https://github.com/gatsbyjs/gatsby/issues/37242#issuecomment-1396620944
//
// Gets rid of this when using npm:
// ERESOLVE overriding peer dependency
// While resolving: react-server-dom-webpack@0.0.0-experimental-c8b778b7f-20220825
//
// This is ok as I am not using Partial Hydration.
"react-server-dom-webpack@0.0.0-experimental-c8b778b7f-20220825": {
"react": "$react"
}
}


@@ -1,13 +0,0 @@
diff --git a/node_modules/gatsby/dist/commands/serve.js b/node_modules/gatsby/dist/commands/serve.js
index cb6d5b7..89bebfe 100644
--- a/node_modules/gatsby/dist/commands/serve.js
+++ b/node_modules/gatsby/dist/commands/serve.js
@@ -248,7 +248,7 @@ module.exports = async program => {
console.log(` ${_chalk.default.bold(`Local:`)} ${urls.localUrlForTerminal}`);
console.log(` ${_chalk.default.bold(`On Your Network:`)} ${urls.lanUrlForTerminal}`);
} else {
- console.log(` ${urls.localUrlForTerminal}`);
+ console.log(` ${urls.localUrlForTerminal}` + pathPrefix.substring(1));
}
}
const startListening = () => {


@@ -1,61 +0,0 @@
diff --git a/node_modules/gatsby-remark-autolink-headers/gatsby-browser.js b/node_modules/gatsby-remark-autolink-headers/gatsby-browser.js
deleted file mode 100644
index 2f80613..0000000
--- a/node_modules/gatsby-remark-autolink-headers/gatsby-browser.js
+++ /dev/null
@@ -1,33 +0,0 @@
-"use strict";
-
-var offsetY = 0;
-var getTargetOffset = function getTargetOffset(hash) {
- var id = window.decodeURI(hash.replace("#", ""));
- if (id !== "") {
- var element = document.getElementById(id);
- if (element) {
- var scrollTop = window.pageYOffset || document.documentElement.scrollTop || document.body.scrollTop;
- var clientTop = document.documentElement.clientTop || document.body.clientTop || 0;
- var computedStyles = window.getComputedStyle(element);
- var scrollMarginTop = computedStyles.getPropertyValue("scroll-margin-top") || computedStyles.getPropertyValue("scroll-snap-margin-top") || "0px";
- return element.getBoundingClientRect().top + scrollTop - parseInt(scrollMarginTop, 10) - clientTop - offsetY;
- }
- }
- return null;
-};
-exports.onInitialClientRender = function (_, pluginOptions) {
- if (pluginOptions.offsetY) {
- offsetY = pluginOptions.offsetY;
- }
- requestAnimationFrame(function () {
- var offset = getTargetOffset(window.location.hash);
- if (offset !== null) {
- window.scrollTo(0, offset);
- }
- });
-};
-exports.shouldUpdateScroll = function (_ref) {
- var location = _ref.routerProps.location;
- var offset = getTargetOffset(location.hash);
- return offset !== null ? [0, offset] : true;
-};
\ No newline at end of file
diff --git a/node_modules/gatsby-remark-autolink-headers/gatsby-ssr.js b/node_modules/gatsby-remark-autolink-headers/gatsby-ssr.js
index df19ec0..e198855 100644
--- a/node_modules/gatsby-remark-autolink-headers/gatsby-ssr.js
+++ b/node_modules/gatsby-remark-autolink-headers/gatsby-ssr.js
@@ -15,15 +15,9 @@ exports.onRenderBody = function (_ref, pluginOptions) {
offsetY = _Object$assign.offsetY;
var styles = "\n ." + className + ".before {\n position: absolute;\n top: 0;\n left: 0;\n transform: translateX(-100%);\n padding-right: 4px;\n }\n ." + className + ".after {\n display: inline-block;\n padding-left: 4px;\n }\n h1 ." + className + " svg,\n h2 ." + className + " svg,\n h3 ." + className + " svg,\n h4 ." + className + " svg,\n h5 ." + className + " svg,\n h6 ." + className + " svg {\n visibility: hidden;\n }\n h1:hover ." + className + " svg,\n h2:hover ." + className + " svg,\n h3:hover ." + className + " svg,\n h4:hover ." + className + " svg,\n h5:hover ." + className + " svg,\n h6:hover ." + className + " svg,\n h1 ." + className + ":focus svg,\n h2 ." + className + ":focus svg,\n h3 ." + className + ":focus svg,\n h4 ." + className + ":focus svg,\n h5 ." + className + ":focus svg,\n h6 ." + className + ":focus svg {\n visibility: visible;\n }\n ";
- // This script used to have `let scrollTop` and `let clientTop` instead of
- // current ones which are `var`. It is changed due to incompatibility with
- // older browsers (that do not implement `let`). See:
- // - https://github.com/gatsbyjs/gatsby/issues/21058
- // - https://github.com/gatsbyjs/gatsby/pull/21083
- var script = "\n document.addEventListener(\"DOMContentLoaded\", function(event) {\n var hash = window.decodeURI(location.hash.replace('#', ''))\n if (hash !== '') {\n var element = document.getElementById(hash)\n if (element) {\n var scrollTop = window.pageYOffset || document.documentElement.scrollTop || document.body.scrollTop\n var clientTop = document.documentElement.clientTop || document.body.clientTop || 0\n var offset = element.getBoundingClientRect().top + scrollTop - clientTop\n // Wait for the browser to finish rendering before scrolling.\n setTimeout((function() {\n window.scrollTo(0, offset - " + offsetY + ")\n }), 0)\n }\n }\n })\n ";
+ var script = "\n";
var style = icon ? /*#__PURE__*/_react.default.createElement("style", {
key: "gatsby-remark-autolink-headers-style",
- type: "text/css"
}, styles) : undefined;
return setHeadComponents([style, /*#__PURE__*/_react.default.createElement("script", {
key: "gatsby-remark-autolink-headers-script",


@@ -1,154 +0,0 @@
const cheerio = require('cheerio')
/*
* Creates GraphQL nodes containing data for the local search
*/
// Concatenate all text in child nodes while respecting exclusions
const getText = ($, node, exclude) => {
return [...$(node).contents().not(exclude.ignore)]
.map(e => (e.type === 'text') ? e.data : getText($, e, exclude))
.join('')
}
// Recurse until we find an element we want to treat as a chunk, then get all its text content.
const getChunks = ($, node, chunkTypes, exclude, counts) => {
if (counts === undefined) {
counts = Array(chunkTypes.length).fill(0)
}
for (let idx = 0; idx < chunkTypes.length; idx++) {
const type = chunkTypes[idx]
if ($(node).is(type.query)) {
const text = getText($, node, exclude)
if (text !== '') {
const tagName = $(node).prop('tagName').toLowerCase()
let id = $(node).attr('id')
if ( id === undefined) {
id = tagName + '_' + counts[idx]
$(node).attr('id', id)
++counts[idx]
}
return [{
type: tagName,
label: type.label,
id: id,
text: text,
weight: type.weight === undefined ? 1 : type.weight,
}]
}
}
}
return [...$(node).children().not(exclude.ignore)]
.map(e => getChunks($, e, chunkTypes, exclude, counts))
.flat()
}
const isExcludedFrontmatter = (frontmatter, exclude) => {
for (let i = 0; i < exclude.frontmatter.length; i++) {
const test = exclude.frontmatter[i]
const [key, ...rest] = Object.keys(test)
if (Object.prototype.hasOwnProperty.call(frontmatter, key)
&& frontmatter[key] == test[key]) {
return true
}
}
return false
}
exports.createPages = async (
{
actions: { createNode },
graphql,
createNodeId,
createContentDigest,
}, pluginOptions,
) => {
const {
enabled = true,
chunkTypes = [],
exclude = {frontmatter: [], pages: [], ignore: ''},
} = pluginOptions
const result = await graphql(`
{
allMarkdownRemark {
edges {
node {
html
frontmatter {
path
index
sequence
titles
hide
}
}
}
}
}
`)
const pages = result.data.allMarkdownRemark.edges
await Promise.all(pages.map(async (page) => {
const $ = cheerio.load(page.node.html, null, false)
const frontmatter = page.node.frontmatter
let chunks = []
if (enabled
&& frontmatter !== undefined
&& isExcludedFrontmatter(frontmatter, exclude) === false
&& exclude.pages.indexOf(frontmatter.path) === -1) {
chunks = getChunks($, $.root(), chunkTypes, exclude)
}
// It seems to be hard to modify the underlying MarkdownRemark node's HTML, so we add
// the modified HTML to a new node and deal with it in the page template.
const nodeData = {
frontmatter: {
path: frontmatter.path,
titles: frontmatter.titles,
},
chunks: chunks,
html: $.html(),
}
createNode({
...nodeData,
id: createNodeId(nodeData.frontmatter.path),
internal: {
type: 'mySearchData',
contentDigest: createContentDigest(nodeData)
}
})
}))
}
exports.createSchemaCustomization = ({ actions: { createTypes } }) => {
createTypes(`
type Frontmatter {
path: String!
titles: [String]
}
type mySearchData implements Node {
frontmatter: Frontmatter!
chunks: JSON
html: String
}
`)
}


@@ -1 +0,0 @@
// no-op

View File

@@ -1,11 +0,0 @@
{
"name": "my-search-index",
"version": "1.0.0",
"description": "Build the search index",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Ben Edgington",
"license": "ISC"
}


@@ -1,19 +0,0 @@
const visit = require('unist-util-visit')
// Remove HTML comments from the Markdown AST
module.exports = ({ markdownAST }) => {
try {
visit(markdownAST, 'html', (node, index, parent) => {
if (node.value.startsWith('<!-- ')) {
parent.children.splice(index, 1)
return [visit.SKIP, index]
}
})
} catch (err) {
console.error(err)
}
return markdownAST
}
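The transformation here is simple enough to show in isolation. A minimal sketch without the remark/`unist-util-visit` machinery (the node shapes below just mimic the Markdown AST):

```javascript
// Drop any raw HTML node whose value starts with an HTML comment marker,
// as the plugin above does for the Markdown AST.
function stripHtmlComments(children) {
  return children.filter(node =>
    !(node.type === 'html' && node.value.startsWith('<!-- ')));
}
```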


@@ -1,11 +0,0 @@
{
"name": "my-strip-html-comments",
"version": "1.0.0",
"description": "Remove HTML comments from the Markdown AST",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Ben Edgington",
"license": "ISC"
}


@@ -1,133 +0,0 @@
const visit = require('unist-util-visit')
const svgo = require('svgo')
const fs = require('fs')
const path = require('path')
const { getHashDigest } = require('loader-utils')
const { md5 } = require('gatsby-core-utils')
// Inline SVG files into the Markdown AST
// SVGO doesn't really support adding elements, and the API changes.
// The below is based on code from the "reusePaths" plugin.
const addTitle = {
name: 'addTitle',
type: 'visitor',
active: true,
fn: (ast, params) => {
return {
element: {
exit: (node, parentNode) => {
if (node.name === 'svg' && parentNode.type === 'root') {
const hasTitle = node.children.some(
(child) => child.type === 'element' && child.name === 'title'
)
if (!hasTitle) {
const titleElement = {
type: 'element',
name: 'title',
attributes: {},
children: [],
}
Object.defineProperty(titleElement, 'parentNode', {
writable: true,
value: node,
});
const titleContents = {
type: 'text',
value: params.titleText,
}
Object.defineProperty(titleContents, 'parentNode', {
writable: true,
value: titleElement,
});
titleElement.children.push(titleContents)
node.children.unshift(titleElement);
}
}
},
},
}
},
}
// See https://www.npmjs.com/package/svgo
const plugins = [
'preset-default',
'prefixIds',
'removeDimensions',
'removeXMLNS',
{
name: 'addAttributesToSVGElement',
params: {attribute: {'role': 'img'}},
},
]
const addTitleSettings = {
name: addTitle.name,
type: addTitle.type,
active: addTitle.active,
fn: addTitle.fn,
params: undefined,
}
const addAttributes = {
name: 'addAttributesToSVGElement',
params: undefined,
}
module.exports = ({ markdownAST, cache }, pluginOptions) => {
try {
visit(markdownAST, 'paragraph', async node => {
if (node.children[0].type == 'image') {
const image = node.children[0]
if (image.url.endsWith('.svg')) {
const originalSvg = await fs.readFileSync(pluginOptions.directory + image.url, 'utf8')
const fileHash = await md5(originalSvg)
const basename = path.basename(image.url, '.svg')
var svgString
cache.get(fileHash).then(ret => {
if (ret) {
svgString = ret
} else {
// We need to distinguish multiple SVGs on the same page by using "prefixIds"
const digest = getHashDigest(basename, 'md5', 'base52', 4)
// Configure the SVGO addAttributes plugin to add an ID to SVG element
addAttributes['params'] = {attribute: {id: basename}}
// Configure our custom plugin that adds a title element
addTitleSettings['params'] = {titleText: image.alt}
const svg = svgo.optimize(
originalSvg,
{
path: digest,
plugins: plugins.concat([addTitleSettings, addAttributes])
}
)
svgString = svg.data
cache.set(fileHash, svgString)
}
// Modify the current node in-place
node.type = 'html'
node.value = svgString
node.children = []
})
}
}
})
} catch (err) {
console.error(err)
}
return markdownAST
}


@@ -1,11 +0,0 @@
{
"name": "my-svg-embed",
"version": "1.0.0",
"description": "Inline SVG files into the Markdown AST",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Ben Edgington",
"license": "ISC"
}


@@ -1,27 +0,0 @@
const visit = require('unist-util-visit')
const fs = require('fs')
module.exports = ({ markdownAST }, pluginOptions) => {
try {
const data = fs.readFileSync(pluginOptions.file, 'utf8')
const map = JSON.parse(data)
visit(markdownAST, 'inlineCode', (node, index, parent) => {
// HTML in headings causes problems for the page index, so skip these
if (parent.type !== 'heading') {
const text = node.value
const value = map[text]
if (value) {
node.type = 'html'
node.value = `<code title="${text} = ${value}">${text}</code>`
node.children = undefined
}
}
})
} catch (err) {
console.error(err)
}
return markdownAST
}
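The substitution itself can be sketched independently of the remark plumbing. In this sketch the constant name and value are just examples:

```javascript
// Wrap a known constant in a <code> element whose title shows its value,
// as the plugin above does for inlineCode nodes.
function addTooltip(text, map) {
  const value = map[text];
  return value ? `<code title="${text} = ${value}">${text}</code>` : null;
}
```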


@@ -1,11 +0,0 @@
{
"name": "my-tooltips",
"version": "1.0.0",
"description": "Add tooltips for constant values",
"main": "index.js",
"scripts": {
"test": "echo \"Error: no test specified\" && exit 1"
},
"author": "Ben Edgington",
"license": "ISC"
}


@@ -0,0 +1,12 @@
---
import "../css/banner.css"
// This will differ between the git branches for the different historical versions
const addBanner = false;
---
{addBanner ?
<div id="banner">
<p>This version is old (Capella). The latest book is <a href="/latest/">here</a>.</p>
</div>
: null}


@@ -0,0 +1,8 @@
---
import '../css/footer.css';
import { Metadata } from '../include/SiteConfig.js';
---
<footer>
<p>Created by {Metadata.author}. Licensed under <a href={Metadata.licenceUrl} target="_blank" rel="external noreferrer">{Metadata.licence}</a>. Published <span class="date">{Metadata.date}</span>. Commit <a class="githash" href={Metadata.gitUrl} target="_blank" rel="external noreferrer">{Metadata.gitHash}</a>.</p>
</footer>


@@ -0,0 +1,60 @@
---
import '../css/footnote-tooltips.css'
---
<script>
function addFootnoteTooltips() {
document.body.querySelectorAll('a[data-footnote-ref]').forEach( (fn) => {
// Avoid footnote labels we have already processed
if (fn.parentElement.querySelector('span.fn-span') === null) {
// Find the footnote
const footnoteId = fn.attributes.href.textContent.substring(1);
const footnote = document.getElementById(footnoteId);
if (!footnote) {
console.log('Failed to find footnote ID ' + footnoteId);
return;
}
// Create a new span element to hold the tooltip
const fnSpan = document.createElement('span');
fnSpan.className = 'fn-span';
for (let i = 0; i < footnote.childNodes.length; i++) {
fnSpan.appendChild(footnote.childNodes[i].cloneNode(true));
}
// Remove any backlinks (there should be exactly one)
const backlinks = fnSpan.getElementsByClassName('data-footnote-backref');
for (let i = 0; i < backlinks.length; i++) {
backlinks[i].parentElement.removeChild(backlinks[i]);
}
// Ensure the tooltip doesn't overflow the main content area
fnSpan.addEventListener("transitionstart", (event) => {
const span = event.target;
const padding = 16;
const mainRect = document.getElementById('main-content').getBoundingClientRect();
const fnRect = fn.getBoundingClientRect();
const spanRect = span.getBoundingClientRect();
const spanWidth = spanRect.right - spanRect.left;
var left = (fnRect.right + fnRect.left - spanWidth) / 2;
left = Math.max(left, mainRect.left + padding);
left = Math.min(left, mainRect.right - spanWidth - padding);
const zeroPos = fnRect.right;
span.style.left=`${left - zeroPos}px`;
})
fn.parentElement.append(fnSpan);
}
});
}
document.addEventListener('DOMContentLoaded', (e) => {
addFootnoteTooltips();
});
</script>
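The positioning logic in the `transitionstart` handler reduces to centring and clamping. A standalone sketch of that arithmetic (the coordinates in the test are illustrative):

```javascript
// Centre the tooltip under the footnote marker, then clamp it inside the
// main content area with some padding, as the handler above does.
function tooltipLeft(fnLeft, fnRight, spanWidth, mainLeft, mainRight, padding = 16) {
  let left = (fnRight + fnLeft - spanWidth) / 2;          // centred on the marker
  left = Math.max(left, mainLeft + padding);              // keep off the left edge
  left = Math.min(left, mainRight - spanWidth - padding); // and off the right edge
  return left - fnRight; // offset relative to the marker's right edge (zeroPos)
}
```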


@@ -0,0 +1,19 @@
---
// This is generated by the Matomo installation at eth2book.info and copied and pasted
// from there.
---
<script>
var _paq = window._paq = window._paq || [];
/* tracker methods like "setCustomDimension" should be called before "trackPageView" */
_paq.push(["setDomains", ["*.eth2book.info"]]);
_paq.push(['trackPageView']);
_paq.push(['enableLinkTracking']);
(function() {
var u="https://eth2book.info/matomo/";
_paq.push(['setTrackerUrl', u+'matomo.php']);
_paq.push(['setSiteId', '1']);
var d=document, g=d.createElement('script'), s=d.getElementsByTagName('script')[0];
g.async=true; g.src=u+'matomo.js'; s.parentNode.insertBefore(g,s);
})();
</script>


@@ -0,0 +1,45 @@
---
import { Metadata } from '../include/SiteConfig.js';
const { items, level, idx } = Astro.props;
function conditionalLink(to, noLink, isCurrent, children) {
const active = isCurrent ? ' class="index-active" aria-current="page"' : '';
const ret = noLink
? children
: `<a href=/${Metadata.version}${to}${active}>${children}</a>`;
return ret;
}
function renderNestedList(items, level, idx) {
let result = '';
let i = idx;
while (i < items.length) {
const item = items[i];
const labelSpan = item.label ? `<span class="label-string">${item.label}</span>` : '';
if (item.level === level) {
let nestedContent = '';
if (i + 1 < items.length && items[i + 1].level > level) {
nestedContent = renderNestedList(items, level + 1, i + 1);
}
result +=
`<li>` +
`${conditionalLink(item.link, item.hide, item.here, labelSpan + " " + item.title)}` +
`${nestedContent}` +
`</li>`;
// Skip child elements
i++;
while (i < items.length && items[i].level > level) i++;
} else {
break;
}
}
return `<ul>${result}</ul>`;
}
---
<Fragment set:html={renderNestedList(items, level, idx)} />
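The recursion above turns a flat array of `{ level, … }` items into nested lists. A trimmed-down sketch of the same skip-the-children recursion (links and labels omitted):

```javascript
// Build nested <ul> markup from a flat array of { level, title } items,
// as NestedList does above.
function renderNested(items, level, idx) {
  let result = '';
  let i = idx;
  while (i < items.length) {
    const item = items[i];
    if (item.level !== level) break;
    let nested = '';
    if (i + 1 < items.length && items[i + 1].level > level) {
      nested = renderNested(items, level + 1, i + 1);
    }
    result += `<li>${item.title}${nested}</li>`;
    i++;
    while (i < items.length && items[i].level > level) i++; // skip children
  }
  return `<ul>${result}</ul>`;
}
```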


@@ -0,0 +1,20 @@
---
import NestedList from './NestedList.astro';
const { pages, depth, here } = Astro.props;
const filteredPages = pages.filter(p => p.data.index !== null);
if (filteredPages.length === 0) return null;
// Make a flat array of list level and the list info
const layout = filteredPages.map(p => {return ({
level: p.data.index.length,
label: p.data.index.join("."),
title: p.data.titles[p.data.titles.length - 1],
link: p.data.path,
hide: p.data.hide,
here: p.data.sequence === here
})})
---
<NestedList items={layout} level={depth + 1} idx={0} />
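Each page's frontmatter index array maps directly to a list level and a dotted label. A sketch of that flattening (the sample data below is assumed, not from the book's frontmatter):

```javascript
// Flatten one page's index metadata into the shape NestedList expects,
// as the map in PageList above does.
function layoutEntry(index, titles) {
  return {
    level: index.length,              // depth in the book hierarchy
    label: index.join('.'),           // e.g. [2, 3, 1] becomes "2.3.1"
    title: titles[titles.length - 1], // the page's own title
  };
}
```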


@@ -0,0 +1,24 @@
---
import '../css/pagenavi.css';
import NestedList from './NestedList.astro';
const { page } = Astro.props;
const headings = page.rendered.metadata.headings;
if (!headings) {
return null;
}
const items = headings.filter(h => h.depth >= 2 && h.text !== 'Footnotes').map(h => {return ({
level: h.depth,
label: '',
title: h.text,
link: page.data.path + '#' + h.slug,
hide: false,
here: false
})});
---
<div id="page-navi" class="scrollable">
<NestedList items={items} level={2} idx={0} />
</div>


@@ -0,0 +1,24 @@
---
import '../css/prevnext.css';
import { Metadata } from '../include/SiteConfig.js';
import PrevNextLink from './PrevNextLink.astro';
const { seq, pages } = Astro.props;
if (seq === null) return null;
const prevPage = pages.filter(p => p.data.sequence === (seq - 1))[0];
const nextPage = pages.filter(p => p.data.sequence === (seq + 1))[0];
---
<div class="prevnext">
<span class="prev">
<PrevNextLink page={prevPage} rel="prev">Back</PrevNextLink>
</span>
<span class="contents">
<a href={`/${Metadata.version}/contents/`}>Contents</a>
</span>
<span class="next">
<PrevNextLink page={nextPage} rel="next">Next</PrevNextLink>
</span>
</div>


@@ -0,0 +1,14 @@
---
import { Metadata } from '../include/SiteConfig.js';
const {page, rel } = Astro.props;
if (page === null
|| page === undefined
|| page.data.sequence < 0
) return null;
let title = page.data.titles.join(" > ");
---
<a href={`/${Metadata.version}${page.data.path}`} title={title} rel={rel}><slot /></a>


@@ -1,62 +1,69 @@
require("prismjs/themes/prism-tomorrow.css")
require("katex/dist/katex.min.css")
---
exports.onClientEntry = () => {
---
<script>
/*
* Handle opening of <details> elements when printing
* Handle opening of details elements when printing
*/
// Open closed details elements for printing
window.addEventListener('beforeprint', () => {
const allDetails = document.body.querySelectorAll('details')
const allDetails = document.body.querySelectorAll('details');
for (let i = 0; i < allDetails.length; i++) {
if (allDetails[i].open) {
allDetails[i].dataset.open = '1'
allDetails[i].dataset.open = '1';
} else {
allDetails[i].setAttribute('open', '')
allDetails[i].setAttribute('open', '');
}
}
})
});
// After printing, close details elements not opened before
window.addEventListener('afterprint', () => {
const allDetails = document.body.querySelectorAll('details')
const allDetails = document.body.querySelectorAll('details');
for (let i = 0; i < allDetails.length; i++) {
if (allDetails[i].dataset.open) {
allDetails[i].dataset.open = ''
allDetails[i].dataset.open = '';
} else {
allDetails[i].removeAttribute('open')
allDetails[i].removeAttribute('open');
}
}
})
}
});
/*
/*
Awful hack to (partially) work around a pathological issue in Chrome whereby
a link to an anchor near the bottom of the page (such as a footnote) results
in most of the viewport being scrolled off the screen. This is only
recoverable if we keep a scrollbar on the page, which leads to other issues.
Anyway, The following detects the issue on initial page load and fixes
it. But all bets are off for any further loads or navigation.
Anyway, the following detects the issue on initial page load and fixes it.
But all bets are off for any further loads or navigation.
Example test link that shows the behaviour if this "fix" is turned off:
http://eth2book.info/latest/part2/incentives/rewards/#fn-2
http://eth2book.info/latest/preface/#fn-fn-ef-overreach
I've also brutalised gatsby-remark-autolink-headers (see
`patches/gatsby-remark-autolink-headers+*`) to remove its scrolling
functionality as that just makes things worse.
The issue does not occur on FireFox. If anyone knows how to fix it, please
The issue does not occur on Firefox(*). If anyone knows how to fix it, please
save my sanity by letting me know. I suspect it's down to something very
simple in CSS.
*/
exports.shouldUpdateScroll = () => {
if (window.pageYOffset !== 0) {
document.querySelector('html').scroll(0, -window.pageYOffset)
return false
(*) Actually, it occurs when navigating on the same page but not to a new page
on FF.
*/
async function resetWindowOffset() {
if (window.pageYOffset !== 0) {
document.querySelector('html').scroll(0, -window.pageYOffset);
}
}
return true
}
window.addEventListener('load', (e) => {
resetWindowOffset();
});
window.addEventListener('hashchange', (e) => {
resetWindowOffset();
});
</script>

src/components/Search.astro

@@ -0,0 +1,214 @@
---
import '../css/search.css';
import { Metadata, SearchOptions } from '../include/SiteConfig.js';
---
<div id="search-page">
<div id="search-parameters">
<input name="searchText" id="search-text" placeholder="Search" autofocus="" value="" />
<span id="checkbox">
<input type="checkbox" id="is-case-sensitive" />
<label for="is-case-sensitive">Case sensitive</label>
</span>
</div>
<div id="search-results">
<p>No results</p>
</div>
</div>
<script id="astro-metadata" data-prefix={Metadata.version} data-file={SearchOptions.indexFile}>
const logTimings = true;
let t0, t1, t2;
// Passed in from the Astro build via the `data-prefix` and `data-file` attributes
const { prefix, file } = document.getElementById('astro-metadata').dataset;
// Location of the search index file on the server
const searchIndexFile = `/${prefix}/${file}`;
// Save the search index in a module-scoped variable
let searchIndex = null;
// Load search index from the server if necessary
async function loadSearchIndex() {
if (searchIndex) {
return searchIndex;
}
try {
const cachedIndex = sessionStorage.getItem('searchIndex');
if (cachedIndex) {
searchIndex = JSON.parse(cachedIndex);
return searchIndex;
}
} catch(error) {
console.log('Failed to read index from session storage');
}
try {
if (logTimings) {
t0 = Date.now();
}
const response = await fetch(searchIndexFile);
if (logTimings) {
t1 = Date.now();
console.log('Fetched search index in ' + (t1 - t0) + 'ms');
}
searchIndex = await response.json();
if (logTimings) {
t2 = Date.now();
console.log('Deserialised index in ' + (t2 - t1) + 'ms');
}
try {
sessionStorage.setItem('searchIndex', JSON.stringify(searchIndex));
} catch(error) {
console.log('Failed to save index to session storage');
}
return searchIndex;
} catch (error) {
console.error('Failed to load search index:', error);
return null;
}
}
const escapeRegExp = (string) => {
return string.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
}
async function getSearchResults (query, index) {
if (query.searchText.length < 3) {
return [];
}
// Match the starts of words only. The "d" flag gives us the matching indices.
const regex = RegExp('(^|\\W|_)(' + escapeRegExp(query.searchText) + ')',
'gd' + (query.isCaseSensitive ? '' : 'i'));
const result = index.map( (node) => {
let pageScore = 0;
const matches = [];
for (let i = 0; i < node.chunks?.length; i++) {
let chunk = node.chunks[i];
let match;
const indices = [];
while ((match = regex.exec(chunk.text)) !== null) {
indices.push([match.indices[2][0], match.indices[2][1]]);
}
if (indices.length > 0) {
matches.push(
{
type: chunk.type,
label: chunk.label,
id: chunk.id,
text: chunk.text,
indices: indices,
}
);
pageScore += chunk.weight * indices.length;
}
}
return matches.length === 0 ? null : {
url: node.frontmatter.path,
title: node.frontmatter.titles.join(' | '),
matches: matches,
score: pageScore,
};
})
return result.filter(x => x).sort((a, b) => (b.score - a.score));
}
async function updateSearchResults(query, index) {
if (!index) {
return;
}
const results = await getSearchResults(query, index);
const pages = results.map((result) => {
const chunks = result.matches.map((match) => {
const matches = match.indices.map((indices, i) => {
return (match.text.substring((i === 0) ? 0 : match.indices[i-1][1], indices[0]))
+ `<span class='match-text'>${match.text.substring(indices[0], indices[1])}</span>`
+ ((i === match.indices.length - 1) ? match.text.substring(indices[1]) : '');
}).join('');
return (
`<li>
<a
href=/${prefix}${result.url + '#' + match.id}
class="label"
target="_blank"
rel="noreferrer"
>
${match.label}
</a>
<span class=${'chunk-text ' + match.type}>
${matches}
</span>
</li>`
);
}).join('');
return (
`<li>
<a
href=/${prefix}${result.url}
target="_blank"
rel="noreferrer"
>
${result.title}
</a>
<ul>
${chunks}
</ul>
</li>`
);
}).join('');
const resultsDiv = document.getElementById('search-results');
resultsDiv.innerHTML = results.length ? '<ul>' + pages + '</ul>' : '<p>No results</p>';
}
const searchTextInput = document.getElementById('search-text');
const caseSensitiveCheckbox = document.getElementById('is-case-sensitive');
// Clear the inputs on page reload
searchTextInput.value = '';
caseSensitiveCheckbox.checked = false;
async function doSearch () {
const query = {
searchText: searchTextInput.value,
isCaseSensitive: caseSensitiveCheckbox.checked,
}
const index = await loadSearchIndex();
updateSearchResults(query, index);
}
searchTextInput.addEventListener('input', async () => {
doSearch();
});
caseSensitiveCheckbox.addEventListener('change', async () => {
doSearch();
});
// Pre-fetch and cache the search index
document.addEventListener('DOMContentLoaded', (e) => {
// loadSearchIndex caches the index in searchIndex itself; assigning its
// promise here would overwrite the cached value.
loadSearchIndex();
});
</script>
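The word-start matching above relies on the regex `d` (indices) flag: group 2 holds the query text, and `match.indices[2]` gives its exact span in the chunk. A standalone sketch:

```javascript
// Find the [start, end) spans of query matches at word starts, using the
// same regex construction as getSearchResults above.
const escapeRegExp = (s) => s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');

function findWordStarts(text, query, caseSensitive = false) {
  const regex = RegExp('(^|\\W|_)(' + escapeRegExp(query) + ')',
    'gd' + (caseSensitive ? '' : 'i'));
  const spans = [];
  let match;
  while ((match = regex.exec(text)) !== null) {
    spans.push([match.indices[2][0], match.indices[2][1]]);
  }
  return spans;
}
```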


@@ -0,0 +1,29 @@
---
import '../css/sidebar.css';
import { Metadata } from '../include/SiteConfig.js';
import PageList from './PageList.astro';
const { pageIndex, pages, here } = Astro.props;
// List only parts and chapters and immediate children in the sidebar
const index = pageIndex ? pageIndex : [];
const filteredPages = index.length < 2
? pages.filter(p => p.data.index.length <= 2)
: pages.filter(p => p.data.index.length <= 2
|| (p.data.index.length === 3
&& p.data.index[0] === index[0]
&& p.data.index[1] === index[1]
)
);
---
<nav class="sidebar scrollable">
<div class="sidebar-title">
<a href={`/${Metadata.version}/`}>
{Metadata.title}
</a>
</div>
<div id="index">
<PageList pages={filteredPages} depth={0} here={here} />
</div>
</nav>
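The filter above keeps the sidebar compact: parts and chapters always appear, and sections only under the chapter being viewed. A sketch over bare index arrays:

```javascript
// Keep parts and chapters (index length <= 2) plus sections (length 3)
// that fall under the current chapter, as the Sidebar filter above does.
function sidebarFilter(indices, here) {
  return indices.filter(ix =>
    ix.length <= 2 ||
    (ix.length === 3 && ix[0] === here[0] && ix[1] === here[1]));
}
```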


@@ -0,0 +1,27 @@
---
import PageList from './PageList.astro';
import '../css/subsections.css';
const { indexArray, pages } = Astro.props;
let index = indexArray;
// Only add the auto index for Parts, not any deeper structure
if (index === undefined || index === null || index.length > 1) return null;
// Special hacky handling for the /contents page
if (index[0] === -1) {
index = [];
}
// Find pages that are subsections of the page we are on
const indexFilterString = index.length === 0 ? '' : index.join() + ',';
const filteredPages = pages.filter(p => p.data.index.join().startsWith(indexFilterString))
---
{filteredPages.length > 0
? <div class="subsection-list">
<PageList pages={filteredPages} depth={index.length} />
</div>
: null
}


@@ -1,56 +0,0 @@
import React from "react"
import { useStaticQuery, graphql } from "gatsby"
import "../css/banner.css"
const Banner = ({path}) => {
const addBanner = false
const data = useStaticQuery(graphql`
{
allMarkdownRemark {
edges {
node {
frontmatter {
path
hide
}
}
}
}
}
`)
const pages = data.allMarkdownRemark.edges
const hide = pages.filter(page => page.node.frontmatter.path === path)[0].node.frontmatter.hide
if (addBanner) {
if (hide) {
return (
<div id="banner">
<p>This version is old (Capella). The latest book is <a href="/latest/">here</a>.</p>
</div>
)
} else {
const newPage = "/latest" + path
return (
<div id="banner">
<p>This version is old (Capella). The latest book is <a href="/latest/">here</a>. An updated page may be <a href={newPage}>here</a>.</p>
</div>
)
}
} else {
return null
}
}
export default Banner


@@ -1,32 +0,0 @@
import React from "react"
import { useStaticQuery, graphql } from "gatsby"
import "../css/footer.css"
const Footer = () => {
const data = useStaticQuery(graphql`
{
site {
siteMetadata {
author
gitHash
gitUrl
date
licenceUrl
licence
}
}
}
`)
const meta = data.site.siteMetadata
return (
<footer>
<p>Created by {meta.author}. Licensed under <a href={meta.licenceUrl} target="_blank" rel="external noreferrer">{meta.licence}</a>. Published <span className="date">{meta.date}</span>. Commit <a className="githash" href={meta.gitUrl} target="_blank" rel="external noreferrer">{meta.gitHash}</a>.</p>
</footer>
)
}
export default Footer


@@ -1,57 +0,0 @@
import { Component } from 'react'
import '../css/footnote-tooltips.css'
// Modify the page to duplicate any footnotes into tooltips.
class FootnoteTooltips extends Component {
componentDidMount() {
document.body.querySelectorAll('.footnote-ref').forEach( (fn) => {
if (fn.parentElement.querySelector('span.fn-span') === null) {
// Find the footnote
const toolTip = document.body.querySelector(fn.attributes.href.textContent)
// Create a new span element to hold the tooltip
const fnSpan = document.createElement('span')
fnSpan.className = 'fn-span'
for (let i = 0; i < toolTip.childNodes.length; i++) {
fnSpan.appendChild(toolTip.childNodes[i].cloneNode(true))
}
// Remove any backlinks (there should be exactly one)
const backlinks = fnSpan.getElementsByClassName('footnote-backref')
for (let i = 0; i < backlinks.length; i++) {
backlinks[i].parentElement.removeChild(backlinks[i])
}
// Ensure the tooltip doesn't overflow the main content area
fnSpan.addEventListener("transitionstart", (event) => {
const span = event.target
const padding = 16;
const mainRect = document.getElementById('main-content').getBoundingClientRect()
const fnRect = fn.getBoundingClientRect()
const spanRect = span.getBoundingClientRect()
const spanWidth = spanRect.right - spanRect.left
var left = (fnRect.right + fnRect.left - spanWidth) / 2
left = Math.max(left, mainRect.left + padding)
left = Math.min(left, mainRect.right - spanWidth - padding)
const zeroPos = fnRect.right
span.style.left=`${left - zeroPos}px`
})
fn.parentElement.append(fnSpan)
}
})
}
render() {
return null
}
}
export default FootnoteTooltips
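The clamping arithmetic in the deleted component above (centre the tooltip under the footnote marker, then keep it inside the main content area) can be isolated as a pure function. This is a sketch for illustration only; the function name and sample coordinates are hypothetical:

```javascript
// Sketch of the tooltip positioning above: centre the span under the
// footnote marker, but clamp it inside the main content area.
function tooltipLeft(fnLeft, fnRight, spanWidth, mainLeft, mainRight, padding = 16) {
  let left = (fnRight + fnLeft - spanWidth) / 2;        // centred position
  left = Math.max(left, mainLeft + padding);            // don't overflow left
  left = Math.min(left, mainRight - spanWidth - padding); // don't overflow right
  return left - fnRight; // CSS offset relative to the marker's right edge
}

// A marker near the left edge gets pushed right to respect the padding.
console.log(tooltipLeft(10, 20, 300, 0, 800));
// A marker mid-page keeps the tooltip centred beneath it.
console.log(tooltipLeft(400, 410, 100, 0, 800));
```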


@@ -1,35 +0,0 @@
import React from "react"
import { Link } from "gatsby"
function ConditionalLink({to, children, nolink}) {
const ret = nolink
? <>{children}</>
: <Link to={to} activeClassName="index-active">{children}</Link>
return (ret)
}
export default function NestedList({idx, items, level}) {
var ret = []
var i = idx
while (i < items.length) {
const item = items[i]
const labelSpan = item.label.length === 0 ? <></> : <span className="label-string">{item.label}</span>
if (item.level === level) {
var foo = ""
if (i + 1 < items.length && items[i + 1].level > level) {
foo = <NestedList key={i + 1} items={items} level={level + 1} idx={i + 1} />
}
// See https://github.com/facebook/react/issues/14725#issuecomment-460378418
ret.push(
// <li key={i}><ConditionalLink to={item.link} nolink={item.hide}>{labelSpan} {item.title}</ConditionalLink>{foo}</li>
<li key={i}><ConditionalLink to={item.link} nolink={item.hide}>{labelSpan}{` ${item.title}`}</ConditionalLink>{foo}</li>
)
i++
while (i < items.length && items[i].level > level)
i++
} else {
break
}
}
return (<ul>{ret}</ul>)
}


@@ -1,24 +0,0 @@
import React from "react"
import NestedList from "./nestedlist"
// Format pages as a list according to their index data.
// Depth is the length of prefix to ignore
// We assume the pages are given to us in sorted order
const PageList = ({pages, depth}) => {
const filteredPages = pages.filter(p => p.node.frontmatter.index !== null)
if (filteredPages.length === 0) return null
// Make a flat array of list level and the list info
const layout = filteredPages.map(p => {return ({
level: p.node.frontmatter.index.length,
label: p.node.frontmatter.index.join("."),
title: p.node.frontmatter.titles[p.node.frontmatter.index.length - 1],
link: p.node.frontmatter.path,
hide: p.node.frontmatter.hide === true
})})
return (<NestedList items={layout} level={depth + 1} idx={0} />)
}
export default PageList


@@ -1,51 +0,0 @@
import React from "react"
import { useStaticQuery, graphql } from "gatsby"
import NestedList from "./nestedlist"
import "../css/pagenavi.css"
const PageNavi = ({path}) => {
const data = useStaticQuery(graphql`
{
allMarkdownRemark {
edges {
node {
headings {
value
depth
id
}
frontmatter {
path
}
}
}
}
}
`)
const pages = data.allMarkdownRemark.edges
const headings = pages.filter(page => page.node.frontmatter.path === path)[0].node.headings
if (headings.length === 0) {
return null
}
// console.log(JSON.stringify(thisPage.node.headings, undefined, 2))
const items = headings.filter(h => h.depth >= 2).map(h => {return ({
level: h.depth,
label: "",
title: h.value,
link: "#" + h.id,
hide: false
})})
return (
<div id="page-navi" className="scrollable">
<NestedList items={items} level={2} idx={0} />
</div>
)
}
export default PageNavi


@@ -1,59 +0,0 @@
import React from "react"
import { useStaticQuery, graphql, Link } from "gatsby"
import "../css/prevnext.css"
function PrevNextLink(props) {
if (props.page === null
|| props.page === undefined
|| props.page.node.frontmatter.sequence < 0
) return null
const f = props.page.node.frontmatter
let title = f.titles.filter(x => x !== "").join(" > ")
return(
<Link to={f.path} title={title} rel={props.rel}>{props.children}</Link>
)
}
const PrevNext = (props) => {
const data = useStaticQuery(graphql`
{
allMarkdownRemark {
edges {
node {
frontmatter {
path
titles
sequence
}
}
}
}
}
`)
if (props.seq === null) return null
const pages = data.allMarkdownRemark.edges
const prevPage = pages.filter(p => p.node.frontmatter.sequence === (props.seq - 1))[0]
const nextPage = pages.filter(p => p.node.frontmatter.sequence === (props.seq + 1))[0]
return (
<div className="prevnext">
<span className="prev">
<PrevNextLink page={prevPage} rel="prev">Back</PrevNextLink>
</span>
<span className="contents">
<Link to="/contents">Contents</Link>
</span>
<span className="next">
<PrevNextLink page={nextPage} rel="next">Next</PrevNextLink>
</span>
</div>
)
}
export default PrevNext


@@ -1,173 +0,0 @@
import React from 'react'
import { graphql, useStaticQuery, withPrefix } from 'gatsby'
import "../css/search.css"
const escapeRegExp = (string) => {
return string.replace(/[.*+?^${}()|[\]\\]/g, '\\$&')
}
const getSearchResults = (query, data) => {
if (query.searchText.length < 3) {
return []
}
// Match the starts of words only. The "d" flag gives us the matching indices.
const regex = RegExp('(^|\\W|_)(' + escapeRegExp(query.searchText) + ')',
'gd' + (query.isCaseSensitive ? '' : 'i'))
const result = data.map( ({ node }) => {
let pageScore = 0
const matches = []
for (let i = 0; i < node.chunks?.length; i++) {
let chunk = node.chunks[i]
let match
const indices = []
while ((match = regex.exec(chunk.text)) !== null) {
indices.push([match.indices[2][0], match.indices[2][1]])
}
if (indices.length > 0) {
matches.push(
{
type: chunk.type,
label: chunk.label,
id: chunk.id,
text: chunk.text,
indices: indices,
}
)
pageScore += chunk.weight * indices.length
}
}
return matches.length === 0 ? null : {
url: node.frontmatter.path,
title: node.frontmatter.titles.filter(x => x).join(' | '),
matches: matches,
score: pageScore,
}
})
return result.filter(x => x).sort((a, b) => (b.score - a.score))
}
const Search = () => {
const queryData = useStaticQuery(graphql`
query {
allMySearchData {
edges {
node {
frontmatter {
path
titles
}
chunks
}
}
}
}
`)
const searchData = queryData.allMySearchData.edges
const [searchQuery, setQuery] = React.useState({
searchText: '',
isCaseSensitive: false,
})
const setSearchText = (text) => {
setQuery(previousState => {
return { ...previousState, searchText: text }
});
}
const toggleIsCaseSensitive = () => {
setQuery(previousState => {
return { ...previousState, isCaseSensitive: !previousState.isCaseSensitive }
});
}
const results = getSearchResults(searchQuery, searchData)
const pages = results.map((result) => {
const chunks = result.matches.map((match) => {
const matches = match.indices.map((indices, i) => {
return [
match.text.substring((i === 0) ? 0 : match.indices[i-1][1], indices[0]),
<span className='match-text' key={indices[0]}>
{match.text.substring(indices[0], indices[1])}
</span>,
(i === match.indices.length - 1) ? match.text.substring(indices[1]) : '',
]
})
return (
<li key={result.url + match.id}>
<a
href={withPrefix(result.url + '#' + match.id)}
className="label"
target="_blank"
rel="noreferrer"
>
{match.label}
</a>
<span className={'chunk-text ' + match.type}>
{matches}
</span>
</li>
)
})
return (
<li key={result.url}>
<a
href={withPrefix(result.url)}
target="_blank"
rel="noreferrer"
>
{result.title}
</a>
<ul>
{chunks}
</ul>
</li>
)
})
return (
<div id="search-page">
<div id="search-parameters">
<input
name="searchText"
id="search-text"
value={searchQuery.searchText}
placeholder="Search"
autoFocus // eslint-disable-line jsx-a11y/no-autofocus
onChange={(event) => setSearchText(event.target.value)}
/>
<span id="checkbox">
<input
type="checkbox"
id="is-case-sensitive"
checked={searchQuery.isCaseSensitive}
onChange={toggleIsCaseSensitive}
/>
<label htmlFor="is-case-sensitive">Case sensitive</label>
</span>
</div>
<div id="search-results">
{results.length > 0 ? (
<ul>
{pages}
</ul>
) : (
<p>No results</p>
)}
</div>
</div>
)
}
export default Search
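The matcher in the deleted component above relies on the regex `d` flag to recover the start/end offsets of each word-initial match via `match.indices`. The technique in isolation looks like this (a sketch, not the component itself; `wordStartMatches` is a hypothetical helper):

```javascript
// Word-start matching with match offsets, as in getSearchResults above.
const escapeRegExp = (s) => s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');

function wordStartMatches(text, searchText, caseSensitive = false) {
  // Group 2 is the search text itself; the "d" flag exposes match.indices.
  const regex = RegExp('(^|\\W|_)(' + escapeRegExp(searchText) + ')',
    'gd' + (caseSensitive ? '' : 'i'));
  const indices = [];
  let match;
  while ((match = regex.exec(text)) !== null) {
    indices.push([match.indices[2][0], match.indices[2][1]]);
  }
  return indices;
}

// "cat" matches at word starts but not inside "concatenates".
console.log(wordStartMatches('The cat concatenates. Cat!', 'cat'));
```

Escaping the search text first means user input is always treated literally, never as regex syntax.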


@@ -1,65 +0,0 @@
import React from "react"
import { useStaticQuery, graphql, Link } from "gatsby"
import PageList from "./pagelist"
import "../css/sidebar.css"
const Sidebar = (props) => {
const data = useStaticQuery(graphql`
{
allMarkdownRemark(
sort: {frontmatter: {sequence: ASC}}
filter: {frontmatter: {index: {ne: null}}}
) {
edges {
node {
frontmatter {
hide
path
titles
index
sequence
}
}
}
}
site {
siteMetadata {
title
}
}
}
`)
const pages = data.allMarkdownRemark.edges
// List only parts and chapters and immediate children in the sidebar
const index = props.index !== null ? props.index : []
const filteredPages = index.length < 2
? pages.filter(p => p.node.frontmatter.index.length <= 2)
: pages.filter(p => p.node.frontmatter.index.length <= 2
|| (p.node.frontmatter.index.length === 3
&& p.node.frontmatter.index[0] === index[0]
&& p.node.frontmatter.index[1] === index[1]
)
)
// console.log(JSON.stringify(filteredPages, undefined, 2))
return (
<nav className="sidebar scrollable">
<div className="sidebar-title">
<Link
to="/"
>
{data.site.siteMetadata.title}
</Link>
</div>
<div id="index">
<PageList pages={filteredPages} depth={0} />
</div>
</nav>
)
}
export default Sidebar


@@ -1,55 +0,0 @@
import React from "react"
import { useStaticQuery, graphql } from "gatsby"
import PageList from "./pagelist"
import "../css/subsections.css"
// Format subsections of the page with index indexArray as a nested list
const Subsections = ({indexArray}) => {
const data = useStaticQuery(graphql`
{
allMarkdownRemark(
sort: {frontmatter: {sequence: ASC}}
filter: {frontmatter: {index: {ne: null}}}
) {
edges {
node {
frontmatter {
hide
path
titles
index
sequence
}
}
}
}
}
`)
// Only add the auto index for Parts, not any deeper structure
if (indexArray === null || indexArray.length > 1) return null
// Special hacky handling for the /contents page
if (indexArray[0] === -1) {
indexArray = []
}
// Find pages that are subsections of the page we are on
const pages = data.allMarkdownRemark.edges
const indexFilterString = indexArray.length === 0 ? "" : indexArray.join() + ","
const filteredPages = pages.filter(p => p.node.frontmatter.index.join().startsWith(indexFilterString))
if (filteredPages.length > 0) {
return (
<div className="subsection-list">
<PageList pages={filteredPages} depth={indexArray.length} />
</div>
)
} else {
return null
}
}
export default Subsections

34
src/content.config.js Normal file

@@ -0,0 +1,34 @@
import { defineCollection, z } from 'astro:content';
import { glob } from 'astro/loaders';
const minimal = import.meta.env.UE_MINIMAL ? true : false;
if (minimal) {
console.log('Building minimal configuration');
}
const pages = defineCollection({
loader: minimal
? glob({ pattern: "**/preface.md", base: "./src/md/pages" })
: glob({ pattern: "**/*.md", base: "./src/md/pages" }),
schema: z.object({
hide: z.boolean(),
path: z.string(),
titles: z.array(z.string()),
index: z.array(z.number()),
sequence: z.number(),
}),
});
const special = defineCollection({
loader: minimal
? glob({ pattern: "search.md", base: "./src/md" })
: glob({ pattern: "*.md", base: "./src/md" }),
schema: z.object({
path: z.string(),
titles: z.array(z.string()).optional(),
index: z.array(z.number()).optional(),
sequence: z.number().optional(),
}),
});
export const collections = { pages, special };


@@ -1,14 +1,21 @@
@import "custom.css";
sup[id^='fnref-'] {
sup:has(a[data-footnote-ref]) {
position: relative;
display: inline;
}
sup:hover span.fn-span {
visibility: visible;
opacity: 1;
z-index: 999;
}
.fn-span {
visibility: hidden;
font-family: 'Open Sans', sans-serif;
font-size: 80%;
font-weight: normal;
position: absolute;
top: 30px;
color: var(--fn-tt-foreground);
@@ -22,12 +29,11 @@ sup[id^='fnref-'] {
transition: opacity 0.8s;
}
sup[id^='fnref-']:hover span.fn-span {
visibility: visible;
opacity: 1;
z-index: 999;
.fn-span p {
padding: 0;
margin: 0;
}
@media print {
.fn-span {
display: none;


@@ -18,11 +18,16 @@ body {
background-color var(--transition);
}
/* A fix for PgUp/PgDn not working to scroll */
@media screen {
#gatsby-focus-wrapper {
display: contents;
}
/* Screen reader only content - leave it there but make it take no space */
.sr-only {
position: absolute;
width: 1px;
height: 1px;
padding: 0;
margin: -1px;
overflow: hidden;
clip-path: rect(0 0 0 0);
border: 0;
}
#page {
@@ -149,6 +154,28 @@ h6 {
h4 code, h5 code, h6 code {font-size: inherit; }
/*** Heading autolinks ***/
a.anchor {
position: relative;
}
a.anchor svg {
position: absolute;
top: 0;
right: 0;
padding-right: 4px;
padding-top: 4px;
opacity: 0;
transition: opacity 0.3s ease;
}
h1:hover svg, h2:hover svg, h3:hover svg, h4:hover svg, h5:hover svg, h6:hover svg {
opacity: 1;
}
/*** Others ***/
figure {
text-align: center;
padding: 0;
@@ -227,14 +254,31 @@ div.title-page {
text-align: center;
flex-direction: column;
justify-content: space-around;
font-weight: bold;
}
div.title-page h1, div.title-page h2, div.title-page h3, div.title-page h4 {
div.title-page .h1, div.title-page .h2, div.title-page .h3, div.title-page .h4 {
border: none;
padding: 0;
margin: 1ex 0;
}
div.title-page .h1 {
font-size: 40px;
}
div.title-page .h2 {
font-size: 30px;
}
div.title-page .h3 {
font-size: 26px;
}
div.title-page .h4 {
font-size: 24px;
}
div.title-page svg {
margin: 1ex auto;
}
@@ -300,23 +344,20 @@ pre[class^="language-"] code {
/*** Equations ***/
/* Keep inline equations the same size as surrounding text */
.math-inline .katex {
p span.katex {
font-size: inherit!important;
}
/* Control size of the displayed equations - this makes them a little smaller */
.math-display .katex {
.katex-display .katex {
font-size: 1em!important;
}
/*** Footnotes ***/
div.footnotes hr {
section.footnotes {
margin-top: 6ex;
border: 0;
border-top: solid 1px #ccc;
}
div.footnotes {
font-size: 85%;
}
@@ -336,7 +377,6 @@ svg path, svg g, svg use {
}
a.anchor svg path {
stroke: var(--foreground);
fill: var(--foreground);
}
@@ -427,7 +467,7 @@ div.title-page svg g#layer101 path {
a.anchor {
display: none;
}
a.footnote-backref {
a.data-footnote-backref {
display: none;
}
/* Undo code styling */


@@ -1,75 +0,0 @@
import React from "react"
import PropTypes from "prop-types"
import { withPrefix } from "gatsby"
export default function HTML(props) {
return (
<html lang="en-GB" {...props.htmlAttributes}>
<head>
<meta charSet="utf-8" />
<meta httpEquiv="x-ua-compatible" content="ie=edge" />
<meta
name="viewport"
content="width=device-width, initial-scale=1, shrink-to-fit=no"
/>
<meta property="og:site-name" content="Upgrading Ethereum" />
<meta property="og:image" content="https://eth2book.info/f/android-icon-192x192.png" />
<meta property="og:description" content="A technical handbook on Ethereum's move to proof of stake and beyond." />
<meta property="og:type" content="website" />
<meta property="og:locale" content="en_GB" />
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="eth2book.info" />
<meta name="twitter:creator" content="@benjaminion_xyz" />
<meta name="twitter:image" content="https://eth2book.info/f/android-icon-192x192.png" />
<meta name="description" content="A technical handbook on Ethereum's move to proof of stake and beyond." />
<link rel="apple-touch-icon" sizes="57x57" href="https://eth2book.info/f/apple-icon-57x57.png" />
<link rel="apple-touch-icon" sizes="60x60" href="https://eth2book.info/f/apple-icon-60x60.png" />
<link rel="apple-touch-icon" sizes="72x72" href="https://eth2book.info/f/apple-icon-72x72.png" />
<link rel="apple-touch-icon" sizes="76x76" href="https://eth2book.info/f/apple-icon-76x76.png" />
<link rel="apple-touch-icon" sizes="114x114" href="https://eth2book.info/f/apple-icon-114x114.png" />
<link rel="apple-touch-icon" sizes="120x120" href="https://eth2book.info/f/apple-icon-120x120.png" />
<link rel="apple-touch-icon" sizes="144x144" href="https://eth2book.info/f/apple-icon-144x144.png" />
<link rel="apple-touch-icon" sizes="152x152" href="https://eth2book.info/f/apple-icon-152x152.png" />
<link rel="apple-touch-icon" sizes="180x180" href="https://eth2book.info/f/apple-icon-180x180.png" />
<link rel="icon" type="image/png" sizes="192x192" href="https://eth2book.info/f/android-icon-192x192.png" />
<link rel="icon" type="image/png" sizes="32x32" href="https://eth2book.info/f/favicon-32x32.png" />
<link rel="icon" type="image/png" sizes="96x96" href="https://eth2book.info/f/favicon-96x96.png" />
<link rel="icon" type="image/png" sizes="16x16" href="https://eth2book.info/f/favicon-16x16.png" />
<link rel="manifest" href="https://eth2book.info/f/manifest.json" />
<meta name="msapplication-TileColor" content="#ffffff" />
<meta name="msapplication-TileImage" content="https://eth2book.info/f/ms-icon-144x144.png" />
<meta name="theme-color" content="#ffffff" />
{/* Dark mode stuff */}
<link rel="stylesheet" href={withPrefix('/dark_230103.css')} media="(prefers-color-scheme: dark)" />
<link rel="stylesheet" href={withPrefix('/light_230103.css')} media="(prefers-color-scheme: light)" />
<script type="module" src="https://eth2book.info/inc/dark-mode-toggle.js" />
{props.headComponents}
</head>
<body {...props.bodyAttributes}>
{props.preBodyComponents}
<div id="all-content">
<aside id="dark-mode-toggle">
<dark-mode-toggle permanent="true"></dark-mode-toggle>
</aside>
<div
key={`body`}
id="___gatsby"
dangerouslySetInnerHTML={{ __html: props.body }}
/>
</div>
{props.postBodyComponents}
</body>
</html>
)
}
HTML.propTypes = {
htmlAttributes: PropTypes.object,
headComponents: PropTypes.array,
bodyAttributes: PropTypes.object,
preBodyComponents: PropTypes.array,
body: PropTypes.string,
postBodyComponents: PropTypes.array,
}

61
src/include/SiteConfig.js Normal file

@@ -0,0 +1,61 @@
import { execSync } from 'child_process';
function getGitHash() {
try {
return execSync('git log -1 --format="%h" 2>/dev/null', {encoding: 'utf8'}).replace(/(\r\n|\n|\r)/, '')
} catch(e) {
return 'unknown'
}
}
function getGitBranch() {
try {
return execSync('git branch --show-current 2>/dev/null', {encoding: 'utf8'}).replace(/(\r\n|\n|\r)/, '');
} catch(e) {
return 'unknown';
}
}
const date = new Date().toISOString().substr(0, 16).replace('T', ' ') + ' UTC';
const version = getGitBranch();
const hostname = 'https://eth2book.info';
const canonical = hostname + '/latest';
const Metadata = {
title: 'Upgrading Ethereum',
description: 'A technical handbook on Ethereum\'s move to proof of stake and beyond',
author: 'Ben Edgington',
gitHash: getGitHash(),
gitUrl: 'https://github.com/benjaminion/upgrading-ethereum-book',
date: date,
licenceUrl: 'https://creativecommons.org/licenses/by-sa/4.0/',
licence: 'CC BY-SA 4.0',
hostname: hostname,
version: version,
canonical: canonical,
}
const SearchOptions = {
enabled: true,
// Matching elements have their text added to the index. First match wins.
indexFile: 'search-index.json',
chunkTypes: [
{query: 'figcaption', label: 'Figure caption'},
{query: 'section[data-footnotes] li', label: 'Footnote'},
{query: 'li', label: 'List item'},
{query: 'pre', label: 'Code'},
{query: 'table', label: 'Table'},
{query: 'h3, h4, h5, h6', label: 'Heading', weight: 5},
{query: 'p', label: 'Paragraph'},
],
exclude: {
// Note, only pages under src/md/pages have a "hide" property.
frontmatter: [{hide: true}],
// No point indexing these.
pages: ['/', '/404.html','/contents/','/search/', '/annotated-spec/'],
// Elements matching this query are ignored completely, including their text:
ignore: 'svg, details, mtable, mrow, [aria-hidden="true"], .footnote-ref',
}
};
export { Metadata, SearchOptions };
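The `date` string built near the top of this file trims the ISO timestamp to minute precision and swaps the `T` separator for a space. Extracted as a pure function for illustration (the `buildDate` name is hypothetical):

```javascript
// Sketch: the build timestamp format used for Metadata.date above.
// substr(0, 16) keeps "YYYY-MM-DDTHH:MM"; the "T" becomes a space.
function buildDate(now = new Date()) {
  return now.toISOString().substr(0, 16).replace('T', ' ') + ' UTC';
}

console.log(buildDate(new Date('2025-05-23T18:47:23Z')));
```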

54
src/layouts/Html.astro Normal file

@@ -0,0 +1,54 @@
---
import 'katex/dist/katex.min.css';
import 'prismjs/themes/prism-tomorrow.min.css'
import { Metadata } from '../include/SiteConfig.js';
import Matomo from '../components/Matomo.astro';
const { pageTitle, canonical, pageUrl } = Astro.props;
---
<html lang="en-GB">
<head>
<meta charSet="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<meta property="og:site-name" content="Upgrading Ethereum" />
<meta property="og:image" content="https://eth2book.info/f/android-icon-192x192.png" />
<meta property="og:description" content="A technical handbook on Ethereum's move to proof of stake and beyond." />
<meta property="og:type" content="website" />
<meta property="og:locale" content="en_GB" />
<meta property="og:title" content={pageTitle} />
<meta property="og:url" content={pageUrl} />
<meta name="twitter:card" content="summary" />
<meta name="twitter:site" content="eth2book.info" />
<meta name="twitter:creator" content="@benjaminion_xyz" />
<meta name="twitter:image" content="https://eth2book.info/f/android-icon-192x192.png" />
<meta name="description" content="A technical handbook on Ethereum's move to proof of stake and beyond." />
<link rel="apple-touch-icon" sizes="57x57" href="https://eth2book.info/f/apple-icon-57x57.png" />
<link rel="apple-touch-icon" sizes="60x60" href="https://eth2book.info/f/apple-icon-60x60.png" />
<link rel="apple-touch-icon" sizes="72x72" href="https://eth2book.info/f/apple-icon-72x72.png" />
<link rel="apple-touch-icon" sizes="76x76" href="https://eth2book.info/f/apple-icon-76x76.png" />
<link rel="apple-touch-icon" sizes="114x114" href="https://eth2book.info/f/apple-icon-114x114.png" />
<link rel="apple-touch-icon" sizes="120x120" href="https://eth2book.info/f/apple-icon-120x120.png" />
<link rel="apple-touch-icon" sizes="144x144" href="https://eth2book.info/f/apple-icon-144x144.png" />
<link rel="apple-touch-icon" sizes="152x152" href="https://eth2book.info/f/apple-icon-152x152.png" />
<link rel="apple-touch-icon" sizes="180x180" href="https://eth2book.info/f/apple-icon-180x180.png" />
<link rel="icon" type="image/png" sizes="192x192" href="https://eth2book.info/f/android-icon-192x192.png" />
<link rel="icon" type="image/png" sizes="32x32" href="https://eth2book.info/f/favicon-32x32.png" />
<link rel="icon" type="image/png" sizes="96x96" href="https://eth2book.info/f/favicon-96x96.png" />
<link rel="icon" type="image/png" sizes="16x16" href="https://eth2book.info/f/favicon-16x16.png" />
{/* Dark mode stuff */}
<link rel="stylesheet" href={`/${Metadata.version}/dark_230103.css`} media="(prefers-color-scheme: dark)" />
<link rel="stylesheet" href={`/${Metadata.version}/light_230103.css`} media="(prefers-color-scheme: light)" />
<script type="module" src="https://eth2book.info/inc/dark-mode-toggle.js" />
<link rel="canonical" href={canonical} />
<title>{pageTitle}</title>
<Matomo />
</head>
<body>
<div id="all-content">
<aside id="dark-mode-toggle">
<dark-mode-toggle permanent="true" />
</aside>
<slot />
</div>
</body>
</html>


@@ -1,5 +1,5 @@
---
path: /404.html
path: /404/
---
# Not Found


@@ -1,6 +1,6 @@
---
path: /contact/
titles: ["Contact me","",""]
titles: ["Contact me"]
index: [999]
sequence: 995
---


@@ -1,6 +1,6 @@
---
path: /contents/
titles: ["Contents","",""]
titles: ["Contents"]
index: [-1]
sequence: -1
---


@@ -1,6 +1,6 @@
---
path: /pdf/
titles: ["PDF","",""]
titles: ["PDF"]
index: [999]
sequence: 995
---


@@ -1,6 +1,6 @@
---
path: /search/
titles: ["Search","",""]
titles: ["Search"]
index: [-1]
sequence: -2
---

File diff suppressed because one or more lines are too long

75
src/pages/[...path].astro Normal file

@@ -0,0 +1,75 @@
---
import { getCollection, render } from 'astro:content';
import { Metadata } from '../include/SiteConfig.js';
import '../css/page.css';
import Html from '../layouts/Html.astro';
import Sidebar from '../components/Sidebar.astro';
import Subsections from '../components/Subsections.astro';
import PrevNext from '../components/PrevNext.astro';
import Footer from '../components/Footer.astro';
import PageNavi from '../components/PageNavi.astro';
import Search from '../components/Search.astro';
import FootnoteTooltips from '../components/FootnoteTooltips.astro';
import Scripts from '../components/Scripts.astro';
import Banner from '../components/Banner.astro';
export async function getStaticPaths() {
const pages = await getCollection('pages');
const special = await getCollection('special');
// Serve pages according to their frontmatter paths
return pages.concat(special).map((page) => ({
params: { path: page.data.path },
props: { page },
}));
}
//console.log(JSON.stringify(Astro, null, 2));
const allPages = (await getCollection('pages'))
.concat(await getCollection('special'))
.sort((a,b) => a.data.sequence - b.data.sequence);
const { page } = Astro.props;
const frontmatter = page.data;
const indexArray = frontmatter.index;
var pageTitle = Metadata.title;
if (frontmatter.titles !== undefined) {
const titles = frontmatter.titles.filter(x => x !== '');
const number = (indexArray.length >= 2) ? indexArray.join('.') : '';
pageTitle += ' | ' + number + ' ' + titles[titles.length - 1];
}
const pageUrl = Metadata.hostname + '/' + Metadata.version + frontmatter.path;
const canonical = Metadata.canonical + (frontmatter.hide ? '/' : frontmatter.path);
const path = frontmatter.path;
const index = frontmatter.index;
const here = frontmatter.sequence;
const { Content } = await render(page);
---
<Html pageTitle={pageTitle} canonical={canonical} pageUrl={pageUrl}>
<div id="page">
<Sidebar pageIndex={index} pages={allPages.filter(p => p.data.titles !== undefined)} here={here} />
<div id="main-content" class="scrollable">
<Banner />
<div id="padded-content">
<PrevNext seq={here} pages={allPages} />
<main>
<Content />
</main>
{path.startsWith('/search')
? <Search />
: <Subsections indexArray={index} pages={allPages.filter(p => p.data.index)} /> }
<Footer />
<PrevNext seq={here} pages={allPages} />
</div>
</div>
<PageNavi page={page} />
</div>
<FootnoteTooltips />
<Scripts />
</Html>
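The page-title logic in the frontmatter above (join the index as a section number when the page is at least two levels deep, then append the last non-empty title) can be sketched as a pure function. This is an illustration only; `makePageTitle` and the sample titles are hypothetical:

```javascript
// Sketch of the pageTitle construction in [...path].astro above.
function makePageTitle(siteTitle, titles, indexArray) {
  if (titles === undefined) return siteTitle;
  const nonEmpty = titles.filter(x => x !== '');
  // Only numbered (depth >= 2) pages get a "3.2"-style prefix.
  const number = (indexArray.length >= 2) ? indexArray.join('.') : '';
  return siteTitle + ' | ' + number + ' ' + nonEmpty[nonEmpty.length - 1];
}

// A chapter page gets "Site | 3.2 Chapter title".
console.log(makePageTitle('Upgrading Ethereum', ['Part title', 'Chapter title'], [3, 2]));
// Pages without titles frontmatter fall back to the site title.
console.log(makePageTitle('Upgrading Ethereum', undefined, []));
```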


@@ -1,121 +0,0 @@
import React from "react"
import { graphql } from "gatsby"
import cheerio from "cheerio"
import "../css/page.css"
import Banner from "../components/banner"
import Sidebar from "../components/sidebar"
import Subsections from "../components/subsections"
import PrevNext from "../components/prevnext"
import Footer from "../components/footer"
import PageNavi from "../components/pagenavi"
import FootnoteTooltips from "../components/footnote-tooltips"
import Search from "../components/search"
function postProcessHast($) {
// Remove `align` attributes from <td> and <th> elements - it's obsolete in HTML5
$('td[align]').removeAttr('align')
$('th[align]').removeAttr('align')
// columnspacing="" on <mtable> is not allowed
$('mtable[columnspacing=""]').removeAttr('columnspacing')
// Add target="_blank" and rel="external noopener" to external links
$('a[href^=http]').each(function (i, e) {
$(e).attr('target', '_blank')
$(e).attr('rel', 'external noopener')
})
return $
}
export function Head({ data }) {
const { markdownRemark, site } = data
const frontmatter = markdownRemark.frontmatter
const metadata = site.siteMetadata
const indexArray = frontmatter.index
var pageTitle = metadata.title
if (frontmatter.titles !== null) {
const titles = frontmatter.titles.filter(x => x !== '')
const number = (indexArray.length >= 2) ? indexArray.join('.') : ''
pageTitle += ' | ' + number + ' ' + titles[titles.length - 1]
}
const pageUrl = metadata.hostname + '/' + metadata.version + frontmatter.path
const canonical = metadata.canonical + (frontmatter.hide ? '/' : frontmatter.path)
return (
<>
<title>{pageTitle}</title>
<link rel="canonical" href={canonical} />
<meta property="og:title" content={pageTitle} />
<meta property="og:url" content={pageUrl} />
</>
)
}
export default function Template({ data }) {
const html = data.mySearchData.html
const frontmatter = data.markdownRemark.frontmatter
const indexArray = frontmatter.index
const path = frontmatter.path
const prevNext = <PrevNext seq={frontmatter.sequence} />
const pageExtras = path.startsWith('/search')
? <Search />
: <Subsections indexArray={indexArray} />
const htmlPostProcessed = postProcessHast(cheerio.load(html, null, false)).html()
return (
<React.StrictMode>
<div id="page">
<Sidebar index={frontmatter.index} />
<div id="main-content" className="scrollable">
<Banner path={path} />
<div id="padded-content">
{prevNext}
<main
dangerouslySetInnerHTML={{ __html: htmlPostProcessed }}
/>
{pageExtras}
<Footer />
{prevNext}
</div>
</div>
<PageNavi path={path} />
</div>
<FootnoteTooltips />
</React.StrictMode>
)
}
export const pageQuery = graphql`
query($path: String!) {
mySearchData(frontmatter: { path: { eq: $path } }) {
html
}
markdownRemark(frontmatter: { path: { eq: $path } }) {
frontmatter {
index
path
sequence
titles
hide
}
}
site {
siteMetadata {
title
hostname
version
canonical
}
}
}
`

5
tsconfig.json Normal file

@@ -0,0 +1,5 @@
{
"extends": "astro/tsconfigs/strict",
"include": [".astro/types.d.ts", "**/*"],
"exclude": ["dist"]
}