mirror of
https://github.com/foambubble/foam.git
synced 2026-01-10 14:38:13 -05:00
Compare commits
23 Commits
| SHA1 |
|---|
| cf70c97fd7 |
| 5d8b9756a9 |
| f63b7b2c9b |
| 875fde8a45 |
| 5ef292382e |
| 0fa0176e1b |
| d312eca3d1 |
| bff39be923 |
| 52c16b15df |
| f2a5c6c45a |
| 6c354a13d8 |
| 8826a2b358 |
| 761eeb7336 |
| 6d9a8305ce |
| 4ee065ff9a |
| df3cd90ce7 |
| 0af6c4de15 |
| 3d49146087 |
| 8cb2246278 |
| 72890d33cb |
| 70b11a921a |
| f457b14ec0 |
| bbb4801486 |
@@ -1184,6 +1184,15 @@
       "contributions": [
         "code"
       ]
     },
+    {
+      "login": "meestahp",
+      "name": "meestahp",
+      "avatar_url": "https://avatars.githubusercontent.com/u/177708514?v=4",
+      "profile": "https://github.com/meestahp",
+      "contributions": [
+        "code"
+      ]
+    }
   ],
   "contributorsPerLine": 7,
.claude/commands/prepare-pr.md (new file, 59 lines)
@@ -0,0 +1,59 @@
# Prepare PR Command

Analyze the current branch changes and generate:

- a PR title
- a PR description
- considerations for the developer before pushing the PR

Output the title and description ready to paste into GitHub.

## PR TITLE

Use format: `type(scope): description`

- type: feat/fix/refactor/perf/docs/chore
- Keep under 72 characters
- Be specific but brief

## PR DESCRIPTION

It should have these sections (use a paragraph per section, no need to title them).
CONSTRAINTS:

- 100-200 words total
- No file names or "updated X file" statements
- Active voice
- No filler or pleasantries
- Focus on WHAT and WHY, not HOW

### What Changed

List 2-4 changes grouped by DOMAIN, not files. Focus on:

- User-facing changes
- Architectural shifts
- API changes

Skip trivial updates (formatting, minor refactors).

### Why

One sentence explaining motivation (skip if obvious from title).

### Critical Notes

ONLY include if relevant:

- Breaking changes
- Performance impact
- Security implications
- New dependencies
- Required config/env changes
- Database migrations

If no critical notes exist, omit this section.

## Considerations

Run the `yarn lint` command and report any failures.
Also analyze the changeset, and act as a PR reviewer to provide comments about the changes.
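For illustration, a hypothetical output following the template above (grounded in a change that appears later in this compare view, the `foam.graph.onStartup` setting):

`feat(graph): add option to show the graph on startup`

Adds a setting to open the Foam graph automatically when the workspace loads, so users who navigate primarily through the graph no longer have to open it manually each session. The setting is opt-in and defaults to off, so existing behavior is unchanged.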
docs/CNAME (new file, 1 line)
@@ -0,0 +1 @@
foamnotes.com
@@ -279,6 +279,7 @@ Foam is an evolving project and we welcome contributions:
       <td align="center" valign="top" width="14.28%"><a href="https://github.com/s-jacob-powell"><img src="https://avatars.githubusercontent.com/u/109111499?v=4?s=60" width="60px;" alt="S. Jacob Powell"/><br /><sub><b>S. Jacob Powell</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=s-jacob-powell" title="Code">💻</a></td>
       <td align="center" valign="top" width="14.28%"><a href="https://github.com/figdavi"><img src="https://avatars.githubusercontent.com/u/99026991?v=4?s=60" width="60px;" alt="Davi Figueiredo"/><br /><sub><b>Davi Figueiredo</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=figdavi" title="Documentation">📖</a></td>
       <td align="center" valign="top" width="14.28%"><a href="https://github.com/ChThH"><img src="https://avatars.githubusercontent.com/u/9499483?v=4?s=60" width="60px;" alt="CT Hall"/><br /><sub><b>CT Hall</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=ChThH" title="Code">💻</a></td>
+      <td align="center" valign="top" width="14.28%"><a href="https://github.com/meestahp"><img src="https://avatars.githubusercontent.com/u/177708514?v=4?s=60" width="60px;" alt="meestahp"/><br /><sub><b>meestahp</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=meestahp" title="Code">💻</a></td>
     </tr>
   </tbody>
 </table>
@@ -1,6 +1,6 @@
 # Sync notes with source control

-Source control is a way to precicely manage the history and content of a directory of files.
+Source control is a way to precisely manage the history and content of a directory of files.
 Often used for program code, this feature is very useful for note taking as well.

 There are (too) many ways to commit your changes to source control:
@@ -4,5 +4,5 @@
   ],
   "npmClient": "yarn",
   "useWorkspaces": true,
-  "version": "0.29.0"
+  "version": "0.29.1"
 }
@@ -4,6 +4,17 @@ All notable changes to the "foam-vscode" extension will be documented in this file.

 Check [Keep a Changelog](http://keepachangelog.com/) for recommendations on how to structure this file.

+## 0.29.1
+
+Fixes and Improvements:
+
+- Load graph in code server (#1400)
+- Don't treat text in single brackets as links/placeholders if missing a ref (#1545, #1546)
+- Added option to show graph on startup (#1542)
+- Added include patterns for Foam notes (#1550, #1422)
+- Added support for emoji variants in tags (#1536, #1549)
+- Added support for wikilinks with aliases within tables (#1544, #1552)
+
 ## 0.29.0

 Fixes and Improvements:
@@ -1,9 +1,10 @@
 # Foam for VSCode

 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
 [](https://foambubble.github.io/join-discord/g)

 > You can join the Foam Community on the [Foam Discord](https://foambubble.github.io/join-discord/e)
@@ -8,7 +8,7 @@
     "type": "git"
   },
   "homepage": "https://github.com/foambubble/foam",
-  "version": "0.29.0",
+  "version": "0.29.1",
   "license": "MIT",
   "publisher": "foam",
   "engines": {
@@ -82,6 +82,13 @@
         "name": "Placeholders",
         "icon": "$(debug-disconnect)",
         "contextualTitle": "Foam"
       },
+      {
+        "when": "config.foam.experimental.ai",
+        "id": "foam-vscode.related-notes",
+        "name": "Related Notes (AI)",
+        "icon": "$(sparkle)",
+        "contextualTitle": "Foam"
+      }
     ]
   },
@@ -101,6 +108,21 @@
       {
         "view": "foam-vscode.placeholders",
         "contents": "No placeholders found for selected resource or workspace."
       },
+      {
+        "view": "foam-vscode.related-notes",
+        "contents": "Open a note to see related notes.",
+        "when": "config.foam.experimental.ai && foam.relatedNotes.state == 'no-note'"
+      },
+      {
+        "view": "foam-vscode.related-notes",
+        "contents": "Notes haven't been analyzed yet.\n[Analyze Notes](command:foam-vscode.build-embeddings)\nAnalyze your notes to discover similar content.",
+        "when": "config.foam.experimental.ai && foam.relatedNotes.state == 'no-embedding'"
+      },
+      {
+        "view": "foam-vscode.related-notes",
+        "contents": "No similar notes found for the current note.",
+        "when": "config.foam.experimental.ai && foam.relatedNotes.state == 'ready'"
+      }
     ],
     "menus": {
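The `foam.relatedNotes.state` key used in the `when` clauses above is a custom context key. As a sketch of how such a key is typically driven from the extension side (the exact call site is not part of this diff; this wiring is an assumption):

```ts
import * as vscode from 'vscode';

// Hypothetical wiring: update the custom context key so that the matching
// viewsWelcome entry above is shown for the related-notes view.
async function setRelatedNotesState(
  state: 'no-note' | 'no-embedding' | 'ready'
): Promise<void> {
  await vscode.commands.executeCommand(
    'setContext',
    'foam.relatedNotes.state',
    state
  );
}
```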
@@ -384,6 +406,16 @@
       "title": "Foam: Rename Tag",
       "icon": "$(edit)"
     },
+    {
+      "command": "foam-vscode.show-similar-notes",
+      "title": "Foam: Show Similar Notes",
+      "when": "config.foam.experimental.ai"
+    },
+    {
+      "command": "foam-vscode.build-embeddings",
+      "title": "Foam: Build Embeddings Index",
+      "when": "config.foam.experimental.ai"
+    },
     {
       "command": "foam-vscode.views.orphans.group-by:folder",
       "title": "Group By Folder",
@@ -549,7 +581,24 @@
       "**/_site/**/*",
       "**/node_modules/**/*"
     ],
-    "description": "Specifies the list of globs that will be ignored by Foam (e.g. they will not be considered when creating the graph). To ignore the all the content of a given folder, use `<folderName>/**/*`"
+    "description": "Specifies the list of globs that will be ignored by Foam (e.g. they will not be considered when creating the graph). To ignore all the content of a given folder, use `<folderName>/**/*`",
+    "deprecationMessage": "Use 'foam.files.exclude' instead. This setting will be removed in a future version."
   },
+  "foam.files.exclude": {
+    "type": [
+      "array"
+    ],
+    "default": [],
+    "description": "Specifies the list of globs that will be excluded by Foam (e.g. they will not be considered when creating the graph). To exclude all the content of a given folder, use `<folderName>/**/*`. This setting is combined with 'foam.files.ignore' (deprecated) and 'files.exclude'."
+  },
+  "foam.files.include": {
+    "type": [
+      "array"
+    ],
+    "default": [
+      "**/*"
+    ],
+    "description": "Specifies the list of glob patterns for files to include in Foam. Files must match at least one include pattern and not match any exclude patterns. Use this to limit Foam to specific directories (e.g., [\"notes/**\"]) or file types (e.g., [\"**/*.md\"]). Defaults to all files."
+  },
   "foam.files.attachmentExtensions": {
     "type": "string",
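A hypothetical user `settings.json` snippet showing how the new include/exclude settings from the hunk above could be combined (the glob values are illustrative):

```json
{
  "foam.files.include": ["notes/**"],
  "foam.files.exclude": ["notes/archive/**/*", "**/_site/**/*"]
}
```

With this, files must match at least one include pattern and no exclude pattern to be indexed by Foam.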
@@ -677,6 +726,7 @@
       "description": "Whether or not to navigate to the target daily note when a daily note snippet is selected."
     },
     "foam.preview.embedNoteType": {
+      "when": "config.foam.experimental.ai",
       "type": "string",
       "default": "full-card",
       "enum": [
@@ -701,6 +751,11 @@
       "type": "object",
       "description": "Custom graph styling settings. An example is present in the documentation.",
       "default": {}
     },
+    "foam.graph.onStartup": {
+      "type": "boolean",
+      "default": false,
+      "description": "Whether to open the graph on startup."
+    }
   }
 },
packages/foam-vscode/src/ai/model/embedding-cache.ts (new file, 17 lines)
@@ -0,0 +1,17 @@
import { URI } from '../../core/model/uri';
import { ICache } from '../../core/utils/cache';

type Checksum = string;

/**
 * Cache entry for embeddings
 */
export interface EmbeddingCacheEntry {
  checksum: Checksum;
  embedding: number[];
}

/**
 * Cache for embeddings, keyed by URI
 */
export type EmbeddingCache = ICache<URI, EmbeddingCacheEntry>;
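A minimal sketch of the lookup pattern this cache enables, mirroring the checksum guard used in `embeddings.ts` later in this diff (`hash` comes from `core/utils`; the helper name is illustrative):

```ts
import { hash } from '../../core/utils';
import { URI } from '../../core/model/uri';
import { EmbeddingCache } from './embedding-cache';

// Reuse a cached vector only when the note text is identical to what was
// embedded last time, as identified by its checksum.
function getCachedVector(
  cache: EmbeddingCache,
  uri: URI,
  text: string
): number[] | undefined {
  const checksum = hash(text);
  if (cache.has(uri)) {
    const entry = cache.get(uri);
    if (entry.checksum === checksum) {
      return entry.embedding;
    }
  }
  return undefined;
}
```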
packages/foam-vscode/src/ai/model/embeddings.test.ts (new file, 365 lines)
@@ -0,0 +1,365 @@
import { FoamEmbeddings } from './embeddings';
import {
  EmbeddingProvider,
  EmbeddingProviderInfo,
} from '../services/embedding-provider';
import {
  createTestWorkspace,
  InMemoryDataStore,
  waitForExpect,
} from '../../test/test-utils';
import { URI } from '../../core/model/uri';

// Helper to create a simple mock provider
class MockProvider implements EmbeddingProvider {
  async embed(text: string): Promise<number[]> {
    const vector = new Array(384).fill(0);
    vector[0] = text.length / 100; // Deterministic based on text length
    return vector;
  }
  async isAvailable(): Promise<boolean> {
    return true;
  }
  getProviderInfo(): EmbeddingProviderInfo {
    return {
      name: 'Test Provider',
      type: 'local',
      model: { name: 'test-model', dimensions: 384 },
    };
  }
}

const ROOT = [URI.parse('/', 'file')];

describe('FoamEmbeddings', () => {
  describe('cosineSimilarity', () => {
    it('should return 1 for identical vectors', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const vector = [1, 2, 3, 4, 5];
      const similarity = embeddings.cosineSimilarity(vector, vector);
      expect(similarity).toBeCloseTo(1.0, 5);
      workspace.dispose();
    });

    it('should return 0 for orthogonal vectors', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const vec1 = [1, 0, 0];
      const vec2 = [0, 1, 0];
      const similarity = embeddings.cosineSimilarity(vec1, vec2);
      expect(similarity).toBeCloseTo(0.0, 5);
      workspace.dispose();
    });

    it('should return -1 for opposite vectors', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const vec1 = [1, 0, 0];
      const vec2 = [-1, 0, 0];
      const similarity = embeddings.cosineSimilarity(vec1, vec2);
      expect(similarity).toBeCloseTo(-1.0, 5);
      workspace.dispose();
    });

    it('should return 0 for zero vectors', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const vec1 = [0, 0, 0];
      const vec2 = [1, 2, 3];
      const similarity = embeddings.cosineSimilarity(vec1, vec2);
      expect(similarity).toBe(0);
      workspace.dispose();
    });

    it('should throw error for vectors of different lengths', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const vec1 = [1, 2, 3];
      const vec2 = [1, 2];
      expect(() => embeddings.cosineSimilarity(vec1, vec2)).toThrow();
      workspace.dispose();
    });
  });

  describe('updateResource', () => {
    it('should create embedding for a resource', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      const noteUri = URI.parse('/path/to/note.md', 'file');
      datastore.set(noteUri, '# Test Note\n\nThis is test content');
      await workspace.fetchAndSet(noteUri);

      await embeddings.updateResource(noteUri);

      const embedding = embeddings.getEmbedding(noteUri);
      expect(embedding).not.toBeNull();
      expect(embedding?.length).toBe(384);

      workspace.dispose();
    });

    it('should remove embedding when resource is deleted', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      const noteUri = URI.parse('/path/to/note.md', 'file');
      datastore.set(noteUri, '# Test Note\n\nContent');
      await workspace.fetchAndSet(noteUri);

      await embeddings.updateResource(noteUri);
      expect(embeddings.getEmbedding(noteUri)).not.toBeNull();

      workspace.delete(noteUri);
      await embeddings.updateResource(noteUri);

      expect(embeddings.getEmbedding(noteUri)).toBeNull();

      workspace.dispose();
    });

    it('should create different embeddings for different content', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      const note1Uri = URI.parse('/note1.md', 'file');
      const note2Uri = URI.parse('/note2.md', 'file');

      // Same title, different content
      datastore.set(note1Uri, '# Same Title\n\nShort content');
      datastore.set(
        note2Uri,
        '# Same Title\n\nThis is much longer content that should produce a different embedding vector'
      );

      await workspace.fetchAndSet(note1Uri);
      await workspace.fetchAndSet(note2Uri);

      await embeddings.updateResource(note1Uri);
      await embeddings.updateResource(note2Uri);

      const embedding1 = embeddings.getEmbedding(note1Uri);
      const embedding2 = embeddings.getEmbedding(note2Uri);

      expect(embedding1).not.toBeNull();
      expect(embedding2).not.toBeNull();

      // Embeddings should be different because content is different
      // Our mock provider uses text.length for the first vector component
      expect(embedding1![0]).not.toBe(embedding2![0]);

      workspace.dispose();
    });
  });

  describe('hasEmbeddings', () => {
    it('should return false when no embeddings exist', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      expect(embeddings.hasEmbeddings()).toBe(false);
      workspace.dispose();
    });

    it('should return true when embeddings exist', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      const noteUri = URI.parse('/path/to/note.md', 'file');
      datastore.set(noteUri, '# Note\n\nContent');
      await workspace.fetchAndSet(noteUri);

      await embeddings.updateResource(noteUri);

      expect(embeddings.hasEmbeddings()).toBe(true);

      workspace.dispose();
    });
  });

  describe('getSimilar', () => {
    it('should return empty array when no embedding exists for target', () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());
      const uri = URI.parse('/path/to/note.md', 'file');

      const similar = embeddings.getSimilar(uri, 5);

      expect(similar).toEqual([]);
      workspace.dispose();
    });

    it('should return similar notes sorted by similarity', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      // Create notes with different content lengths
      const note1Uri = URI.parse('/note1.md', 'file');
      const note2Uri = URI.parse('/note2.md', 'file');
      const note3Uri = URI.parse('/note3.md', 'file');

      datastore.set(note1Uri, '# Note 1\n\nShort');
      datastore.set(note2Uri, '# Note 2\n\nMedium length text');
      datastore.set(note3Uri, '# Note 3\n\nVery long text content here');

      await workspace.fetchAndSet(note1Uri);
      await workspace.fetchAndSet(note2Uri);
      await workspace.fetchAndSet(note3Uri);

      await embeddings.updateResource(note1Uri);
      await embeddings.updateResource(note2Uri);
      await embeddings.updateResource(note3Uri);

      // Get similar to note2
      const similar = embeddings.getSimilar(note2Uri, 10);

      expect(similar.length).toBe(2); // Excludes self
      expect(similar[0].uri.path).toBeTruthy();
      expect(similar[0].similarity).toBeGreaterThanOrEqual(
        similar[1].similarity
      );

      workspace.dispose();
    });

    it('should respect topK parameter', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      // Create multiple notes
      for (let i = 0; i < 10; i++) {
        const noteUri = URI.parse(`/note${i}.md`, 'file');
        datastore.set(noteUri, `# Note ${i}\n\nContent ${i}`);
        await workspace.fetchAndSet(noteUri);
        await embeddings.updateResource(noteUri);
      }

      const target = URI.parse('/note0.md', 'file');
      const similar = embeddings.getSimilar(target, 3);

      expect(similar.length).toBe(3);

      workspace.dispose();
    });

    it('should not include self in similar results', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = new FoamEmbeddings(workspace, new MockProvider());

      const noteUri = URI.parse('/note.md', 'file');
      datastore.set(noteUri, '# Note\n\nContent');
      await workspace.fetchAndSet(noteUri);
      await embeddings.updateResource(noteUri);

      const similar = embeddings.getSimilar(noteUri, 10);

      expect(similar.find(s => s.uri.path === noteUri.path)).toBeUndefined();

      workspace.dispose();
    });
  });

  describe('fromWorkspace with monitoring', () => {
    it('should automatically update when resource is added', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const embeddings = FoamEmbeddings.fromWorkspace(
        workspace,
        new MockProvider(),
        true
      );

      const noteUri = URI.parse('/new-note.md', 'file');
      datastore.set(noteUri, '# New Note\n\nContent');
      await workspace.fetchAndSet(noteUri);

      // Give it a moment to process
      await new Promise(resolve => setTimeout(resolve, 100));

      const embedding = embeddings.getEmbedding(noteUri);
      expect(embedding).not.toBeNull();

      embeddings.dispose();
      workspace.dispose();
    });

    it('should automatically update when resource is modified', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const noteUri = URI.parse('/note.md', 'file');

      datastore.set(noteUri, '# Note\n\nOriginal content');
      await workspace.fetchAndSet(noteUri);

      const embeddings = FoamEmbeddings.fromWorkspace(
        workspace,
        new MockProvider(),
        true
      );

      await embeddings.updateResource(noteUri);
      const originalEmbedding = embeddings.getEmbedding(noteUri);

      // Update the content of the note to simulate a change
      datastore.set(noteUri, '# Note\n\nDifferent content that is much longer');

      // Trigger workspace update event
      await workspace.fetchAndSet(noteUri);

      // Wait for automatic update
      await waitForExpect(
        () => {
          const newEmbedding = embeddings.getEmbedding(noteUri);
          expect(newEmbedding).not.toEqual(originalEmbedding);
        },
        1000,
        50
      );

      embeddings.dispose();
      workspace.dispose();
    });

    it('should automatically remove embedding when resource is deleted', async () => {
      const datastore = new InMemoryDataStore();
      const workspace = createTestWorkspace(ROOT, datastore);
      const noteUri = URI.parse('/note.md', 'file');

      datastore.set(noteUri, '# Note\n\nContent');
      await workspace.fetchAndSet(noteUri);

      const embeddings = FoamEmbeddings.fromWorkspace(
        workspace,
        new MockProvider(),
        true
      );

      await embeddings.updateResource(noteUri);
      expect(embeddings.getEmbedding(noteUri)).not.toBeNull();

      workspace.delete(noteUri);

      // Give it a moment to process
      await new Promise(resolve => setTimeout(resolve, 50));

      expect(embeddings.getEmbedding(noteUri)).toBeNull();

      embeddings.dispose();
      workspace.dispose();
    });
  });
});
packages/foam-vscode/src/ai/model/embeddings.ts (new file, 382 lines)
@@ -0,0 +1,382 @@
import { Emitter } from '../../core/common/event';
import { IDisposable } from '../../core/common/lifecycle';
import { Logger } from '../../core/utils/log';
import { hash } from '../../core/utils';
import { EmbeddingProvider, Embedding } from '../services/embedding-provider';
import { EmbeddingCache } from './embedding-cache';
import {
  ProgressCallback,
  CancellationToken,
  CancellationError,
} from '../../core/services/progress';
import { FoamWorkspace } from '../../core/model/workspace';
import { URI } from '../../core/model/uri';

/**
 * Represents a similar resource with its similarity score
 */
export interface SimilarResource {
  uri: URI;
  similarity: number;
}

/**
 * Context information for embedding progress
 */
export interface EmbeddingProgressContext {
  /** URI of the current resource */
  uri: URI;
  /** Title of the current resource */
  title: string;
}

/**
 * Manages embeddings for all resources in the workspace
 */
export class FoamEmbeddings implements IDisposable {
  /**
   * Maps resource URIs to their embeddings
   */
  private embeddings: Map<string, Embedding> = new Map();

  private onDidUpdateEmitter = new Emitter<void>();
  onDidUpdate = this.onDidUpdateEmitter.event;

  /**
   * List of disposables to destroy with the embeddings
   */
  private disposables: IDisposable[] = [];

  constructor(
    private readonly workspace: FoamWorkspace,
    private readonly provider: EmbeddingProvider,
    private readonly cache?: EmbeddingCache
  ) {}

  /**
   * Get the embedding for a resource
   * @param uri The URI of the resource
   * @returns The embedding vector, or null if not found
   */
  public getEmbedding(uri: URI): number[] | null {
    const embedding = this.embeddings.get(uri.path);
    return embedding ? embedding.vector : null;
  }

  /**
   * Check if embeddings are available
   * @returns true if at least one embedding exists
   */
  public hasEmbeddings(): boolean {
    return this.embeddings.size > 0;
  }

  /**
   * Get the number of embeddings
   * @returns The count of embeddings
   */
  public size(): number {
    return this.embeddings.size;
  }

  /**
   * Find similar resources to a given resource
   * @param uri The URI of the target resource
   * @param topK The number of similar resources to return
   * @returns Array of similar resources sorted by similarity (highest first)
   */
  public getSimilar(uri: URI, topK: number = 10): SimilarResource[] {
    const targetEmbedding = this.getEmbedding(uri);
    if (!targetEmbedding) {
      return [];
    }

    const similarities: SimilarResource[] = [];

    for (const [path, embedding] of this.embeddings.entries()) {
      // Skip self
      if (path === uri.path) {
        continue;
      }

      const similarity = this.cosineSimilarity(
        targetEmbedding,
        embedding.vector
      );
      similarities.push({
        uri: URI.file(path),
        similarity,
      });
    }

    // Sort by similarity (highest first) and take top K
    similarities.sort((a, b) => b.similarity - a.similarity);
    return similarities.slice(0, topK);
  }

  /**
   * Calculate cosine similarity between two vectors
   * @param a First vector
   * @param b Second vector
   * @returns Similarity score between -1 and 1 (higher is more similar)
   */
  public cosineSimilarity(a: number[], b: number[]): number {
    if (a.length !== b.length) {
      throw new Error('Vectors must have the same length');
    }

    let dotProduct = 0;
    let normA = 0;
    let normB = 0;

    for (let i = 0; i < a.length; i++) {
      dotProduct += a[i] * b[i];
      normA += a[i] * a[i];
      normB += b[i] * b[i];
    }

    const denominator = Math.sqrt(normA) * Math.sqrt(normB);
    if (denominator === 0) {
      return 0;
    }

    return dotProduct / denominator;
  }

  /**
   * Update embeddings for a single resource
   * @param uri The URI of the resource to update
   * @returns The embedding vector, or null if not found/not processed
   */
  public async updateResource(uri: URI): Promise<Embedding | null> {
    const resource = this.workspace.find(uri);
    if (!resource) {
      // Resource deleted, remove embedding
      this.embeddings.delete(uri.path);
      if (this.cache) {
        this.cache.del(uri);
      }
      this.onDidUpdateEmitter.fire();
      return null;
    }

    // Skip non-note resources (attachments)
    if (resource.type !== 'note') {
      return null;
    }

    try {
      const content = await this.workspace.readAsMarkdown(resource.uri);
      const text = this.prepareTextForEmbedding(resource.title, content);
      const textChecksum = hash(text);

      // Check cache if available
      if (this.cache && this.cache.has(uri)) {
        const cached = this.cache.get(uri);
        if (cached.checksum === textChecksum) {
          Logger.debug(
            `Skipping embedding for ${uri.toFsPath()} - content unchanged`
          );
          // Use cached embedding
          const embedding: Embedding = {
            vector: cached.embedding,
            createdAt: Date.now(),
          };
          this.embeddings.set(uri.path, embedding);
          return embedding;
        }
      }

      // Generate new embedding
      const vector = await this.provider.embed(text);

      const embedding: Embedding = {
        vector,
        createdAt: Date.now(),
      };
      this.embeddings.set(uri.path, embedding);

      // Update cache
      if (this.cache) {
        this.cache.set(uri, {
          checksum: textChecksum,
          embedding: vector,
        });
      }

      this.onDidUpdateEmitter.fire();
      return embedding;
    } catch (error) {
      Logger.error(`Failed to update embedding for ${uri.toFsPath()}`, error);
      return null;
    }
  }

  /**
   * Update embeddings for all notes, processing only missing or stale ones
   * @param onProgress Optional callback to report progress
   * @param cancellationToken Optional token to cancel the operation
   * @returns Promise that resolves when all embeddings are updated
   * @throws CancellationError if the operation is cancelled
   */
  public async update(
    onProgress?: ProgressCallback<EmbeddingProgressContext>,
    cancellationToken?: CancellationToken
  ): Promise<void> {
    const start = Date.now();

    // Filter to only process notes (not attachments)
    const allResources = Array.from(this.workspace.resources());
    const resources = allResources.filter(r => r.type === 'note');

    Logger.info(
      `Building embeddings for ${resources.length} notes (${allResources.length} total resources)...`
    );

    let skipped = 0;
    let generated = 0;
    let reused = 0;

    // Process embeddings sequentially to avoid overwhelming the service
    for (let i = 0; i < resources.length; i++) {
      // Check for cancellation
      if (cancellationToken?.isCancellationRequested) {
        Logger.info(
          `Embedding build cancelled. Processed ${i}/${resources.length} notes.`
        );
        throw new CancellationError('Embedding build cancelled');
      }

      const resource = resources[i];

      onProgress?.({
        current: i + 1,
        total: resources.length,
        context: {
          uri: resource.uri,
          title: resource.title,
        },
      });

      try {
        const content = await this.workspace.readAsMarkdown(resource.uri);
        const text = this.prepareTextForEmbedding(resource.title, content);
        const textChecksum = hash(text);

        // Check cache if available
        if (this.cache && this.cache.has(resource.uri)) {
          const cached = this.cache.get(resource.uri);
          if (cached.checksum === textChecksum) {
            // Check if we already have this embedding in memory
            const existing = this.embeddings.get(resource.uri.path);
            if (existing) {
              // Already have current embedding, skip
              reused++;
              continue;
            }

            // Restore from cache
            this.embeddings.set(resource.uri.path, {
              vector: cached.embedding,
              createdAt: Date.now(),
            });
            skipped++;
            continue;
          }
        }

        // Generate new embedding
        const vector = await this.provider.embed(text);
        this.embeddings.set(resource.uri.path, {
          vector,
          createdAt: Date.now(),
        });

        // Update cache
        if (this.cache) {
          this.cache.set(resource.uri, {
            checksum: textChecksum,
            embedding: vector,
          });
        }

        generated++;
      } catch (error) {
        Logger.error(
          `Failed to generate embedding for ${resource.uri.toFsPath()}`,
          error
        );
      }
    }

    const end = Date.now();
    Logger.info(
      `Embeddings update complete: ${generated} generated, ${skipped} from cache, ${reused} already current (${
        this.embeddings.size
      }/${resources.length} total) in ${end - start}ms`
    );
    this.onDidUpdateEmitter.fire();
  }

  /**
   * Prepare text for embedding by combining title and content
   * @param title The title of the note
   * @param content The markdown content of the note
   * @returns The combined text to embed
   */
  private prepareTextForEmbedding(title: string, content: string): string {
    const parts: string[] = [];

    if (title) {
      parts.push(title);
    }

    if (content) {
      parts.push(content);
    }

    return parts.join('\n\n');
  }

  /**
   * Create FoamEmbeddings from a workspace
   * @param workspace The workspace to generate embeddings for
   * @param provider The embedding provider to use
   * @param keepMonitoring Whether to automatically update embeddings when workspace changes
   * @param cache Optional cache for storing embeddings
   * @returns The FoamEmbeddings instance
   */
  public static fromWorkspace(
    workspace: FoamWorkspace,
    provider: EmbeddingProvider,
    keepMonitoring: boolean = false,
    cache?: EmbeddingCache
  ): FoamEmbeddings {
    const embeddings = new FoamEmbeddings(workspace, provider, cache);

    if (keepMonitoring) {
      // Update embeddings when resources change
      embeddings.disposables.push(
        workspace.onDidAdd(resource => {
          embeddings.updateResource(resource.uri);
        }),
        workspace.onDidUpdate(({ new: resource }) => {
          embeddings.updateResource(resource.uri);
        }),
        workspace.onDidDelete(resource => {
          embeddings.embeddings.delete(resource.uri.path);
          embeddings.onDidUpdateEmitter.fire();
        })
      );
    }

    return embeddings;
  }

  public dispose(): void {
    this.onDidUpdateEmitter.dispose();
    this.disposables.forEach(d => d.dispose());
    this.disposables = [];
    this.embeddings.clear();
  }
}
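A short usage sketch of the API above (the note path is illustrative; the similarity metric is cosine similarity, dot(a, b) / (‖a‖ · ‖b‖), as implemented by `cosineSimilarity`):

```ts
import { FoamEmbeddings } from './embeddings';
import { OllamaEmbeddingProvider } from '../providers/ollama/ollama-provider';
import { FoamWorkspace } from '../../core/model/workspace';
import { URI } from '../../core/model/uri';

async function showRelated(workspace: FoamWorkspace): Promise<void> {
  // keepMonitoring = true keeps embeddings in sync with workspace changes.
  const embeddings = FoamEmbeddings.fromWorkspace(
    workspace,
    new OllamaEmbeddingProvider(),
    true
  );

  await embeddings.update(); // embed every note, reusing cached vectors

  const target = URI.parse('/inbox/some-note.md', 'file'); // illustrative path
  for (const { uri, similarity } of embeddings.getSimilar(target, 5)) {
    console.log(`${uri.path}: ${similarity.toFixed(3)}`);
  }

  embeddings.dispose();
}
```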
@@ -0,0 +1,29 @@
import { URI } from '../../core/model/uri';
import { EmbeddingCache, EmbeddingCacheEntry } from './embedding-cache';

/**
 * Simple in-memory implementation of embedding cache
 */
export class InMemoryEmbeddingCache implements EmbeddingCache {
  private cache: Map<string, EmbeddingCacheEntry> = new Map();

  get(uri: URI): EmbeddingCacheEntry {
    return this.cache.get(uri.toString());
  }

  has(uri: URI): boolean {
    return this.cache.has(uri.toString());
  }

  set(uri: URI, entry: EmbeddingCacheEntry): void {
    this.cache.set(uri.toString(), entry);
  }

  del(uri: URI): void {
    this.cache.delete(uri.toString());
  }

  clear(): void {
    this.cache.clear();
  }
}
@@ -0,0 +1,294 @@
import { Logger } from '../../../core/utils/log';
import {
  OllamaEmbeddingProvider,
  DEFAULT_OLLAMA_CONFIG,
} from './ollama-provider';

Logger.setLevel('error');

describe('OllamaEmbeddingProvider', () => {
  const originalFetch = global.fetch;
  beforeEach(() => {
    global.fetch = jest.fn();
    jest.clearAllMocks();
    jest.useFakeTimers();
  });

  afterEach(() => {
    jest.useRealTimers();
    global.fetch = originalFetch;
  });

  describe('constructor', () => {
    it('should use default config when no config provided', () => {
      const provider = new OllamaEmbeddingProvider();
      const config = provider.getConfig();

      expect(config.url).toBe(DEFAULT_OLLAMA_CONFIG.url);
      expect(config.model).toBe(DEFAULT_OLLAMA_CONFIG.model);
      expect(config.timeout).toBe(DEFAULT_OLLAMA_CONFIG.timeout);
    });

    it('should merge custom config with defaults', () => {
      const provider = new OllamaEmbeddingProvider({
        url: 'http://custom:11434',
      });
      const config = provider.getConfig();

      expect(config.url).toBe('http://custom:11434');
      expect(config.model).toBe(DEFAULT_OLLAMA_CONFIG.model);
    });
  });

  describe('getProviderInfo', () => {
    it('should return provider information', () => {
      const provider = new OllamaEmbeddingProvider();
      const info = provider.getProviderInfo();

      expect(info.name).toBe('Ollama');
      expect(info.type).toBe('local');
      expect(info.model.name).toBe('nomic-embed-text');
      expect(info.model.dimensions).toBe(768);
      expect(info.endpoint).toBe('http://localhost:11434');
      expect(info.description).toBe('Local embedding provider using Ollama');
      expect(info.metadata).toEqual({ timeout: 30000 });
    });

    it('should return custom model name when configured', () => {
      const provider = new OllamaEmbeddingProvider({
        model: 'custom-model',
      });
      const info = provider.getProviderInfo();

      expect(info.model.name).toBe('custom-model');
    });

    it('should return custom endpoint when configured', () => {
      const provider = new OllamaEmbeddingProvider({
        url: 'http://custom:8080',
      });
      const info = provider.getProviderInfo();

      expect(info.endpoint).toBe('http://custom:8080');
    });
  });

  describe('embed', () => {
    it('should successfully generate embeddings', async () => {
      const mockEmbedding = new Array(768).fill(0.1);
      (global.fetch as jest.Mock).mockResolvedValueOnce({
        ok: true,
        json: async () => ({ embeddings: [mockEmbedding] }),
      });

      const provider = new OllamaEmbeddingProvider();
      const result = await provider.embed('test text');

      expect(result).toEqual(mockEmbedding);
      expect(global.fetch).toHaveBeenCalledWith(
        'http://localhost:11434/api/embed',
        expect.objectContaining({
          method: 'POST',
          headers: { 'Content-Type': 'application/json' },
          body: JSON.stringify({
            model: 'nomic-embed-text',
            input: ['test text'],
          }),
        })
      );
    });

    it('should throw error on non-ok response', async () => {
      (global.fetch as jest.Mock).mockResolvedValueOnce({
        ok: false,
        status: 500,
        text: async () => 'Internal server error',
      });

      const provider = new OllamaEmbeddingProvider();

      await expect(provider.embed('test')).rejects.toThrow(
        'AI service error (500)'
      );
    });

    it('should throw error on connection refused', async () => {
      (global.fetch as jest.Mock).mockRejectedValueOnce(
        new Error('fetch failed')
      );

      const provider = new OllamaEmbeddingProvider();

      await expect(provider.embed('test')).rejects.toThrow(
        'Cannot connect to Ollama'
      );
    });

    it('should timeout after configured duration', async () => {
      (global.fetch as jest.Mock).mockImplementationOnce(
        (_url, options) =>
          new Promise((_resolve, reject) => {
            // Simulate abort signal being triggered
            options.signal.addEventListener('abort', () => {
              const error = new Error('The operation was aborted');
              error.name = 'AbortError';
              reject(error);
            });
          })
      );

      const provider = new OllamaEmbeddingProvider({ timeout: 1000 });
      const embedPromise = provider.embed('test');

      // Fast-forward time to trigger timeout
      jest.advanceTimersByTime(1001);

      await expect(embedPromise).rejects.toThrow('AI service took too long');
    });
  });

  describe('isAvailable', () => {
    it('should return true when Ollama is available', async () => {
      (global.fetch as jest.Mock).mockResolvedValueOnce({
        ok: true,
      });

      const provider = new OllamaEmbeddingProvider();
      const result = await provider.isAvailable();

      expect(result).toBe(true);
      expect(global.fetch).toHaveBeenCalledWith(
        'http://localhost:11434/api/tags',
        expect.objectContaining({
          method: 'GET',
        })
      );
    });

    it('should return false when Ollama is not available', async () => {
      (global.fetch as jest.Mock).mockRejectedValueOnce(
        new Error('Connection refused')
      );

      const provider = new OllamaEmbeddingProvider();
      const result = await provider.isAvailable();

      expect(result).toBe(false);
    });

    it('should return false when Ollama returns non-ok status', async () => {
      (global.fetch as jest.Mock).mockResolvedValueOnce({
        ok: false,
        status: 404,
      });

      const provider = new OllamaEmbeddingProvider();
      const result = await provider.isAvailable();

      expect(result).toBe(false);
    });

    it('should timeout quickly (5s) when checking availability', async () => {
      (global.fetch as jest.Mock).mockImplementationOnce(
        (_url, options) =>
          new Promise((_resolve, reject) => {
            // Simulate abort signal being triggered
            options.signal.addEventListener('abort', () => {
              const error = new Error('The operation was aborted');
              error.name = 'AbortError';
              reject(error);
            });
          })
      );

      const provider = new OllamaEmbeddingProvider();
      const availabilityPromise = provider.isAvailable();

      // Fast-forward time to trigger timeout (5s for availability check)
      jest.advanceTimersByTime(5001);

      const result = await availabilityPromise;
      expect(result).toBe(false);
    });
  });
});

describe('OllamaEmbeddingProvider - Integration', () => {
  const provider = new OllamaEmbeddingProvider();

  it('should handle text with unicode characters and emojis', async () => {
    if (!(await provider.isAvailable())) {
      console.warn('Ollama is not available, skipping test');
      return;
    }
    const text = 'Task completed ✔ 🚀: All systems go! 🌟';
    const embedding = await provider.embed(text);

    expect(embedding).toBeDefined();
    expect(Array.isArray(embedding)).toBe(true);
    expect(embedding.length).toBe(768); // nomic-embed-text dimension
    expect(embedding.every(n => typeof n === 'number')).toBe(true);
  });

  it('should handle text with various unicode characters', async () => {
    if (!(await provider.isAvailable())) {
      console.warn('Ollama is not available, skipping test');
      return;
    }
    const text = 'Hello 🌍 with émojis and spëcial çharacters • bullet ✓ check';
    const embedding = await provider.embed(text);

    expect(embedding).toBeDefined();
    expect(Array.isArray(embedding)).toBe(true);
    expect(embedding.length).toBe(768);
  });

  it('should handle text with combining unicode characters', async () => {
    if (!(await provider.isAvailable())) {
      console.warn('Ollama is not available, skipping test');
      return;
    }
    // Test with combining diacriticals that could be represented differently
    const text = 'café vs cafe\u0301'; // Two ways to represent é
    const embedding = await provider.embed(text);

    expect(embedding).toBeDefined();
    expect(Array.isArray(embedding)).toBe(true);
    expect(embedding.length).toBe(768);
  });

  it('should handle empty text', async () => {
    if (!(await provider.isAvailable())) {
      console.warn('Ollama is not available, skipping test');
      return;
    }
    const text = '';
    const embedding = await provider.embed(text);

    expect(embedding).toBeDefined();
    expect(Array.isArray(embedding)).toBe(true);
    // Note: Ollama returns empty array for empty text
    expect(embedding.length).toBeGreaterThanOrEqual(0);
  });

  it.each([10, 50, 60, 100, 300])(
    'should handle text of various lengths',
    async length => {
      if (!(await provider.isAvailable())) {
        console.warn('Ollama is not available, skipping test');
        return;
      }
      const text = 'Lorem ipsum dolor sit amet. '.repeat(length);
      try {
        const embedding = await provider.embed(text);
        expect(embedding).toBeDefined();
        expect(Array.isArray(embedding)).toBe(true);
        expect(embedding.length).toBe(768);
      } catch (error) {
        throw new Error(
          `Embedding failed for text of length ${text.length}: ${error}`
        );
      }
    }
  );
});
packages/foam-vscode/src/ai/providers/ollama/ollama-provider.ts (new file, 166 lines)
@@ -0,0 +1,166 @@
import {
  EmbeddingProvider,
  EmbeddingProviderInfo,
} from '../../services/embedding-provider';
import { Logger } from '../../../core/utils/log';

/**
 * Configuration for Ollama embedding provider
 */
export interface OllamaConfig {
  /** Base URL for Ollama API (default: http://localhost:11434) */
  url: string;
  /** Model name to use for embeddings (default: nomic-embed-text) */
  model: string;
  /** Request timeout in milliseconds (default: 30000) */
  timeout: number;
}

/**
 * Default configuration for Ollama
 */
export const DEFAULT_OLLAMA_CONFIG: OllamaConfig = {
  url: 'http://localhost:11434',
  model: 'nomic-embed-text',
  timeout: 30000,
};

/**
 * Ollama API response for embeddings
 */
interface OllamaEmbeddingResponse {
  embeddings: number[][];
}

/**
 * Embedding provider that uses Ollama for generating embeddings
 */
export class OllamaEmbeddingProvider implements EmbeddingProvider {
  private config: OllamaConfig;

  constructor(config: Partial<OllamaConfig> = {}) {
    this.config = { ...DEFAULT_OLLAMA_CONFIG, ...config };
  }

  /**
   * Generate an embedding for the given text
   */
  async embed(text: string): Promise<number[]> {
    // normalize text to suitable input (format and size)
    // TODO we should better handle long texts by chunking them and averaging embeddings
    const input = text.substring(0, 6000).normalize();

    try {
      const controller = new AbortController();
      const timeoutId = setTimeout(
        () => controller.abort(),
        this.config.timeout
      );

      const response = await fetch(`${this.config.url}/api/embed`, {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          model: this.config.model,
          input: [input],
        }),
        signal: controller.signal,
      });

      clearTimeout(timeoutId);

      if (!response.ok) {
        const errorText = await response.text();
        throw new Error(`AI service error (${response.status}): ${errorText}`);
      }

      const data = await response.json();
      if (data.embeddings == null) {
        throw new Error(
          `Invalid response from AI service: ${JSON.stringify(data)}`
        );
      }
      return data.embeddings[0];
    } catch (error) {
      if (error instanceof Error) {
        if (error.name === 'AbortError') {
          throw new Error(
            'AI service took too long to respond. It may be busy processing another request.'
          );
        }
        if (
          error.message.includes('fetch') ||
          error.message.includes('ECONNREFUSED')
        ) {
          throw new Error(
            `Cannot connect to Ollama at ${this.config.url}. Make sure Ollama is installed and running.`
          );
        }
      }
      throw error;
    }
  }

  /**
   * Check if Ollama is available and the model is accessible
   */
  async isAvailable(): Promise<boolean> {
    try {
      const controller = new AbortController();
      const timeoutId = setTimeout(() => controller.abort(), 5000);

      // Try to reach the Ollama API
      const response = await fetch(`${this.config.url}/api/tags`, {
        method: 'GET',
        signal: controller.signal,
      });

      clearTimeout(timeoutId);

      if (!response.ok) {
        Logger.warn(
          `Ollama API returned status ${response.status} when checking availability`
        );
        return false;
      }

      return true;
    } catch (error) {
      Logger.debug(
        `Ollama not available at ${this.config.url}: ${
          error instanceof Error ? error.message : 'Unknown error'
        }`
      );
      return false;
    }
  }

  /**
   * Get provider information including model details
   */
  getProviderInfo(): EmbeddingProviderInfo {
    return {
      name: 'Ollama',
      type: 'local',
      model: {
        name: this.config.model,
        // nomic-embed-text produces 768-dimensional embeddings
        dimensions: 768,
      },
      description: 'Local embedding provider using Ollama',
      endpoint: this.config.url,
      metadata: {
        timeout: this.config.timeout,
      },
    };
  }

  /**
   * Get current configuration
   */
  getConfig(): OllamaConfig {
    return { ...this.config };
  }
}
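A minimal usage sketch for the provider above (defaults point at a local Ollama instance; the log messages are illustrative):

```ts
import { OllamaEmbeddingProvider } from './ollama-provider';

async function demo(): Promise<void> {
  // Uses DEFAULT_OLLAMA_CONFIG: http://localhost:11434, nomic-embed-text, 30s timeout.
  const provider = new OllamaEmbeddingProvider();

  if (await provider.isAvailable()) {
    const vector = await provider.embed('Hello, Foam!');
    // nomic-embed-text produces 768-dimensional vectors
    console.log(`Got ${vector.length}-dimensional embedding`);
  } else {
    console.warn('Ollama is not running; embeddings are unavailable.');
  }
}
```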
packages/foam-vscode/src/ai/services/embedding-provider.ts (new file, 61 lines)
@@ -0,0 +1,61 @@
/**
 * Information about an embedding provider and its model
 */
export interface EmbeddingProviderInfo {
  /** Human-readable name of the provider (e.g., "Ollama", "OpenAI") */
  name: string;

  /** Type of provider */
  type: 'local' | 'remote';

  /** Model information */
  model: {
    /** Model name (e.g., "nomic-embed-text", "text-embedding-3-small") */
    name: string;
    /** Vector dimensions */
    dimensions: number;
  };

  /** Optional description of the provider */
  description?: string;

  /** Backend endpoint/URL if applicable */
  endpoint?: string;

  /** Additional provider-specific metadata */
  metadata?: Record<string, unknown>;
}

/**
 * Provider interface for generating text embeddings
 */
export interface EmbeddingProvider {
  /**
   * Generate an embedding vector for the given text
   * @param text The text to embed
   * @returns A promise that resolves to the embedding vector
   */
  embed(text: string): Promise<number[]>;

  /**
   * Check if the embedding service is available and ready to use
   * @returns A promise that resolves to true if available, false otherwise
   */
  isAvailable(): Promise<boolean>;

  /**
   * Get information about the provider and its model
   * @returns Provider metadata including name, type, model info, and configuration
   */
  getProviderInfo(): EmbeddingProviderInfo;
}

/**
 * Represents a text embedding with metadata
 */
export interface Embedding {
  /** The embedding vector */
  vector: number[];
  /** Timestamp when the embedding was created */
  createdAt: number;
}
@@ -0,0 +1,27 @@
import { EmbeddingProvider, EmbeddingProviderInfo } from './embedding-provider';

/**
 * A no-op embedding provider that does nothing.
 * Used when no real embedding provider is available.
 */
export class NoOpEmbeddingProvider implements EmbeddingProvider {
  async embed(_text: string): Promise<number[]> {
    return [];
  }

  async isAvailable(): Promise<boolean> {
    return false;
  }

  getProviderInfo(): EmbeddingProviderInfo {
    return {
      name: 'None',
      type: 'local',
      model: {
        name: 'none',
        dimensions: 0,
      },
      description: 'No embedding provider configured',
    };
  }
}
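A hedged sketch of how a no-op provider like this is typically used as a fallback; this wiring is an assumption (not shown in the diff), and the no-op provider's file path is guessed:

```ts
import { OllamaEmbeddingProvider } from '../providers/ollama/ollama-provider';
import { EmbeddingProvider } from './embedding-provider';
// Assumed module path; the diff does not show this file's name.
import { NoOpEmbeddingProvider } from './noop-embedding-provider';

async function selectProvider(): Promise<EmbeddingProvider> {
  const ollama = new OllamaEmbeddingProvider();
  // Fall back to the no-op provider when Ollama is not reachable, so the
  // rest of the extension can treat "no provider" uniformly.
  return (await ollama.isAvailable()) ? ollama : new NoOpEmbeddingProvider();
}
```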
@@ -0,0 +1,88 @@
/* @unit-ready */
import * as vscode from 'vscode';
import {
  cleanWorkspace,
  createFile,
  deleteFile,
  waitForNoteInFoamWorkspace,
} from '../../../test/test-utils-vscode';
import { BUILD_EMBEDDINGS_COMMAND } from './build-embeddings';

describe('build-embeddings command', () => {
  it('should complete successfully with no notes to analyze', async () => {
    await cleanWorkspace();

    const showInfoSpy = jest
      .spyOn(vscode.window, 'showInformationMessage')
      .mockResolvedValue(undefined);

    const result = await vscode.commands.executeCommand<
      'complete' | 'cancelled' | 'error'
    >(BUILD_EMBEDDINGS_COMMAND.command);

    expect(result).toBe('complete');
    expect(showInfoSpy).toHaveBeenCalledWith(
      expect.stringContaining('No notes found')
    );

    showInfoSpy.mockRestore();
  });

  it('should analyze notes and report completion', async () => {
    const note1 = await createFile('# Note 1\nContent here', ['note1.md']);
    const note2 = await createFile('# Note 2\nMore content', ['note2.md']);

    await waitForNoteInFoamWorkspace(note1.uri);
    await waitForNoteInFoamWorkspace(note2.uri);

    const showInfoSpy = jest
      .spyOn(vscode.window, 'showInformationMessage')
      .mockResolvedValue(undefined);

    const result = await vscode.commands.executeCommand<
      'complete' | 'cancelled' | 'error'
    >(BUILD_EMBEDDINGS_COMMAND.command);

    expect(result).toBe('complete');
    expect(showInfoSpy).toHaveBeenCalledWith(
      expect.stringMatching(/Analyzed.*2/)
    );

    showInfoSpy.mockRestore();
    await deleteFile(note1.uri);
    await deleteFile(note2.uri);
  });

  it('should return cancelled status when operation is cancelled', async () => {
    const note1 = await createFile('# Note 1\nContent', ['note1.md']);
    await waitForNoteInFoamWorkspace(note1.uri);

    const tokenSource = new vscode.CancellationTokenSource();

    const withProgressSpy = jest
      .spyOn(vscode.window, 'withProgress')
      .mockImplementation(async (options, task) => {
        const progress = { report: () => {} };
        // Cancel immediately
        tokenSource.cancel();
        return await task(progress, tokenSource.token);
      });

    const showInfoSpy = jest
      .spyOn(vscode.window, 'showInformationMessage')
      .mockResolvedValue(undefined);

    const result = await vscode.commands.executeCommand<
      'complete' | 'cancelled' | 'error'
    >(BUILD_EMBEDDINGS_COMMAND.command);

    expect(result).toBe('cancelled');
    expect(showInfoSpy).toHaveBeenCalledWith(
      expect.stringContaining('cancelled')
    );

    withProgressSpy.mockRestore();
    showInfoSpy.mockRestore();
    await deleteFile(note1.uri);
  });
});
@@ -0,0 +1,91 @@
import * as vscode from 'vscode';
import { Foam } from '../../../core/model/foam';
import { CancellationError } from '../../../core/services/progress';
import { TaskDeduplicator } from '../../../core/utils/task-deduplicator';
import { FoamWorkspace } from '../../../core/model/workspace';
import { FoamEmbeddings } from '../../../ai/model/embeddings';

export const BUILD_EMBEDDINGS_COMMAND = {
  command: 'foam-vscode.build-embeddings',
  title: 'Foam: Analyze Notes with AI',
};

export default async function activate(
  context: vscode.ExtensionContext,
  foamPromise: Promise<Foam>
) {
  const foam = await foamPromise;
  // Deduplicate concurrent executions
  const deduplicator = new TaskDeduplicator<
    'complete' | 'cancelled' | 'error'
  >();

  context.subscriptions.push(
    vscode.commands.registerCommand(
      BUILD_EMBEDDINGS_COMMAND.command,
      async () => {
        return await deduplicator.run(
          () => buildEmbeddings(foam.workspace, foam.embeddings),
          () => {
            vscode.window.showInformationMessage(
              'Note analysis is already in progress - waiting for it to complete'
            );
          }
        );
      }
    )
  );
}

async function buildEmbeddings(
  workspace: FoamWorkspace,
  embeddings: FoamEmbeddings
): Promise<'complete' | 'cancelled' | 'error'> {
  const notesCount = workspace.list().filter(r => r.type === 'note').length;

  if (notesCount === 0) {
    vscode.window.showInformationMessage('No notes found in workspace');
    return 'complete';
  }

  // Show progress notification
  return await vscode.window.withProgress(
    {
      location: vscode.ProgressLocation.Window,
      title: 'Analyzing notes',
      cancellable: true,
    },
    async (progress, token) => {
      try {
        await embeddings.update(progressInfo => {
          const title = progressInfo.context?.title || 'Processing...';
          const increment = (1 / progressInfo.total) * 100;
          progress.report({
            message: `${progressInfo.current}/${progressInfo.total} - ${title}`,
            increment: increment,
          });
        }, token);

        vscode.window.showInformationMessage(
          `✓ Analyzed ${embeddings.size()} of ${notesCount} notes`
        );
        return 'complete';
      } catch (error) {
        if (error instanceof CancellationError) {
          vscode.window.showInformationMessage(
            'Analysis cancelled. Run the command again to continue where you left off.'
          );
          return 'cancelled';
        }

        const errorMessage =
          error instanceof Error ? error.message : 'Unknown error';

        vscode.window.showErrorMessage(
          `Failed to analyze notes: ${errorMessage}`
        );
        return 'error';
      }
    }
  );
}
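Other features can chain this command and branch on its return status, which is how show-similar-notes below uses it. A condensed, illustrative sketch (ensureEmbeddings is hypothetical; the status union is the command's actual return type):

// Hypothetical sketch: run the command, proceed only on success.
async function ensureEmbeddings(): Promise<boolean> {
  const status = await vscode.commands.executeCommand<
    'complete' | 'cancelled' | 'error'
  >(BUILD_EMBEDDINGS_COMMAND.command);
  // 'cancelled' and 'error' have already surfaced a message to the user
  return status === 'complete';
}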
@@ -0,0 +1,99 @@
import * as vscode from 'vscode';
import { Foam } from '../../../core/model/foam';
import { fromVsCodeUri, toVsCodeUri } from '../../../utils/vsc-utils';
import { URI } from '../../../core/model/uri';
import { BUILD_EMBEDDINGS_COMMAND } from './build-embeddings';

export const SHOW_SIMILAR_NOTES_COMMAND = {
  command: 'foam-vscode.show-similar-notes',
  title: 'Foam: Show Similar Notes',
};

export default async function activate(
  context: vscode.ExtensionContext,
  foamPromise: Promise<Foam>
) {
  const foam = await foamPromise;

  context.subscriptions.push(
    vscode.commands.registerCommand(
      SHOW_SIMILAR_NOTES_COMMAND.command,
      async () => {
        await showSimilarNotes(foam);
      }
    )
  );
}

async function showSimilarNotes(foam: Foam): Promise<void> {
  // Get the active editor
  const editor = vscode.window.activeTextEditor;
  if (!editor) {
    vscode.window.showInformationMessage('Please open a note first');
    return;
  }

  // Get the URI of the active document
  const uri = fromVsCodeUri(editor.document.uri);

  // Check if the resource exists in workspace
  const resource = foam.workspace.find(uri);
  if (!resource) {
    vscode.window.showInformationMessage('This file is not a note');
    return;
  }

  // Ensure embeddings are up-to-date (incremental update)
  const status: 'complete' | 'error' | 'cancelled' =
    await vscode.commands.executeCommand(BUILD_EMBEDDINGS_COMMAND.command);

  if (status !== 'complete') {
    return;
  }

  // Check if embedding exists for this resource
  const embedding = foam.embeddings.getEmbedding(uri);
  if (!embedding) {
    vscode.window.showInformationMessage(
      'This note hasn\'t been analyzed yet. Make sure the AI service is running and try the "Analyze Notes with AI" command.'
    );
    return;
  }

  // Get similar notes
  const similar = foam.embeddings.getSimilar(uri, 10);

  if (similar.length === 0) {
    vscode.window.showInformationMessage('No similar notes found');
    return;
  }

  // Create quick pick items
  const items: vscode.QuickPickItem[] = similar.map(item => {
    const resource = foam.workspace.find(item.uri);
    const title = resource?.title || item.uri.getBasename();
    const similarityPercent = (item.similarity * 100).toFixed(1);

    return {
      label: `$(file) ${title}`,
      description: `${similarityPercent}% similar`,
      detail: item.uri.toFsPath(),
      uri: item.uri,
    } as vscode.QuickPickItem & { uri: URI };
  });

  // Show quick pick
  const selected = await vscode.window.showQuickPick(items, {
    placeHolder: 'Select a similar note to open',
    matchOnDescription: true,
    matchOnDetail: true,
  });

  if (selected) {
    const selectedUri = (selected as any).uri as URI;
    const doc = await vscode.workspace.openTextDocument(
      toVsCodeUri(selectedUri)
    );
    await vscode.window.showTextDocument(doc);
  }
}
138 packages/foam-vscode/src/ai/vscode/panels/related-notes.ts Normal file
@@ -0,0 +1,138 @@
import * as vscode from 'vscode';
import { Foam } from '../../../core/model/foam';
import { FoamWorkspace } from '../../../core/model/workspace';
import { URI } from '../../../core/model/uri';
import { fromVsCodeUri } from '../../../utils/vsc-utils';
import { BaseTreeProvider } from '../../../features/panels/utils/base-tree-provider';
import { ResourceTreeItem } from '../../../features/panels/utils/tree-view-utils';
import { FoamEmbeddings } from '../../../ai/model/embeddings';

export default async function activate(
  context: vscode.ExtensionContext,
  foamPromise: Promise<Foam>
) {
  const foam = await foamPromise;

  const provider = new RelatedNotesTreeDataProvider(
    foam.workspace,
    foam.embeddings,
    context.globalState
  );

  const treeView = vscode.window.createTreeView('foam-vscode.related-notes', {
    treeDataProvider: provider,
    showCollapseAll: false,
  });

  const updateTreeView = async () => {
    const activeEditor = vscode.window.activeTextEditor;
    provider.target = activeEditor
      ? fromVsCodeUri(activeEditor.document.uri)
      : undefined;
    await provider.refresh();

    // Update context for conditional viewsWelcome messages
    vscode.commands.executeCommand(
      'setContext',
      'foam.relatedNotes.state',
      provider.getState()
    );
  };

  updateTreeView();

  context.subscriptions.push(
    provider,
    treeView,
    foam.embeddings.onDidUpdate(() => updateTreeView()),
    vscode.window.onDidChangeActiveTextEditor(() => updateTreeView()),
    provider.onDidChangeTreeData(() => {
      treeView.title = `Related Notes (${provider.nValues})`;
    })
  );
}

export class RelatedNotesTreeDataProvider extends BaseTreeProvider<vscode.TreeItem> {
  public target?: URI = undefined;
  public nValues = 0;
  private relatedNotes: Array<{ uri: URI; similarity: number }> = [];
  private currentNoteHasEmbedding = false;

  constructor(
    private workspace: FoamWorkspace,
    private embeddings: FoamEmbeddings,
    public state: vscode.Memento
  ) {
    super();
  }

  async refresh(): Promise<void> {
    const uri = this.target;

    // Clear if no target or target is not a note
    if (!uri) {
      this.relatedNotes = [];
      this.nValues = 0;
      this.currentNoteHasEmbedding = false;
      super.refresh();
      return;
    }

    const resource = this.workspace.find(uri);
    if (!resource || resource.type !== 'note') {
      this.relatedNotes = [];
      this.nValues = 0;
      this.currentNoteHasEmbedding = false;
      super.refresh();
      return;
    }

    // Check if current note has an embedding
    this.currentNoteHasEmbedding = this.embeddings.getEmbedding(uri) !== null;

    // Get similar notes (user can click "Build Embeddings" button if needed)
    const similar = this.embeddings.getSimilar(uri, 10);
    this.relatedNotes = similar.filter(n => n.similarity > 0.6);
    this.nValues = this.relatedNotes.length;
    super.refresh();
  }

  async getChildren(item?: vscode.TreeItem): Promise<vscode.TreeItem[]> {
    if (item) {
      return [];
    }

    // If no related notes found, show appropriate message in viewsWelcome
    // The empty array will trigger the viewsWelcome content
    if (this.relatedNotes.length === 0) {
      return [];
    }

    return this.relatedNotes
      .map(({ uri, similarity }) => {
        const resource = this.workspace.find(uri);
        if (!resource) {
          return null;
        }

        const item = new ResourceTreeItem(resource, this.workspace);
        // Show similarity score as percentage in description
        item.description = `${Math.round(similarity * 100)}%`;
        return item;
      })
      .filter(item => item !== null) as ResourceTreeItem[];
  }

  /**
   * Returns the current state of the related notes panel
   */
  public getState(): 'no-note' | 'no-embedding' | 'ready' {
    if (!this.target) {
      return 'no-note';
    }
    if (!this.currentNoteHasEmbedding) {
      return 'no-embedding';
    }
    return 'ready';
  }
}
@@ -1,297 +1,13 @@
import { generateLinkReferences } from '.';
import { createTestNote, createTestWorkspace } from '../../test/test-utils';
import { Range } from '../model/range';
import { Logger } from '../utils/log';
import { URI } from '../model/uri';
import { EOL } from 'os';
import { createMarkdownParser } from '../services/markdown-parser';
import { TextEdit } from '../services/text-edit';

Logger.setLevel('error');

describe('generateLinkReferences', () => {
  it('should add link references to a file that does not have them', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]);
    workspace.set(
      createTestNote({ uri: '/first-document.md', title: 'First Document' })
    );
    workspace.set(
      createTestNote({ uri: '/second-document.md', title: 'Second Document' })
    );
    workspace.set(
      createTestNote({
        uri: '/file-without-title.md',
        title: 'file-without-title',
      })
    );
    const noteText = `# Index

This file is intentionally missing the link reference definitions

[[first-document]]

[[second-document]]

[[file-without-title]]
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = {
      newText: textForNote(
        `
[first-document]: first-document "First Document"
[second-document]: second-document "Second Document"
[file-without-title]: file-without-title "file-without-title"`
      ),
      range: Range.create(9, 0, 9, 0),
    };

    const noteEol = EOL;
    const actual = await generateLinkReferences(
      note,
      noteText,
      noteEol,
      workspace,
      false
    );

    expect(actual![0].range.start).toEqual(expected.range.start);
    expect(actual![0].range.end).toEqual(expected.range.end);
    expect(actual![0].newText).toEqual(expected.newText);
  });

  it('should remove link definitions from a file that has them, if no links are present', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]);
    workspace.set(
      createTestNote({ uri: '/first-document.md', title: 'First Document' })
    );
    const noteText = `# Second Document

This is just a link target for now.

We can use it for other things later if needed.

[first-document]: first-document 'First Document'
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = {
      newText: '',
      range: Range.create(6, 0, 6, 49),
    };

    const actual = await generateLinkReferences(
      note,
      noteText,
      EOL,
      workspace,
      false
    );

    expect(actual.length).toBe(1);
    expect(actual[0]!.range.start).toEqual(expected.range.start);
    expect(actual[0]!.range.end).toEqual(expected.range.end);
    expect(actual[0]!.newText).toEqual(expected.newText);
  });

  it('should update link definitions if they are present but changed', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]);
    workspace.set(
      createTestNote({ uri: '/second-document.md', title: 'Second Document' })
    );
    workspace.set(
      createTestNote({
        uri: '/file-without-title.md',
        title: 'file-without-title',
      })
    );
    const noteText = `# First Document

Here's some [unrelated] content.

[unrelated]: http://unrelated.com 'This link should not be changed'

[[file-without-title]]

[second-document]: second-document 'Second Document'
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = [
      {
        newText: '',
        range: Range.create(8, 0, 8, 52),
      },
      {
        newText: textForNote(
          `\n[file-without-title]: file-without-title "file-without-title"`
        ),
        range: Range.create(9, 0, 9, 0),
      },
    ];

    const actual = await generateLinkReferences(
      note,
      noteText,
      EOL,
      workspace,
      false
    );

    expect(actual.length).toBe(2);
    expect(actual[0]!.range.start).toEqual(expected[0].range.start);
    expect(actual[0]!.range.end).toEqual(expected[0].range.end);
    expect(actual[0]!.newText).toEqual(expected[0].newText);
    expect(actual[1]!.range.start).toEqual(expected[1].range.start);
    expect(actual[1]!.range.end).toEqual(expected[1].range.end);
    expect(actual[1]!.newText).toEqual(expected[1].newText);
  });

  it('should not cause any changes if link reference definitions were up to date', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')])
      .set(
        createTestNote({ uri: '/first-document.md', title: 'First Document' })
      )
      .set(
        createTestNote({ uri: '/second-document.md', title: 'Second Document' })
      );
    const noteText = `# Third Document

All the link references are correct in this file.

[[first-document]]

[[second-document]]

[first-document]: first-document "First Document"
[second-document]: second-document "Second Document"
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = [];

    const noteEol = EOL;
    const actual = await generateLinkReferences(
      note,
      noteText,
      noteEol,
      workspace,
      false
    );

    expect(actual).toEqual(expected);
  });

  it('should put links with spaces in angel brackets', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]).set(
      createTestNote({
        uri: '/Note being referred as angel.md',
        title: 'Note being referred as angel',
      })
    );
    const noteText = `# Angel reference

[[Note being referred as angel]]
`;
    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = {
      newText: textForNote(
        `
[Note being referred as angel]: <Note being referred as angel> "Note being referred as angel"`
      ),
      range: Range.create(3, 0, 3, 0),
    };

    const actual = await generateLinkReferences(
      note,
      noteText,
      EOL,
      workspace,
      false
    );

    expect(actual.length).toBe(1);
    expect(actual[0]!.range.start).toEqual(expected.range.start);
    expect(actual[0]!.range.end).toEqual(expected.range.end);
    expect(actual[0]!.newText).toEqual(expected.newText);
  });

  it('should not remove explicitly entered link references', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]);
    workspace.set(
      createTestNote({ uri: '/second-document.md', title: 'Second Document' })
    );
    workspace.set(
      createTestNote({
        uri: '/file-without-title.md',
        title: 'file-without-title',
      })
    );
    const noteText = `# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference]

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const expected = [];

    const actual = await generateLinkReferences(
      note,
      noteText,
      EOL,
      workspace,
      false
    );

    expect(actual).toEqual(expected);
  });

  it('should not remove explicitly entered link references and have an implicit link', async () => {
    const parser = createMarkdownParser();
    const workspace = createTestWorkspace([URI.file('/')]);
    workspace.set(
      createTestNote({ uri: '/second-document.md', title: 'Second Document' })
    );
    const noteText = `# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference].
I also want a [[first-document]].

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/
[first-document]: first-document 'First Document'
`;

    const note = parser.parse(URI.file('/note.md'), textForNote(noteText));

    const actual = await generateLinkReferences(
      note,
      noteText,
      EOL,
      workspace,
      false
    );

    expect(actual.length).toBe(0);
  });
});

/**
 * Will adjust a text line separator to match
 * what is used by the note
@@ -304,3 +20,347 @@ function textForNote(text: string): string {
  const eol = EOL;
  return text.split('\n').join(eol);
}

describe('generateLinkReferences', () => {
  const parser = createMarkdownParser();

  interface TestCase {
    case: string;
    input: string;
    expected: string;
  }

  const testCases: TestCase[] = [
    {
      case: 'should add link references for wikilinks present in note',
      input: `
# Index
[[doc1]] [[doc2]] [[file-without-title]]
`,
      expected: `
# Index
[[doc1]] [[doc2]] [[file-without-title]]

[doc1]: doc1 "First"
[doc2]: doc2 "Second"
[file-without-title]: file-without-title "file-without-title"
`,
    },
    {
      case: '#1558 - should keep a blank line before link references',
      input: `
# Test

[[doc1]]

[[doc2]]



`,
      expected: `
# Test

[[doc1]]

[[doc2]]

[doc1]: doc1 "First"
[doc2]: doc2 "Second"
`,
    },
    {
      case: 'should remove obsolete link definitions',
      input: `
# Document
Some content here.
[doc1]: doc1 "First"
`,
      expected: `
# Document
Some content here.

`,
    },
    {
      case: 'should add and remove link definitions as needed',
      input: `
# First Document

Here's some [unrelated] content.

[unrelated]: http://unrelated.com 'This link should not be changed'

[[file-without-title]]

[doc2]: doc2 'Second Document'
`,
      expected: `
# First Document

Here's some [unrelated] content.

[unrelated]: http://unrelated.com 'This link should not be changed'

[[file-without-title]]



[file-without-title]: file-without-title "file-without-title"
`,
    },
    {
      case: 'should not change correct link references',
      input: `
# Third Document
All the link references are correct in this file.

[[doc1]]
[[doc2]]


[doc1]: doc1 "First"
[doc2]: doc2 "Second"
`,
      expected: `
# Third Document
All the link references are correct in this file.

[[doc1]]
[[doc2]]


[doc1]: doc1 "First"
[doc2]: doc2 "Second"
`,
    },
    {
      case: 'should put links with spaces in angel brackets',
      input: `
# Angel reference

[[Angel note]]
`,
      expected: `
# Angel reference

[[Angel note]]

[Angel note]: <Angel note> "Angel note"
`,
    },
    {
      case: 'should not remove explicitly entered link references',
      input: `
# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference]

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/
`,
      expected: `
# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference]

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/
`,
    },
    {
      case: 'should not change explicitly entered link references',
      input: `
# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference].
I also want a [[doc1]].

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/
`,
      expected: `
# File with explicit link references

A Bug [^footerlink]. Here is [Another link][linkreference].
I also want a [[doc1]].

[^footerlink]: https://foambubble.github.io/

[linkreference]: https://foambubble.github.io/

[doc1]: doc1 "First"
`,
    },
    {
      case: 'should handle empty file with no wikilinks and no definitions',
      input: `
# Empty Document

Just some text without any links.
`,
      expected: `
# Empty Document

Just some text without any links.
`,
    },
    {
      case: 'should handle wikilinks with aliases',
      input: `
# Document with aliases

[[doc1|Custom Alias]] and [[doc2|Another Alias]]
`,
      expected: `
# Document with aliases

[[doc1|Custom Alias]] and [[doc2|Another Alias]]

[doc1|Custom Alias]: doc1 "First"
[doc2|Another Alias]: doc2 "Second"
`,
    },
    {
      case: 'should generate only one definition for multiple references to the same link',
      input: `
# Multiple references

First mention: [[doc1]]
Second mention: [[doc1]]
Third mention: [[doc1]]
`,
      expected: `
# Multiple references

First mention: [[doc1]]
Second mention: [[doc1]]
Third mention: [[doc1]]

[doc1]: doc1 "First"
`,
    },
    {
      case: 'should handle link definitions in the middle of content',
      input: `
# Document

[[doc1]]

[doc1]: doc1 "First"

Some more content here.

[[doc2]]
`,
      expected: `
# Document

[[doc1]]

[doc1]: doc1 "First"

Some more content here.

[[doc2]]

[doc2]: doc2 "Second"
`,
    },
    {
      case: 'should handle orphaned wikilinks without corresponding notes',
      input: `
# Document with broken links

[[doc1]] [[nonexistent]] [[another-missing]]
`,
      expected: `
# Document with broken links

[[doc1]] [[nonexistent]] [[another-missing]]

[doc1]: doc1 "First"
`,
    },
    {
      case: 'should handle file with only blank lines at end',
      input: `

`,
      expected: `

`,
    },
    {
      case: 'should handle empty files',
      input: '',
      expected: '',
    },
    {
      case: 'should handle link definitions with different quote styles',
      input: `
# Mixed quotes

[[doc1]] [[doc2]]

[doc1]: doc1 'First'
[doc2]: doc2 "Second"
`,
      expected: `
# Mixed quotes

[[doc1]] [[doc2]]

[doc1]: doc1 'First'
[doc2]: doc2 "Second"
`,
    },
    // TODO
    // {
    //   case: 'should append new link references to existing ones without blank lines',
    //   input: `
    // [[doc1]] [[doc2]]

    // [doc1]: doc1 "First"
    // `,
    //   expected: `
    // [[doc1]] [[doc2]]

    // [doc1]: doc1 "First"
    // [doc2]: doc2 "Second"
    // `,
    // },
  ];

  testCases.forEach(testCase => {
    // eslint-disable-next-line jest/valid-title
    it(testCase.case, async () => {
      const workspace = createTestWorkspace([URI.file('/')]);
      const workspaceNotes = [
        { uri: '/doc1.md', title: 'First' },
        { uri: '/doc2.md', title: 'Second' },
        { uri: '/file-without-title.md', title: 'file-without-title' },
        { uri: '/Angel note.md', title: 'Angel note' },
      ];
      workspaceNotes.forEach(note => {
        workspace.set(createTestNote({ uri: note.uri, title: note.title }));
      });

      const noteText = testCase.input;
      const note = parser.parse(URI.file('/note.md'), textForNote(noteText));
      const actual = await generateLinkReferences(
        note,
        noteText,
        EOL,
        workspace,
        false
      );
      const updated = TextEdit.apply(noteText, actual);

      expect(updated).toBe(textForNote(testCase.expected));
    });
  });
});
@@ -1,4 +1,4 @@
import { NoteLinkDefinition, Resource, ResourceLink } from '../model/note';
import { NoteLinkDefinition, Resource } from '../model/note';
import { Range } from '../model/range';
import { createMarkdownReferences } from '../services/markdown-provider';
import { FoamWorkspace } from '../model/workspace';
@@ -19,7 +19,8 @@ export const generateLinkReferences = async (
    return [];
  }

  const nLines = currentNoteText.split(eol).length;
  const lines = currentNoteText.split(eol);
  const nLines = lines.length;

  const updatedWikilinkDefinitions = createMarkdownReferences(
    workspace,
@@ -51,20 +52,24 @@ export const generateLinkReferences = async (

  // Add new definitions
  if (toAddWikilinkDefinitions.length > 0) {
    const lastLine = currentNoteText.split(eol)[nLines - 1];
    const isLastLineEmpty = lastLine.trim().length === 0;

    let text = isLastLineEmpty ? '' : eol;
    for (const def of toAddWikilinkDefinitions) {
      // Choose the correct position for insertion, e.g., end of file or after last reference
      text = `${text}${eol}${NoteLinkDefinition.format(def)}`;
    // find the last non-empty line to append the definitions after it
    const lastLineIndex = nLines - 1;
    let insertLineIndex = lastLineIndex;
    while (insertLineIndex > 0 && lines[insertLineIndex].trim() === '') {
      insertLineIndex--;
    }

    const definitions = toAddWikilinkDefinitions.map(def =>
      NoteLinkDefinition.format(def)
    );
    const text = eol + eol + definitions.join(eol) + eol;

    edits.push({
      range: Range.create(
        nLines - 1,
        lastLine.length,
        nLines - 1,
        lastLine.length
        insertLineIndex,
        lines[insertLineIndex].length,
        lastLineIndex,
        lines[lastLineIndex].length
      ),
      newText: text,
    });

@@ -5,6 +5,10 @@ import { FoamGraph } from './graph';
import { ResourceParser } from './note';
import { ResourceProvider } from './provider';
import { FoamTags } from './tags';
import { FoamEmbeddings } from '../../ai/model/embeddings';
import { InMemoryEmbeddingCache } from '../../ai/model/in-memory-embedding-cache';
import { EmbeddingProvider } from '../../ai/services/embedding-provider';
import { NoOpEmbeddingProvider } from '../../ai/services/noop-embedding-provider';
import { Logger, withTiming, withTimingAsync } from '../utils/log';

export interface Services {
@@ -18,6 +22,7 @@ export interface Foam extends IDisposable {
  workspace: FoamWorkspace;
  graph: FoamGraph;
  tags: FoamTags;
  embeddings: FoamEmbeddings;
}

export const bootstrap = async (
@@ -26,7 +31,8 @@ export const bootstrap = async (
  dataStore: IDataStore,
  parser: ResourceParser,
  initialProviders: ResourceProvider[],
  defaultExtension: string = '.md'
  defaultExtension: string = '.md',
  embeddingProvider?: EmbeddingProvider
) => {
  const workspace = await withTimingAsync(
    () =>
@@ -48,6 +54,22 @@ export const bootstrap = async (
    ms => Logger.info(`Tags loaded in ${ms}ms`)
  );

  embeddingProvider = embeddingProvider ?? new NoOpEmbeddingProvider();
  const embeddings = FoamEmbeddings.fromWorkspace(
    workspace,
    embeddingProvider,
    true,
    new InMemoryEmbeddingCache()
  );

  if (await embeddingProvider.isAvailable()) {
    Logger.info('Embeddings service initialized');
  } else {
    Logger.warn(
      'Embedding provider not available. Semantic features will be disabled.'
    );
  }

  watcher?.onDidChange(async uri => {
    if (matcher.isMatch(uri)) {
      await workspace.fetchAndSet(uri);
@@ -67,6 +89,7 @@ export const bootstrap = async (
    workspace,
    graph,
    tags,
    embeddings,
    services: {
      parser,
      dataStore,
@@ -75,6 +98,7 @@ export const bootstrap = async (
    dispose: () => {
      workspace.dispose();
      graph.dispose();
      embeddings.dispose();
    },
  };

@@ -146,7 +146,6 @@ describe('Graph', () => {
    });
    const noteB = createTestNote({
      uri: '/somewhere/page-b.md',
      text: '## Section 1\n\n## Section 2',
    });
    const ws = createTestWorkspace().set(noteA).set(noteB);
    const graph = FoamGraph.fromWorkspace(ws);

@@ -61,6 +61,14 @@ export abstract class NoteLinkDefinition {

    return text;
  }

  static isEqual(def1: NoteLinkDefinition, def2: NoteLinkDefinition): boolean {
    return (
      def1.label === def2.label &&
      def1.url === def2.url &&
      def1.title === def2.title
    );
  }
}

export interface Tag {

@@ -94,8 +94,15 @@ export class FileListBasedMatcher implements IMatcher {
  include: string[];
  exclude: string[];

  constructor(files: URI[], private readonly listFiles: () => Promise<URI[]>) {
  constructor(
    files: URI[],
    private readonly listFiles: () => Promise<URI[]>,
    include: string[] = ['**/*'],
    exclude: string[] = []
  ) {
    this.files = files.map(f => f.path);
    this.include = include;
    this.exclude = exclude;
  }

  match(files: URI[]): URI[] {
@@ -110,9 +117,13 @@ export class FileListBasedMatcher implements IMatcher {
    this.files = (await this.listFiles()).map(f => f.path);
  }

  static async createFromListFn(listFiles: () => Promise<URI[]>) {
  static async createFromListFn(
    listFiles: () => Promise<URI[]>,
    include: string[] = ['**/*'],
    exclude: string[] = []
  ) {
    const files = await listFiles();
    return new FileListBasedMatcher(files, listFiles);
    return new FileListBasedMatcher(files, listFiles, include, exclude);
  }
}

@@ -144,6 +144,19 @@ this is some text with our [[second-wikilink]].
    ]);
  });

  it('#1545 - should not detect single brackets as links', () => {
    const note = createNoteFromMarkdown(`
"She said [winning the award] was her best year."

We use brackets ([ and ]) to surround links.

This is not an easy task.[^1]

[^1]: It would be easier if more papers were well written.
`);
    expect(note.links.length).toEqual(0);
  });

  it('should detect reference-style links', () => {
    const note = createNoteFromMarkdown(`
# Test Document
@@ -181,12 +194,9 @@ This is a [reference-style link][missing-ref].
[existing-ref]: target.md "Target"
`);

    expect(note.links.length).toEqual(1);
    const link = note.links[0];
    expect(link.type).toEqual('link');
    expect(link.rawText).toEqual('[reference-style link][missing-ref]');
    expect(ResourceLink.isUnresolvedReference(link)).toBe(true);
    expect(link.definition).toEqual('missing-ref');
    // Per CommonMark spec, reference links without matching definitions
    // should be treated as plain text, not as links
    expect(note.links.length).toEqual(0);
  });

  it('should handle mixed link types', () => {

@@ -179,6 +179,14 @@ export function createMarkdownParser(
        }
      });

      // For type: 'link', keep only if:
      // - It's a direct link [text](url) - no definition field
      // - It's a resolved reference - definition is an object
      note.links = note.links.filter(
        link =>
          link.type === 'wikilink' || !ResourceLink.isUnresolvedReference(link)
      );

      Logger.debug('Result:', note);
      return note;
    },

@@ -303,8 +303,9 @@ describe('Link resolution', () => {

    expect(ws.resolveLink(noteB, noteB.links[0])).toEqual(noteA.uri);
    expect(ws.resolveLink(noteC, noteC.links[0])).toEqual(noteA.uri);
    expect(noteD.links.length).toEqual(1);
    expect(noteD.links[0].definition).toEqual('note'); // Unresolved reference
    // noteD has malformed URL with unencoded space, which gets treated as
    // shortcut reference [note] without definition, now correctly filtered out
    expect(noteD.links.length).toEqual(0);
  });

  describe('Workspace-relative paths (root-path relative)', () => {

34 packages/foam-vscode/src/core/services/progress.ts Normal file
@@ -0,0 +1,34 @@
/**
 * Generic progress information for long-running operations
 */
export interface Progress<T = unknown> {
  /** Current item being processed (1-indexed) */
  current: number;
  /** Total number of items to process */
  total: number;
  /** Optional context data about the current item */
  context?: T;
}

/**
 * Callback for reporting progress during operations
 */
export type ProgressCallback<T = unknown> = (progress: Progress<T>) => void;

/**
 * Cancellation token for aborting long-running operations
 */
export interface CancellationToken {
  /** Whether cancellation has been requested */
  readonly isCancellationRequested: boolean;
}

/**
 * Exception thrown when an operation is cancelled
 */
export class CancellationError extends Error {
  constructor(message: string = 'Operation cancelled') {
    super(message);
    this.name = 'CancellationError';
  }
}
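A minimal sketch of a long-running operation wired to these primitives, mirroring how the build-embeddings command drives its update loop. It is illustrative only; processAll, items, and process are placeholders, not code from this changeset.

// Hypothetical sketch: check the token and report progress per item.
async function processAll<T>(
  items: T[],
  process: (item: T) => Promise<void>,
  onProgress?: ProgressCallback<{ title: string }>,
  token?: CancellationToken
): Promise<void> {
  for (let i = 0; i < items.length; i++) {
    if (token?.isCancellationRequested) {
      // Callers catch this to report a 'cancelled' status
      throw new CancellationError();
    }
    await process(items[i]);
    onProgress?.({
      current: i + 1,
      total: items.length,
      context: { title: String(items[i]) },
    });
  }
}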
@@ -78,7 +78,7 @@ export async function firstFrom<T>(
 * @param functions - The array of functions to execute.
 * @returns A generator yielding the results of the functions.
 */
function* lazyExecutor<T>(functions: Array<() => T>): Generator<T> {
export function* lazyExecutor<T>(functions: Array<() => T>): Generator<T> {
  for (const fn of functions) {
    yield fn();
  }

@@ -1,8 +1,8 @@
import { isSome } from './core';
export const HASHTAG_REGEX =
  /(?<=^|\s)#([0-9]*[\p{L}\p{Emoji_Presentation}/_-][\p{L}\p{Emoji_Presentation}\p{N}/_-]*)/gmu;
  /(?<=^|\s)#([0-9]*[\p{L}\p{Extended_Pictographic}/_-](?:[\p{L}\p{Extended_Pictographic}\p{N}/_-]|\uFE0F|\p{Emoji_Modifier})*)/gmu;
export const WORD_REGEX =
  /(?<=^|\s)([0-9]*[\p{L}\p{Emoji_Presentation}/_-][\p{L}\p{Emoji_Presentation}\p{N}/_-]*)/gmu;
  /(?<=^|\s)([0-9]*[\p{L}\p{Extended_Pictographic}/_-](?:[\p{L}\p{Extended_Pictographic}\p{N}/_-]|\uFE0F|\p{Emoji_Modifier})*)/gmu;

export const extractHashtags = (
  text: string
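The change swaps \p{Emoji_Presentation} for the broader \p{Extended_Pictographic} and explicitly admits U+FE0F (the variation selector) and skin-tone modifiers in the tail of a tag, so tags like #🗃️/37-Education no longer get cut off mid-emoji. A quick illustrative check of the new pattern (not part of the diff):

// Hypothetical sketch: '🗃️' is U+1F5C3 followed by the variation selector U+FE0F.
const match = HASHTAG_REGEX.exec('#🗃️/37-Education');
console.log(match?.[1]); // '🗃️/37-Education'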
306 packages/foam-vscode/src/core/utils/task-deduplicator.test.ts Normal file
@@ -0,0 +1,306 @@
import { TaskDeduplicator } from './task-deduplicator';

describe('TaskDeduplicator', () => {
  describe('run', () => {
    it('should execute a task and return its result', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task = jest.fn(async () => 'result');

      const result = await deduplicator.run(task);

      expect(result).toBe('result');
      expect(task).toHaveBeenCalledTimes(1);
    });

    it('should deduplicate concurrent calls to the same task', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      let executeCount = 0;

      const task = async () => {
        executeCount++;
        await new Promise(resolve => setTimeout(resolve, 10));
        return 'result';
      };

      // Start multiple concurrent calls
      const [result1, result2, result3] = await Promise.all([
        deduplicator.run(task),
        deduplicator.run(task),
        deduplicator.run(task),
      ]);

      // All should get the same result
      expect(result1).toBe('result');
      expect(result2).toBe('result');
      expect(result3).toBe('result');

      // Task should only execute once
      expect(executeCount).toBe(1);
    });

    it('should call onDuplicate callback for concurrent calls', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const onDuplicate = jest.fn();

      const task = async () => {
        await new Promise(resolve => setTimeout(resolve, 10));
        return 'result';
      };

      // Start concurrent calls
      const promise1 = deduplicator.run(task);
      const promise2 = deduplicator.run(task, onDuplicate);
      const promise3 = deduplicator.run(task, onDuplicate);

      await Promise.all([promise1, promise2, promise3]);

      // onDuplicate should be called for the 2nd and 3rd calls
      expect(onDuplicate).toHaveBeenCalledTimes(2);
    });

    it('should not call onDuplicate for the first call', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const onDuplicate = jest.fn();
      const task = jest.fn(async () => 'result');

      await deduplicator.run(task, onDuplicate);

      expect(onDuplicate).not.toHaveBeenCalled();
    });

    it('should allow new tasks after previous task completes', async () => {
      const deduplicator = new TaskDeduplicator<number>();
      let counter = 0;

      const task1 = async () => ++counter;
      const task2 = async () => ++counter;

      const result1 = await deduplicator.run(task1);
      const result2 = await deduplicator.run(task2);

      expect(result1).toBe(1);
      expect(result2).toBe(2);
    });

    it('should propagate errors from the task', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const error = new Error('Task failed');
      const task = jest.fn(async () => {
        throw error;
      });

      await expect(deduplicator.run(task)).rejects.toThrow('Task failed');
    });

    it('should propagate errors to all concurrent callers', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const error = new Error('Task failed');

      const task = async () => {
        await new Promise(resolve => setTimeout(resolve, 10));
        throw error;
      };

      const promise1 = deduplicator.run(task);
      const promise2 = deduplicator.run(task);
      const promise3 = deduplicator.run(task);

      await expect(promise1).rejects.toThrow('Task failed');
      await expect(promise2).rejects.toThrow('Task failed');
      await expect(promise3).rejects.toThrow('Task failed');
    });

    it('should clear running task after error', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task1 = jest.fn(async () => {
        throw new Error('Task failed');
      });
      const task2 = jest.fn(async () => 'success');

      // First task fails
      await expect(deduplicator.run(task1)).rejects.toThrow('Task failed');

      // Second task should execute (not deduplicated)
      const result = await deduplicator.run(task2);

      expect(result).toBe('success');
      expect(task1).toHaveBeenCalledTimes(1);
      expect(task2).toHaveBeenCalledTimes(1);
    });

    it('should handle different return types', async () => {
      // String
      const stringDeduplicator = new TaskDeduplicator<string>();
      const stringResult = await stringDeduplicator.run(async () => 'test');
      expect(stringResult).toBe('test');

      // Number
      const numberDeduplicator = new TaskDeduplicator<number>();
      const numberResult = await numberDeduplicator.run(async () => 42);
      expect(numberResult).toBe(42);

      // Object
      const objectDeduplicator = new TaskDeduplicator<{ value: string }>();
      const objectResult = await objectDeduplicator.run(async () => ({
        value: 'test',
      }));
      expect(objectResult).toEqual({ value: 'test' });

      // Union types
      type Status = 'complete' | 'cancelled' | 'error';
      const statusDeduplicator = new TaskDeduplicator<Status>();
      const statusResult = await statusDeduplicator.run(
        async () => 'complete' as Status
      );
      expect(statusResult).toBe('complete');
    });
  });

  describe('isRunning', () => {
    it('should return false when no task is running', () => {
      const deduplicator = new TaskDeduplicator<string>();

      expect(deduplicator.isRunning()).toBe(false);
    });

    it('should return true when a task is running', async () => {
      const deduplicator = new TaskDeduplicator<string>();

      const task = async () => {
        await new Promise(resolve => setTimeout(resolve, 10));
        return 'result';
      };

      const promise = deduplicator.run(task);

      expect(deduplicator.isRunning()).toBe(true);

      await promise;
    });

    it('should return false after task completes', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task = jest.fn(async () => 'result');

      await deduplicator.run(task);

      expect(deduplicator.isRunning()).toBe(false);
    });

    it('should return false after task fails', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task = jest.fn(async () => {
        throw new Error('Failed');
      });

      await expect(deduplicator.run(task)).rejects.toThrow('Failed');

      expect(deduplicator.isRunning()).toBe(false);
    });
  });

  describe('clear', () => {
    it('should clear the running task reference', async () => {
      const deduplicator = new TaskDeduplicator<string>();

      const task = async () => {
        await new Promise(resolve => setTimeout(resolve, 10));
        return 'result';
      };

      const promise = deduplicator.run(task);

      expect(deduplicator.isRunning()).toBe(true);

      deduplicator.clear();

      expect(deduplicator.isRunning()).toBe(false);

      // Original promise should still complete
      await expect(promise).resolves.toBe('result');
    });

    it('should allow new task after manual clear', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      let executeCount = 0;

      const task = async () => {
        executeCount++;
        await new Promise(resolve => setTimeout(resolve, 50));
        return 'result';
      };

      // Start first task
      const promise1 = deduplicator.run(task);

      // Clear while still running
      deduplicator.clear();

      // Start second task (should not be deduplicated)
      const promise2 = deduplicator.run(task);

      await Promise.all([promise1, promise2]);

      // Both tasks should have executed
      expect(executeCount).toBe(2);
    });

    it('should be safe to call when no task is running', () => {
      const deduplicator = new TaskDeduplicator<string>();

      expect(() => deduplicator.clear()).not.toThrow();
      expect(deduplicator.isRunning()).toBe(false);
    });
  });

  describe('edge cases', () => {
    it('should handle tasks that resolve immediately', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task = jest.fn(async () => 'immediate');

      const result = await deduplicator.run(task);

      expect(result).toBe('immediate');
      expect(deduplicator.isRunning()).toBe(false);
    });

    it('should handle tasks that throw synchronously', async () => {
      const deduplicator = new TaskDeduplicator<string>();
      const task = jest.fn(() => {
        throw new Error('Sync error');
      });

      await expect(deduplicator.run(task as any)).rejects.toThrow('Sync error');
      expect(deduplicator.isRunning()).toBe(false);
    });

    it('should handle null/undefined results', async () => {
      const nullDeduplicator = new TaskDeduplicator<null>();
      const nullResult = await nullDeduplicator.run(async () => null);
      expect(nullResult).toBeNull();

      const undefinedDeduplicator = new TaskDeduplicator<undefined>();
      const undefinedResult = await undefinedDeduplicator.run(
        async () => undefined
      );
      expect(undefinedResult).toBeUndefined();
    });

    it('should handle sequential calls with delays between them', async () => {
      const deduplicator = new TaskDeduplicator<number>();
      let counter = 0;

      const task = async () => {
        await new Promise(resolve => setTimeout(resolve, 10));
        return ++counter;
      };

      const result1 = await deduplicator.run(task);
      await new Promise(resolve => setTimeout(resolve, 20));
      const result2 = await deduplicator.run(task);

      expect(result1).toBe(1);
      expect(result2).toBe(2);
    });
  });
});
67 packages/foam-vscode/src/core/utils/task-deduplicator.ts Normal file
@@ -0,0 +1,67 @@
/**
 * A utility class for deduplicating concurrent async operations.
 * When multiple calls are made while a task is running, subsequent calls
 * will wait for and receive the result of the already-running task instead
 * of starting a new one.
 *
 * @example
 * const deduplicator = new TaskDeduplicator<string>();
 *
 * async function expensiveOperation(input: string): Promise<string> {
 *   return deduplicator.run(async () => {
 *     // Expensive work here
 *     return result;
 *   });
 * }
 *
 * // Multiple concurrent calls will share the same execution
 * const [result1, result2] = await Promise.all([
 *   expensiveOperation("test"),
 *   expensiveOperation("test"),
 * ]);
 * // Only runs once, both get the same result
 */
export class TaskDeduplicator<T> {
  private runningTask: Promise<T> | null = null;

  /**
   * Run a task with deduplication.
   * If a task is already running, waits for it to complete and returns its result.
   * Otherwise, starts the task and stores its promise for other callers to await.
   *
   * @param task The async function to execute
   * @param onDuplicate Optional callback when a duplicate call is detected
   * @returns The result of the task
   */
  async run(task: () => Promise<T>, onDuplicate?: () => void): Promise<T> {
    // If already running, wait for the existing task
    if (this.runningTask) {
      onDuplicate?.();
      return await this.runningTask;
    }

    // Start the task and store the promise
    this.runningTask = task();

    try {
      return await this.runningTask;
    } finally {
      // Clear the task when done
      this.runningTask = null;
    }
  }

  /**
   * Check if a task is currently running
   */
  isRunning(): boolean {
    return this.runningTask !== null;
  }

  /**
   * Clear the running task reference (useful for testing or error recovery)
   */
  clear(): void {
    this.runningTask = null;
  }
}
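Beyond the @example in the class comment, the second argument to run lets duplicate callers react without starting new work; the build-embeddings command above uses exactly this hook to tell the user an analysis is already underway. A condensed, illustrative sketch (doExpensiveAnalysis is a placeholder):

// Hypothetical sketch: notify duplicate callers instead of re-running.
const dedup = new TaskDeduplicator<'complete' | 'cancelled' | 'error'>();

async function runAnalysis() {
  return dedup.run(
    () => doExpensiveAnalysis(), // placeholder for the real work
    () => console.log('Analysis already in progress - waiting for it')
  );
}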
@@ -80,6 +80,25 @@ describe('hashtag extraction', () => {
    ]);
  });

  it('supports emoji tags with variant selectors (issue #1536)', () => {
    expect(
      extractHashtags('#🗃️/37-Education #🔖/37/Learning #🟣HOUSE #🟠MONEY').map(
        t => t.label
      )
    ).toEqual(['🗃️/37-Education', '🔖/37/Learning', '🟣HOUSE', '🟠MONEY']);
  });

  it('supports individual emojis with variant selectors', () => {
    // Test each emoji separately to debug
    expect(extractHashtags('#🗃️').map(t => t.label)).toEqual(['🗃️']);
    expect(extractHashtags('#🔖').map(t => t.label)).toEqual(['🔖']);
  });

  it('supports emojis that work without variant selector', () => {
    // These emojis should work with current implementation
    expect(extractHashtags('#📥 #⭐').map(t => t.label)).toEqual(['📥', '⭐']);
  });

  it('ignores hashes in plain text urls and links', () => {
    expect(
      extractHashtags(`
@@ -10,7 +10,8 @@ import { features } from './features';
import { VsCodeOutputLogger, exposeLogger } from './services/logging';
import {
  getAttachmentsExtensions,
  getIgnoredFilesSetting,
  getExcludedFilesSetting,
  getIncludeFilesSetting,
  getNotesExtensions,
} from './settings';
import { AttachmentResourceProvider } from './core/services/attachment-provider';
@@ -18,6 +19,7 @@ import { VsCodeWatcher } from './services/watcher';
import { createMarkdownParser } from './core/services/markdown-parser';
import VsCodeBasedParserCache from './services/cache';
import { createMatcherAndDataStore } from './services/editor';
import { OllamaEmbeddingProvider } from './ai/providers/ollama/ollama-provider';

export async function activate(context: ExtensionContext) {
  const logger = new VsCodeOutputLogger();
@@ -33,14 +35,15 @@ export async function activate(context: ExtensionContext) {
  }

  // Prepare Foam
  const excludes = getIgnoredFilesSetting().map(g => g.toString());
  const { matcher, dataStore, excludePatterns } =
    await createMatcherAndDataStore(excludes);
  const includes = getIncludeFilesSetting().map(g => g.toString());
  const excludes = getExcludedFilesSetting().map(g => g.toString());
  const { matcher, dataStore, includePatterns, excludePatterns } =
    await createMatcherAndDataStore(includes, excludes);

  Logger.info('Loading from directories:');
  for (const folder of workspace.workspaceFolders) {
    Logger.info('- ' + folder.uri.fsPath);
    Logger.info('  Include: **/*');
    Logger.info('  Include: ' + includePatterns.get(folder.name).join(','));
    Logger.info('  Exclude: ' + excludePatterns.get(folder.name).join(','));
  }

@@ -69,13 +72,20 @@ export async function activate(context: ExtensionContext) {
    attachmentExtConfig
  );

  // Initialize embedding provider
  const aiEnabled = workspace.getConfiguration('foam.experimental').get('ai');
  const embeddingProvider = aiEnabled
    ? new OllamaEmbeddingProvider()
    : undefined;

  const foamPromise = bootstrap(
    matcher,
    watcher,
    dataStore,
    parser,
    [markdownProvider, attachmentProvider],
    defaultExtension
    defaultExtension,
    embeddingProvider
  );

  // Load the features
@@ -98,6 +108,8 @@ export async function activate(context: ExtensionContext) {
  if (
    [
      'foam.files.ignore',
      'foam.files.exclude',
      'foam.files.include',
      'foam.files.attachmentExtensions',
      'foam.files.noteExtensions',
      'foam.files.defaultNoteExtension',

@@ -1,5 +1,5 @@
/* @unit-ready */
import { env, Position, Selection, commands } from 'vscode';
import { env, Selection, commands } from 'vscode';
import { createFile, showInEditor } from '../../test/test-utils-vscode';
import { removeBrackets, toTitleCase } from './copy-without-brackets';

@@ -1,5 +1,4 @@
import { commands, ExtensionContext } from 'vscode';
import { askUserForTemplate } from '../../services/templates';

export default async function activate(context: ExtensionContext) {
  context.subscriptions.push(

@@ -13,3 +13,5 @@ export { default as createNote } from './create-note';
export { default as searchTagCommand } from './search-tag';
export { default as renameTagCommand } from './rename-tag';
export { default as convertLinksCommand } from './convert-links';
export { default as showSimilarNotesCommand } from '../../ai/vscode/commands/show-similar-notes';
export { default as buildEmbeddingsCommand } from '../../ai/vscode/commands/build-embeddings';

@@ -9,7 +9,7 @@ import {
import { CommandDescriptor } from '../../utils/commands';
import { FoamWorkspace } from '../../core/model/workspace';
import { Resource } from '../../core/model/note';
import { isSome, isNone } from '../../core/utils';
import { isNone } from '../../core/utils';

export default async function activate(
  context: vscode.ExtensionContext,

@@ -35,7 +35,7 @@ describe('Graph Panel', () => {
    expect(graphPanel?.group.viewColumn).toBeGreaterThan(vscode.ViewColumn.One);
  });

  it('should create graph in ViewColumn.Two when no active editor', async () => {
  it('should create graph in ViewColumn.One when no active editor', async () => {
    // Make sure no editors are open
    await closeEditors();

@@ -51,7 +51,7 @@ describe('Graph Panel', () => {
      .find(tab => tab.label === 'Foam Graph');

    expect(graphPanel).toBeDefined();
    expect(graphPanel?.group.viewColumn).toBe(vscode.ViewColumn.Two);
    expect(graphPanel?.group.viewColumn).toBe(vscode.ViewColumn.One);
  });

  it('should reveal existing graph panel without moving it', async () => {
@@ -3,6 +3,7 @@ import { Foam } from '../../core/model/foam';
import { Logger } from '../../core/utils/log';
import { fromVsCodeUri } from '../../utils/vsc-utils';
import { isSome } from '../../core/utils';
import { getFoamVsCodeConfig } from '../../services/config';

export default async function activate(
  context: vscode.ExtensionContext,
@@ -10,12 +11,14 @@ export default async function activate(
) {
  let panel: vscode.WebviewPanel | undefined = undefined;
  vscode.workspace.onDidChangeConfiguration(event => {
    if (event.affectsConfiguration('foam.graph.style')) {
      const style = getGraphStyle();
      panel.webview.postMessage({
        type: 'didUpdateStyle',
        payload: style,
      });
    if (panel) {
      if (event.affectsConfiguration('foam.graph.style')) {
        const style = getGraphStyle();
        panel.webview.postMessage({
          type: 'didUpdateStyle',
          payload: style,
        });
      }
    }
  });

@@ -23,11 +26,8 @@ export default async function activate(
      if (panel) {
        panel.reveal();
      } else {
        const columnToShowIn = vscode.window.activeTextEditor
          ? vscode.ViewColumn.Beside
          : vscode.ViewColumn.Two;
        const foam = await foamPromise;
        panel = await createGraphPanel(foam, context, columnToShowIn);
        panel = await createGraphPanel(foam, context);
        const onFoamChanged = _ => {
          updateGraph(panel, foam);
        };
@@ -51,6 +51,10 @@ export default async function activate(
      });
    }
  });
  const shouldOpenGraphOnStartup = getFoamVsCodeConfig('graph.onStartup');
  if (shouldOpenGraphOnStartup) {
    vscode.commands.executeCommand('foam-vscode.show-graph');
  }
}

function updateGraph(panel: vscode.WebviewPanel, foam: Foam) {
@@ -119,7 +123,7 @@ async function createGraphPanel(
  const panel = vscode.window.createWebviewPanel(
    'foam-graph',
    'Foam Graph',
    viewColumn ?? vscode.ViewColumn.Two,
    viewColumn ?? vscode.ViewColumn.Beside,
    {
      enableScripts: true,
      retainContextWhenHidden: true,
@@ -137,6 +141,7 @@
          type: 'didUpdateStyle',
          payload: styles,
        });

        updateGraph(panel, foam);
        break;
      }
@@ -178,14 +183,11 @@ async function getWebviewContent(
  const getWebviewUri = (fileName: string) =>
    panel.webview.asWebviewUri(vscode.Uri.joinPath(datavizUri, fileName));

  const indexHtml =
    vscode.env.uiKind === vscode.UIKind.Desktop
      ? new TextDecoder('utf-8').decode(
          await vscode.workspace.fs.readFile(
            vscode.Uri.joinPath(datavizUri, 'index.html')
          )
        )
      : await fetch(getWebviewUri('index.html').toString()).then(r => r.text());
  const indexHtml = new TextDecoder('utf-8').decode(
    await vscode.workspace.fs.readFile(
      vscode.Uri.joinPath(datavizUri, 'index.html')
    )
  );

  // Replace the script paths with the appropriate webview URI.
  const filled = indexHtml.replace(

@@ -4,3 +4,4 @@ export { default as orphans } from './orphans';
export { default as placeholders } from './placeholders';
export { default as tags } from './tags-explorer';
export { default as notes } from './notes-explorer';
export { default as relatedNotes } from '../../ai/vscode/panels/related-notes';
@@ -1,7 +1,10 @@
import * as vscode from 'vscode';
import { Foam } from '../../core/model/foam';
import { createMatcherAndDataStore } from '../../services/editor';
import { getAttachmentsExtensions } from '../../settings';
import {
  getAttachmentsExtensions,
  getIncludeFilesSetting,
} from '../../settings';
import {
  GroupedResourcesConfig,
  GroupedResourcesTreeDataProvider,
@@ -21,6 +24,7 @@ export default async function activate(
  const foam = await foamPromise;

  const { matcher } = await createMatcherAndDataStore(
    getIncludeFilesSetting().map(g => g.toString()),
    getOrphansConfig().exclude
  );
  const provider = new OrphanTreeView(

@@ -17,6 +17,7 @@ import { FoamGraph } from '../../core/model/graph';
import { URI } from '../../core/model/uri';
import { FoamWorkspace } from '../../core/model/workspace';
import { FolderTreeItem } from './utils/folder-tree-provider';
import { getIncludeFilesSetting } from '../../settings';

/** Retrieve the placeholders configuration */
export function getPlaceholdersConfig(): GroupedResourcesConfig {
@@ -31,6 +32,7 @@ export default async function activate(
) {
  const foam = await foamPromise;
  const { matcher } = await createMatcherAndDataStore(
    getIncludeFilesSetting().map(g => g.toString()),
    getPlaceholdersConfig().exclude
  );
  const provider = new PlaceholderTreeView(
@@ -0,0 +1,217 @@
/* @unit-ready */
import MarkdownIt from 'markdown-it';
import {
  default as escapeWikilinkPipes,
  PIPE_PLACEHOLDER,
} from './escape-wikilink-pipes';

describe('escape-wikilink-pipes plugin', () => {
  it('should render table with wikilink alias correctly', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Column |
| --- |
| [[note|alias]] |`;

    const html = md.render(markdown);

    // Should have proper table structure
    expect(html).toContain('<table>');
    expect(html).toContain('<tbody>');

    // Should preserve the wikilink with pipe character intact
    expect(html).toContain('[[note|alias]]');

    // Should NOT split into multiple cells (would see extra <td> tags)
    const tdCount = (html.match(/<td>/g) || []).length;
    expect(tdCount).toBe(1);
  });

  it('should render table with multiple wikilink aliases in same row', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Col1 | Col2 | Col3 |
| --- | --- | --- |
| [[a|A]] | [[b|B]] | [[c|C]] |`;

    const html = md.render(markdown);

    // All three wikilinks should be preserved
    expect(html).toContain('[[a|A]]');
    expect(html).toContain('[[b|B]]');
    expect(html).toContain('[[c|C]]');

    // Should have exactly 3 cells in body row
    const bodyMatch = html.match(/<tbody>(.*?)<\/tbody>/s);
    expect(bodyMatch).toBeTruthy();
    const bodyCells = (bodyMatch[0].match(/<td>/g) || []).length;
    expect(bodyCells).toBe(3);
  });

  it('should render table with wikilink containing section and alias', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Link |
| --- |
| [[note#section|alias text]] |`;

    const html = md.render(markdown);

    // Wikilink with section and alias should be intact
    expect(html).toContain('[[note#section|alias text]]');

    // Should be in a single cell
    const bodyMatch = html.match(/<tbody>(.*?)<\/tbody>/s);
    const bodyCells = (bodyMatch[0].match(/<td>/g) || []).length;
    expect(bodyCells).toBe(1);
  });

  it('should render table with embed wikilink alias correctly', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Embed |
| --- |
| ![[image|caption]] |`;

    const html = md.render(markdown);

    // Embed wikilink should be preserved
    expect(html).toContain('![[image|caption]]');
  });

  it('should not affect wikilinks without aliases', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Link |
| --- |
| [[note-without-alias]] |`;

    const html = md.render(markdown);

    // Regular wikilink should still work
    expect(html).toContain('[[note-without-alias]]');
  });

  it('should not affect wikilinks outside of tables', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `
Paragraph with [[note|alias]] link.

| Column |
| --- |
| [[table-note|table-alias]] |

Another [[note2|alias2]] paragraph.
`;

    const html = md.render(markdown);

    // All wikilinks should be preserved
    expect(html).toContain('[[note|alias]]');
    expect(html).toContain('[[table-note|table-alias]]');
    expect(html).toContain('[[note2|alias2]]');
  });

  it('should handle table with mixed content and wikilinks', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Text | Link | Mixed |
| --- | --- | --- |
| plain text | [[note|alias]] | text [[link|L]] more |`;

    const html = md.render(markdown);

    // Both wikilinks should be preserved
    expect(html).toContain('[[note|alias]]');
    expect(html).toContain('[[link|L]]');

    // Should have 3 cells
    const bodyMatch = html.match(/<tbody>(.*?)<\/tbody>/s);
    const bodyCells = (bodyMatch[0].match(/<td>/g) || []).length;
    expect(bodyCells).toBe(3);
  });

  it('should handle tables without wikilinks', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Col1 | Col2 |
| --- | --- |
| text | more |`;

    const html = md.render(markdown);

    // Should render normal table
    expect(html).toContain('<table>');
    expect(html).toContain('text');
    expect(html).toContain('more');
  });

  it('should not leave placeholder character in rendered output', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Col1 | Col2 |
| --- | --- |
| [[a|A]] | [[b|B]] |`;

    const html = md.render(markdown);

    // Should not contain the internal placeholder
    expect(html).not.toContain(PIPE_PLACEHOLDER);
  });

  it('should handle complex wikilink aliases with special characters', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Link |
| --- |
| [[note-with-dashes|Alias with spaces & special!]] |`;

    const html = md.render(markdown);

    expect(html).toContain(
      '[[note-with-dashes|Alias with spaces & special!]]'
    );
  });

  it('should handle multiple rows with wikilink aliases', () => {
    const md = MarkdownIt();
    escapeWikilinkPipes(md);

    const markdown = `| Links |
| --- |
| [[note1|alias1]] |
| [[note2|alias2]] |
| [[note3|alias3]] |`;

    const html = md.render(markdown);

    // All three should be preserved
    expect(html).toContain('[[note1|alias1]]');
    expect(html).toContain('[[note2|alias2]]');
    expect(html).toContain('[[note3|alias3]]');

    // Should have 3 rows in tbody
    const bodyMatch = html.match(/<tbody>(.*?)<\/tbody>/s);
    const bodyRows = (bodyMatch[0].match(/<tr>/g) || []).length;
    expect(bodyRows).toBe(3);
  });

  it('should work when markdown-it does not have table support', () => {
    const md = MarkdownIt();
    md.disable(['table']);

    // Should not throw when table rule doesn't exist
    expect(() => escapeWikilinkPipes(md)).not.toThrow();
  });
});
@@ -0,0 +1,105 @@
/*global markdownit:readonly*/

/**
 * Markdown-it plugin to handle wikilink aliases in tables by wrapping the table parser.
 *
 * This plugin addresses issue #1544 where wikilink aliases (e.g., [[note|alias]])
 * are incorrectly split into separate table cells because the pipe character `|`
 * is used both as a wikilink alias separator and a table column separator.
 *
 * The plugin works by wrapping the table block parser:
 * 1. Before the table parser runs, temporarily replace pipes in wikilinks with a placeholder
 * 2. Let the table parser create the table structure and inline tokens
 * 3. After the table parser returns, restore pipes in the inline token content
 * 4. Later inline parsing will see the correct wikilink syntax with pipes
 *
 * This approach keeps all encoding/decoding logic localized to this single function,
 * making it invisible to the rest of the codebase.
 */

// Unique placeholder that's unlikely to appear in normal markdown text
// Note: We've tested various text-based placeholders but all fail:
// - "___FOAM_ALIAS_DIVIDER___" - underscores interpreted as emphasis markers
// - "FOAM__INTERNAL__..." - double underscores cause strong emphasis issues
// - "FOAMINTERNALALIASDIVIDERPLACEHOLDER" - gets truncated (output: "[[noteFOAMINTERN")
// Solution: Use a single Unicode character (U+F8FF Private Use Area) that:
// - Has no markdown meaning
// - Won't be split or modified by parsers
// - Is extremely unlikely to appear in user content
export const PIPE_PLACEHOLDER = '\uF8FF';

/**
 * Regex to match wikilinks with pipes (aliases or multiple pipes)
 * Matches:
 * - [[note|alias]]
 * - ![[note|alias]] (embeds)
 * - [[note#section|alias]]
 */
const WIKILINK_WITH_PIPE_REGEX = /!?\[\[([^\]]*?\|[^\]]*?)\]\]/g;

/**
 * Replace pipes within wikilinks with placeholder
 */
function encodePipesInWikilinks(text: string): string {
  return text.replace(WIKILINK_WITH_PIPE_REGEX, match => {
    return match.replace(/\|/g, PIPE_PLACEHOLDER);
  });
}

/**
 * Restore pipes from placeholder in text
 */
function decodePipesInWikilinks(text: string): string {
  return text.replace(new RegExp(PIPE_PLACEHOLDER, 'g'), '|');
}

export const escapeWikilinkPipes = (md: markdownit) => {
  // Get the original table parser function
  // Note: __find__ and __rules__ are internal APIs but necessary for wrapping
  const ruler = md.block.ruler as any;
  const tableRuleIndex = ruler.__find__('table');
  if (tableRuleIndex === -1) {
    // Table rule not found (maybe GFM tables not enabled), skip wrapping
    return md;
  }

  const originalTableRule = ruler.__rules__[tableRuleIndex].fn;

  // Create wrapped table parser
  const wrappedTableRule = function (state, startLine, endLine, silent) {
    // Store the token count before parsing to identify new tokens
    const tokensBefore = state.tokens.length;

    // 1. ENCODE: Replace pipes in wikilinks with placeholder in source
    const originalSrc = state.src;
    state.src = encodePipesInWikilinks(state.src);

    // 2. Call the original table parser
    // It will create tokens with encoded content (pipes replaced)
    const result = originalTableRule(state, startLine, endLine, silent);

    // 3. DECODE: Restore pipes in the newly created inline tokens
    if (result) {
      // Only process tokens that were created by this table parse
      for (let i = tokensBefore; i < state.tokens.length; i++) {
        const token = state.tokens[i];
        // Inline tokens contain the cell content that needs decoding
        if (token.type === 'inline' && token.content) {
          token.content = decodePipesInWikilinks(token.content);
        }
      }
    }

    // 4. Restore original source
    state.src = originalSrc;

    return result;
  };

  // Replace the table rule with our wrapped version
  md.block.ruler.at('table', wrappedTableRule);

  return md;
};

export default escapeWikilinkPipes;
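A minimal end-to-end sketch of how the plugin is wired up (same shape as the tests above; only identifiers that appear in this changeset are used):

import MarkdownIt from 'markdown-it';
import escapeWikilinkPipes from './escape-wikilink-pipes';

const md = escapeWikilinkPipes(MarkdownIt());
const html = md.render(`| Week |
| --- |
| [[note-a|W44]] |`);
// The cell still contains the literal text [[note-a|W44]]; a later
// plugin in the chain (e.g. wikilink-navigation) turns it into a link.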
@@ -6,6 +6,7 @@ import { default as markdownItFoamTags } from './tag-highlight';
import { default as markdownItWikilinkNavigation } from './wikilink-navigation';
import { default as markdownItRemoveLinkReferences } from './remove-wikilink-references';
import { default as markdownItWikilinkEmbed } from './wikilink-embed';
import { default as escapeWikilinkPipes } from './escape-wikilink-pipes';

export default async function activate(
  context: vscode.ExtensionContext,
@@ -16,6 +17,7 @@ export default async function activate(
  return {
    extendMarkdownIt: (md: markdownit) => {
      return [
        escapeWikilinkPipes,
        markdownItWikilinkEmbed,
        markdownItFoamTags,
        markdownItWikilinkNavigation,

@@ -3,8 +3,6 @@
import markdownItRegex from 'markdown-it-regex';
import { FoamWorkspace } from '../../core/model/workspace';
import { Logger } from '../../core/utils/log';
import { isNone } from '../../core/utils';
import { commandAsURI } from '../../utils/commands';

export const markdownItFoamTags = (
  md: markdownit,

@@ -5,6 +5,7 @@ import { createTestNote } from '../../test/test-utils';
import { getUriInWorkspace } from '../../test/test-utils-vscode';
import { default as markdownItWikilinkNavigation } from './wikilink-navigation';
import { default as markdownItRemoveLinkReferences } from './remove-wikilink-references';
import { default as escapeWikilinkPipes } from './escape-wikilink-pipes';

describe('Link generation in preview', () => {
  const noteA = createTestNote({
@@ -23,6 +24,7 @@ describe('Link generation in preview', () => {
  const ws = new FoamWorkspace().set(noteA).set(noteB);

  const md = [
    escapeWikilinkPipes,
    markdownItWikilinkNavigation,
    markdownItRemoveLinkReferences,
  ].reduce((acc, extension) => extension(acc, ws), MarkdownIt());
@@ -91,4 +93,71 @@ describe('Link generation in preview', () => {
      `<p><a class='foam-placeholder-link' title="Link to non-existing resource" href="javascript:void(0);">this note</a></p>\n`
    );
  });

  describe('wikilinks with aliases in tables', () => {
    it('generates a link with alias inside a table cell', () => {
      const table = `| Week | Week again |
| --- | --- |
| [[note-a|W44]] | [[note-b|W45]] |`;
      const result = md.render(table);

      // Should contain proper links with aliases
      expect(result).toContain(
        `<a class='foam-note-link' title='${noteA.title}' href='/path/to/note-a.md' data-href='/path/to/note-a.md'>W44</a>`
      );
      expect(result).toContain(
        `<a class='foam-note-link' title='${noteB.title}' href='/path2/to/note-b.md' data-href='/path2/to/note-b.md'>W45</a>`
      );
    });

    it('generates a link with alias and section inside a table cell', () => {
      const table = `| Week |
| --- |
| [[note-b#sec1|Week 1]] |`;
      const result = md.render(table);

      expect(result).toContain(
        `<a class='foam-note-link' title='${noteB.title}#sec1' href='/path2/to/note-b.md#sec1' data-href='/path2/to/note-b.md#sec1'>Week 1</a>`
      );
    });

    it('generates placeholder link with alias inside a table cell', () => {
      const table = `| Week |
| --- |
| [[nonexistent|Placeholder]] |`;
      const result = md.render(table);

      expect(result).toContain(
        `<a class='foam-placeholder-link' title="Link to non-existing resource" href="javascript:void(0);">Placeholder</a>`
      );
    });

    it('handles multiple wikilinks with aliases in the same table row', () => {
      const table = `| Col1 | Col2 | Col3 |
| --- | --- | --- |
| [[note-a|A]] | [[note-b|B]] | [[placeholder|P]] |`;
      const result = md.render(table);

      expect(result).toContain(
        `<a class='foam-note-link' title='${noteA.title}' href='/path/to/note-a.md' data-href='/path/to/note-a.md'>A</a>`
      );
      expect(result).toContain(
        `<a class='foam-note-link' title='${noteB.title}' href='/path2/to/note-b.md' data-href='/path2/to/note-b.md'>B</a>`
      );
      expect(result).toContain(
        `<a class='foam-placeholder-link' title="Link to non-existing resource" href="javascript:void(0);">P</a>`
      );
    });

    it('handles wikilinks without aliases in tables (should still work)', () => {
      const table = `| Week |
| --- |
| [[note-a]] |`;
      const result = md.render(table);

      expect(result).toContain(
        `<a class='foam-note-link' title='${noteA.title}' href='/path/to/note-a.md' data-href='/path/to/note-a.md'>${noteA.title}</a>`
      );
    });
  });
});
@@ -49,8 +49,6 @@ describe('FoamWorkspaceSymbolProvider', () => {
  });

  describe('provideWorkspaceSymbols', () => {
    const provider = new FoamWorkspaceSymbolProvider(new FoamWorkspace());

    it('should return empty array when workspace is empty', () => {
      const provider = new FoamWorkspaceSymbolProvider(new FoamWorkspace());

@@ -225,14 +225,38 @@ export function asAbsoluteWorkspaceUri(
  return res;
}

export async function createMatcherAndDataStore(excludes: string[]): Promise<{
export async function createMatcherAndDataStore(
  includes: string[],
  excludes: string[]
): Promise<{
  matcher: IMatcher;
  dataStore: IDataStore;
  includePatterns: Map<string, string[]>;
  excludePatterns: Map<string, string[]>;
}> {
  const includePatterns = new Map<string, string[]>();
  const excludePatterns = new Map<string, string[]>();
  workspace.workspaceFolders.forEach(f => excludePatterns.set(f.name, []));
  workspace.workspaceFolders.forEach(f => {
    includePatterns.set(f.name, []);
    excludePatterns.set(f.name, []);
  });

  // Process include patterns
  for (const include of includes) {
    const tokens = include.split('/');
    const matchesFolder = workspace.workspaceFolders.find(
      f => f.name === tokens[0]
    );
    if (matchesFolder) {
      includePatterns.get(tokens[0]).push(tokens.slice(1).join('/'));
    } else {
      for (const [, value] of includePatterns.entries()) {
        value.push(include);
      }
    }
  }

  // Process exclude patterns
  for (const exclude of excludes) {
    const tokens = exclude.split('/');
    const matchesFolder = workspace.workspaceFolders.find(
@@ -248,19 +272,41 @@ export async function createMatcherAndDataStore(excludes: string[]): Promise<{
  }

  const listFiles = async () => {
    let files: Uri[] = [];
    let allFiles: Uri[] = [];

    for (const folder of workspace.workspaceFolders) {
      const uris = await workspace.findFiles(
        new RelativePattern(folder.uri, '**/*'),
        new RelativePattern(
          folder.uri,
          `{${excludePatterns.get(folder.name).join(',')}}`
        )
      const folderIncludes = includePatterns.get(folder.name);
      const folderExcludes = excludePatterns.get(folder.name);
      const excludePattern =
        folderExcludes.length > 0
          ? new RelativePattern(folder.uri, `{${folderExcludes.join(',')}}`)
          : null;

      // If includes are empty, include nothing
      if (folderIncludes.length === 0) {
        continue;
      }

      const filesFromAllPatterns: Uri[] = [];

      // Apply each include pattern
      for (const includePattern of folderIncludes) {
        const uris = await workspace.findFiles(
          new RelativePattern(folder.uri, includePattern),
          excludePattern
        );
        filesFromAllPatterns.push(...uris);
      }

      // Deduplicate files (same file may match multiple patterns)
      const uniqueFiles = Array.from(
        new Map(filesFromAllPatterns.map(uri => [uri.fsPath, uri])).values()
      );
      files = [...files, ...uris];

      allFiles = [...allFiles, ...uniqueFiles];
    }

    return files.map(fromVsCodeUri);
    return allFiles.map(fromVsCodeUri);
  };

  const decoder = new TextDecoder('utf-8');
@@ -270,9 +316,14 @@ export async function createMatcherAndDataStore(excludes: string[]): Promise<{
  };

  const dataStore = new GenericDataStore(listFiles, readFile);
  const matcher = isEmpty(excludes)
    ? new AlwaysIncludeMatcher()
    : await FileListBasedMatcher.createFromListFn(listFiles);
  const matcher =
    isEmpty(excludes) && includes.length === 1 && includes[0] === '**/*'
      ? new AlwaysIncludeMatcher()
      : await FileListBasedMatcher.createFromListFn(
          listFiles,
          includes,
          excludes
        );

  return { matcher, dataStore, excludePatterns };
  return { matcher, dataStore, includePatterns, excludePatterns };
}
@@ -1,5 +1,5 @@
/* @unit-ready */
import { getNotesExtensions } from './settings';
import { getNotesExtensions, getIncludeFilesSetting } from './settings';
import { withModifiedFoamConfiguration } from './test/test-utils-vscode';

describe('Default note settings', () => {
@@ -31,3 +31,53 @@ describe('Default note settings', () => {
    );
  });
});

describe('Include files settings', () => {
  it('should default to **/* when not configured', () => {
    const includes = getIncludeFilesSetting();
    expect(includes).toEqual(['**/*']);
  });

  it('should return custom include patterns when configured', async () => {
    await withModifiedFoamConfiguration(
      'files.include',
      ['notes/**'],
      async () => {
        const includes = getIncludeFilesSetting();
        expect(includes).toEqual(['notes/**']);
      }
    );
  });

  it('should support multiple include patterns', async () => {
    await withModifiedFoamConfiguration(
      'files.include',
      ['docs/**', 'notes/**', '**/*.md'],
      async () => {
        const includes = getIncludeFilesSetting();
        expect(includes).toEqual(['docs/**', 'notes/**', '**/*.md']);
      }
    );
  });

  it('should expand alternate groups in include patterns', async () => {
    await withModifiedFoamConfiguration(
      'files.include',
      ['**/*.{md,mdx,markdown}'],
      async () => {
        const includes = getIncludeFilesSetting();
        expect(includes).toEqual(
          expect.arrayContaining(['**/*.md', '**/*.mdx', '**/*.markdown'])
        );
        expect(includes.length).toBe(3);
      }
    );
  });

  it('should return empty array when configured with empty array', async () => {
    await withModifiedFoamConfiguration('files.include', [], async () => {
      const includes = getIncludeFilesSetting();
      expect(includes).toEqual([]);
    });
  });
});

@@ -39,11 +39,20 @@ export function getAttachmentsExtensions() {
    .map(ext => '.' + ext.trim());
}

/** Retrieve the list of file ignoring globs. */
export function getIgnoredFilesSetting(): GlobPattern[] {
/** Retrieve the list of file exclude globs. */
export function getExcludedFilesSetting(): GlobPattern[] {
  return [
    '**/.foam/**',
    ...workspace.getConfiguration().get('foam.files.ignore', []),
    ...workspace.getConfiguration().get('foam.files.exclude', []),
    ...workspace.getConfiguration().get('foam.files.ignore', []), // deprecated, for backward compatibility
    ...Object.keys(workspace.getConfiguration().get('files.exclude', {})),
  ].flatMap(expandAlternateGroups);
}

/** Retrieve the list of file include globs. */
export function getIncludeFilesSetting(): GlobPattern[] {
  return workspace
    .getConfiguration()
    .get('foam.files.include', ['**/*'])
    .flatMap(expandAlternateGroups);
}
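expandAlternateGroups itself is not shown in this changeset; based on the test above ('**/*.{md,mdx,markdown}' expanding to three separate globs), a minimal sketch of its behavior could look like this (an assumption about the helper, not the actual implementation):

function expandAlternateGroups(glob: string): string[] {
  // Hypothetical sketch: expand the first {a,b,c} group, recurse for the rest
  const match = glob.match(/\{([^}]+)\}/);
  if (!match) {
    return [glob];
  }
  return match[1]
    .split(',')
    .flatMap(alt => expandAlternateGroups(glob.replace(match[0], alt)));
}

// expandAlternateGroups('**/*.{md,mdx,markdown}')
// -> ['**/*.md', '**/*.mdx', '**/*.markdown']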
@@ -80,7 +80,9 @@ export const waitForNoteInFoamWorkspace = async (uri: URI, timeout = 5000) => {
    }
    await wait(100);
  }
  return false;
  throw new Error(
    `Timeout waiting for note ${uri.toString()} in Foam workspace`
  );
};

/**

@@ -7,13 +7,51 @@ import { Range } from '../core/model/range';
import { URI } from '../core/model/uri';
import { FoamWorkspace } from '../core/model/workspace';
import { MarkdownResourceProvider } from '../core/services/markdown-provider';
import { NoteLinkDefinition, Resource } from '../core/model/note';
import { Resource } from '../core/model/note';
import { createMarkdownParser } from '../core/services/markdown-parser';
import { IDataStore } from '../core/services/datastore';

export { default as waitForExpect } from 'wait-for-expect';

Logger.setLevel('error');

/**
 * An in-memory data store for testing that stores file content in a Map.
 * This allows tests to provide text content for notes without touching the filesystem.
 */
export class InMemoryDataStore implements IDataStore {
  private files = new Map<string, string>();

  /**
   * Set the content for a file
   */
  set(uri: URI, content: string): void {
    this.files.set(uri.path, content);
  }

  /**
   * Delete a file
   */
  delete(uri: URI): void {
    this.files.delete(uri.path);
  }

  /**
   * Clear all files
   */
  clear(): void {
    this.files.clear();
  }

  async list(): Promise<URI[]> {
    return Array.from(this.files.keys()).map(path => URI.parse(path, 'file'));
  }

  async read(uri: URI): Promise<string | null> {
    return this.files.get(uri.path) ?? null;
  }
}
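A short sketch of how a test might combine InMemoryDataStore with the updated createTestWorkspace signature below (illustrative only; paths and note content are made up):

const store = new InMemoryDataStore();
store.set(URI.file('/notes/note-a.md'), '# Note A\n\nLinks to [[note-b]].');
store.set(URI.file('/notes/note-b.md'), '# Note B');

const ws = createTestWorkspace([], store);
// Resources resolved through this workspace read their text from the map,
// so parsing and link resolution can be exercised without touching disk.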
export const TEST_DATA_DIR = URI.file(__dirname).joinPath(
  '..',
  '..',
@@ -29,11 +67,14 @@ const position = Range.create(0, 0, 0, 100);
 */
export const strToUri = URI.file;

export const createTestWorkspace = (workspaceRoots: URI[] = []) => {
export const createTestWorkspace = (
  workspaceRoots: URI[] = [],
  dataStore?: IDataStore
) => {
  const workspace = new FoamWorkspace();
  const parser = createMarkdownParser();
  const provider = new MarkdownResourceProvider(
    {
    dataStore ?? {
      read: _ => Promise.resolve(''),
      list: () => Promise.resolve([]),
    },
@@ -51,7 +92,6 @@ export const createTestNote = (params: {
  links?: Array<{ slug: string; definitionUrl?: string } | { to: string }>;
  tags?: string[];
  aliases?: string[];
  text?: string;
  sections?: string[];
  root?: URI;
  type?: string;
@@ -17,10 +17,12 @@ import { createMarkdownParser } from '../core/services/markdown-parser';
import {
  GenericDataStore,
  AlwaysIncludeMatcher,
  IWatcher,
} from '../core/services/datastore';
import { MarkdownResourceProvider } from '../core/services/markdown-provider';
import { randomString } from './test-utils';
import micromatch from 'micromatch';
import { Emitter } from '../core/common/event';

interface Thenable<T> {
  then<TResult>(
@@ -247,6 +249,13 @@ export function createVSCodeUri(foamUri: URI): Uri {
  };
}

/**
 * Convert VS Code Uri to Foam URI
 */
export function fromVsCodeUri(vsCodeUri: Uri): URI {
  return URI.file(vsCodeUri.fsPath);
}

// VS Code Uri static methods
// eslint-disable-next-line @typescript-eslint/no-redeclare
export const Uri = {
@@ -423,6 +432,12 @@ export enum DiagnosticSeverity {
  Hint = 3,
}

export enum ProgressLocation {
  SourceControl = 1,
  Window = 10,
  Notification = 15,
}

// ===== Code Actions =====

export class CodeActionKind {
@@ -588,6 +603,57 @@ export interface Disposable {
  dispose(): void;
}

// ===== Cancellation =====

export interface CancellationToken {
  readonly isCancellationRequested: boolean;
  readonly onCancellationRequested: Event<any>;
}

export class CancellationTokenSource {
  private _token: CancellationToken | undefined;
  private _emitter: EventEmitter<any> | undefined;
  private _isCancelled = false;

  get token(): CancellationToken {
    if (!this._token) {
      this._emitter = new EventEmitter<any>();
      this._token = {
        isCancellationRequested: this._isCancelled,
        onCancellationRequested: this._emitter.event,
      };
    }
    return this._token;
  }

  cancel(): void {
    if (!this._isCancelled) {
      this._isCancelled = true;
      if (this._emitter) {
        this._emitter.fire(undefined);
      }
      // Update token state
      if (this._token) {
        (this._token as any).isCancellationRequested = true;
      }
    }
  }

  dispose(): void {
    if (this._emitter) {
      this._emitter.dispose();
      this._emitter = undefined;
    }
    this._token = undefined;
  }
}

// ===== Progress =====

export interface Progress<T> {
  report(value: T): void;
}

export class EventEmitter<T> {
  private listeners: ((e: T) => any)[] = [];
@@ -791,11 +857,21 @@ class MockTextDocument implements TextDocument {
    this._content = content;
    // Write the content to file if provided
    try {
      const existed = fs.existsSync(uri.fsPath);
      const dir = path.dirname(uri.fsPath);
      if (!fs.existsSync(dir)) {
        fs.mkdirSync(dir, { recursive: true });
      }
      fs.writeFileSync(uri.fsPath, content);

      // Manually fire watcher events (can't use async workspace.fs in constructor)
      for (const watcher of mockState.fileWatchers) {
        if (existed) {
          watcher._fireChange(uri);
        } else {
          watcher._fireCreate(uri);
        }
      }
    } catch (error) {
      // Ignore write errors in mock
    }
@@ -1115,10 +1191,31 @@ class MockFileSystem implements FileSystem {
  }

  async writeFile(uri: Uri, content: Uint8Array): Promise<void> {
    // Check if file exists before writing
    const existed = await this.exists(uri);

    // Ensure directory exists
    const dir = path.dirname(uri.fsPath);
    await fs.promises.mkdir(dir, { recursive: true });
    await fs.promises.writeFile(uri.fsPath, content);

    // Fire watcher events
    for (const watcher of mockState.fileWatchers) {
      if (existed) {
        watcher._fireChange(uri);
      } else {
        watcher._fireCreate(uri);
      }
    }
  }

  private async exists(uri: Uri): Promise<boolean> {
    try {
      await fs.promises.access(uri.fsPath);
      return true;
    } catch {
      return false;
    }
  }

  async delete(uri: Uri, options?: { recursive?: boolean }): Promise<void> {
@@ -1133,6 +1230,11 @@ class MockFileSystem implements FileSystem {
    } else {
      await fs.promises.unlink(uri.fsPath);
    }

    // Fire watcher events
    for (const watcher of mockState.fileWatchers) {
      watcher._fireDelete(uri);
    }
  }

  async stat(
@@ -1175,6 +1277,84 @@ class MockFileSystem implements FileSystem {
    options?: { overwrite?: boolean }
  ): Promise<void> {
    await fs.promises.rename(source.fsPath, target.fsPath);

    // Fire watcher events (rename = delete + create)
    for (const watcher of mockState.fileWatchers) {
      watcher._fireDelete(source);
      watcher._fireCreate(target);
    }
  }
}

// ===== File System Watcher =====

export interface FileSystemWatcher extends Disposable {
  onDidCreate: Event<Uri>;
  onDidChange: Event<Uri>;
  onDidDelete: Event<Uri>;
  ignoreCreateEvents: boolean;
  ignoreChangeEvents: boolean;
  ignoreDeleteEvents: boolean;
}

class MockFileSystemWatcher implements FileSystemWatcher {
  private onDidCreateEmitter = new Emitter<Uri>();
  private onDidChangeEmitter = new Emitter<Uri>();
  private onDidDeleteEmitter = new Emitter<Uri>();

  onDidCreate = this.onDidCreateEmitter.event;
  onDidChange = this.onDidChangeEmitter.event;
  onDidDelete = this.onDidDeleteEmitter.event;

  ignoreCreateEvents = false;
  ignoreChangeEvents = false;
  ignoreDeleteEvents = false;

  constructor(private pattern: string) {
    // Register this watcher in mockState (will be added to mockState)
    if (mockState.fileWatchers) {
      mockState.fileWatchers.push(this);
    }
  }

  // Internal methods called by MockFileSystem
  _fireCreate(uri: Uri) {
    if (!this.ignoreCreateEvents && this.matches(uri)) {
      this.onDidCreateEmitter.fire(uri);
    }
  }

  _fireChange(uri: Uri) {
    if (!this.ignoreChangeEvents && this.matches(uri)) {
      this.onDidChangeEmitter.fire(uri);
    }
  }

  _fireDelete(uri: Uri) {
    if (!this.ignoreDeleteEvents && this.matches(uri)) {
      this.onDidDeleteEmitter.fire(uri);
    }
  }

  private matches(uri: Uri): boolean {
    const workspaceFolder = mockState.workspaceFolders[0];
    if (!workspaceFolder) return false;

    const relativePath = path.relative(workspaceFolder.uri.fsPath, uri.fsPath);
    // Use micromatch (already imported) for glob matching
    return micromatch.isMatch(relativePath, this.pattern);
  }

  dispose() {
    if (mockState.fileWatchers) {
      const index = mockState.fileWatchers.indexOf(this);
      if (index >= 0) {
        mockState.fileWatchers.splice(index, 1);
      }
    }
    this.onDidCreateEmitter.dispose();
    this.onDidChangeEmitter.dispose();
    this.onDidDeleteEmitter.dispose();
  }
}
@@ -1343,17 +1523,41 @@ class TestFoam {
    // Create resource providers
    const providers = [new MarkdownResourceProvider(dataStore, parser)];

    // Use the bootstrap function without file watcher (simpler for tests)
    // Create file watcher for automatic workspace updates
    const vsCodeWatcher = workspace.createFileSystemWatcher('**/*');

    // Convert VS Code Uri events to Foam URI events
    const onDidCreateEmitter = new Emitter<URI>();
    const onDidChangeEmitter = new Emitter<URI>();
    const onDidDeleteEmitter = new Emitter<URI>();

    vsCodeWatcher.onDidCreate(uri =>
      onDidCreateEmitter.fire(fromVsCodeUri(uri))
    );
    vsCodeWatcher.onDidChange(uri =>
      onDidChangeEmitter.fire(fromVsCodeUri(uri))
    );
    vsCodeWatcher.onDidDelete(uri =>
      onDidDeleteEmitter.fire(fromVsCodeUri(uri))
    );

    const foamWatcher: IWatcher = {
      onDidCreate: onDidCreateEmitter.event,
      onDidChange: onDidChangeEmitter.event,
      onDidDelete: onDidDeleteEmitter.event,
    };

    // Use the bootstrap function with file watcher
    const foam = await bootstrap(
      matcher,
      undefined,
      foamWatcher,
      dataStore,
      parser,
      providers,
      '.md'
    );

    Logger.info('Mock Foam instance created (manual reload for tests)');
    Logger.info('Mock Foam instance created with file watcher');
    return foam;
  }

@@ -1399,13 +1603,14 @@ async function initializeFoamCommands(foam: Foam): Promise<void> {
  await foamCommands.updateWikilinksCommand(mockContext, foamPromise);
  await foamCommands.openDailyNoteForDateCommand(mockContext, foamPromise);
  await foamCommands.convertLinksCommand(mockContext, foamPromise);
  await foamCommands.buildEmbeddingsCommand(mockContext, foamPromise);
  await foamCommands.openDailyNoteCommand(mockContext, foamPromise);
  await foamCommands.openDatedNote(mockContext, foamPromise);

  // Commands that only need context
  await foamCommands.copyWithoutBracketsCommand(mockContext);
  await foamCommands.createFromTemplateCommand(mockContext);
  await foamCommands.createNewTemplate(mockContext);
  await foamCommands.openDailyNoteCommand(mockContext, foamPromise);
  await foamCommands.openDatedNote(mockContext, foamPromise);

  Logger.info('Foam commands initialized successfully in mock environment');
}
@@ -1420,6 +1625,7 @@ const mockState = {
  commands: new Map<string, (...args: any[]) => any>(),
  fileSystem: new MockFileSystem(),
  configuration: new MockWorkspaceConfiguration(),
  fileWatchers: [] as MockFileSystemWatcher[],
};

// Window namespace
@@ -1514,6 +1720,31 @@ export const window = {
      message
    );
  },

  async withProgress<R>(
    options: {
      location: ProgressLocation;
      title?: string;
      cancellable?: boolean;
    },
    task: (
      progress: Progress<{ message?: string; increment?: number }>,
      token: CancellationToken
    ) => Thenable<R>
  ): Promise<R> {
    const tokenSource = new CancellationTokenSource();
    const progress: Progress<{ message?: string; increment?: number }> = {
      report: () => {
        // No-op in mock, but can be overridden in tests
      },
    };

    try {
      return await task(progress, tokenSource.token);
    } finally {
      tokenSource.dispose();
    }
  },
};

// Workspace namespace
@@ -1528,6 +1759,10 @@ export const workspace = {
    return mockState.fileSystem;
  },

  createFileSystemWatcher(globPattern: string): FileSystemWatcher {
    return new MockFileSystemWatcher(globPattern);
  },

  getConfiguration(section?: string): WorkspaceConfiguration {
    if (section) {
      // Return a scoped configuration for the specific section
@@ -2,14 +2,6 @@ body {
  overflow: hidden;
}

.dg .c {
  width: 10%;
}

.dg .property-name {
  width: 90%;
}

.vscode-light .dg.main.taller-than-window .close-button {
  border-top: 1px solid #ddd;
}
@@ -1,9 +1,70 @@
const CONTAINER_ID = 'graph';

let nodeFontSizeController = null;
const initGUI = () => {
  const gui = new dat.gui.GUI();
  const nodeTypeFilterFolder = gui.addFolder('Filter by type');
  const nodeTypeFilterControllers = new Map();
  const appearanceFolder = gui.addFolder('Appearance');

  appearanceFolder
    .add(model, 'textFade', 0, 5)
    .step(0.1)
    .name('Text Fade')
    .onFinishChange(v => {
      const invertedValue = 5 - v;
      getNodeLabelOpacity.domain([invertedValue, invertedValue + 0.8]);
    });

  nodeFontSizeController = appearanceFolder
    .add(model, 'nodeFontSizeMultiplier', 0.5, 3)
    .step(0.1)
    .name('Node Font Size');
  const forcesFolder = gui.addFolder('Forces');

  forcesFolder
    .add(model.forces, 'collide', 0, 4)
    .step(0.1)
    .name('Collide Force')
    .onFinishChange(v => {
      graph.d3Force('collide').radius(graph.nodeRelSize() * v);
      graph.d3ReheatSimulation();
    });
  forcesFolder
    .add(model.forces, 'repel', 0, 200)
    .name('Repel Force')
    .onFinishChange(v => {
      model.forces.charge = -v;
      graph.d3Force('charge').strength(-v);
      graph.d3ReheatSimulation();
    });
  forcesFolder
    .add(model.forces, 'link', 0, 100)
    .step(1)
    .name('Link Distance')
    .onFinishChange(v => {
      graph.d3Force('link').distance(v);
      graph.d3ReheatSimulation();
    });
  forcesFolder
    .add(model.forces, 'velocityDecay', 0, 1)
    .step(0.01)
    .name('Velocity Decay')
    .onChange(v => {
      graph.d3VelocityDecay(1 - v);
    });
  const selectionFolder = gui.addFolder('Selection');

  selectionFolder
    .add(model.selection, 'neighborDepth', 1, 5)
    .step(1)
    .name('Neighbor Depth')
    .onFinishChange(() => {
      update(m => m);
    });

  selectionFolder.add(model.selection, 'enableRefocus').name('Refocus Enable');
  selectionFolder.add(model.selection, 'enableZoom').name('Zoom Enable');

  return {
    /**
@@ -91,11 +152,47 @@ let model = {
    note: true,
    tag: true,
  },
  textFade: 1.2,
  nodeFontSizeMultiplier: 1,
  forces: {
    collide: 2,
    repel: 30,
    charge: -30,
    link: 30,
    velocityDecay: 0.4,
  },
  selection: {
    neighborDepth: 1,
    enableRefocus: true,
    enableZoom: true,
  }
};

const graph = ForceGraph();
const gui = initGUI();

function getNeighbors(nodeId, depth) {
  let neighbors = new Set([nodeId]);
  for (let i = 0; i < depth; i++) {
    let newNeighbors = new Set();
    for (const neighborId of neighbors) {
      if (model.graph.nodeInfo[neighborId]) {
        for (const n of model.graph.nodeInfo[neighborId].neighbors) {
          newNeighbors.add(n);
        }
      } else {
        // Node is missing from nodeInfo (e.g., has been deleted). Skipping.
        // This may make debugging difficult if nodes are unexpectedly missing from highlights.
        console.debug(`getNeighbors: node '${neighborId}' not found in nodeInfo, skipping.`);
      }
    }
    for (const newNeighbor of newNeighbors) {
      neighbors.add(newNeighbor);
    }
  }
  return neighbors;
}

function update(patch) {
  const startTime = performance.now();
  // Apply the patch function to the model..
@@ -105,20 +202,20 @@ function update(patch) {
  // compute highlighted elements
  const focusNodes = new Set();
  const focusLinks = new Set();
  if (model.hoverNode) {
    focusNodes.add(model.hoverNode);
    const info = model.graph.nodeInfo[model.hoverNode];
    info.neighbors.forEach(neighborId => focusNodes.add(neighborId));
    info.links.forEach(link => focusLinks.add(link));
  }
  if (model.selectedNodes) {
    model.selectedNodes.forEach(nodeId => {
      focusNodes.add(nodeId);
      const info = model.graph.nodeInfo[nodeId];
      info.neighbors.forEach(neighborId => focusNodes.add(neighborId));
      info.links.forEach(link => focusLinks.add(link));
    });
  }

  const nodesToProcess = new Set([...model.selectedNodes, model.hoverNode].filter(Boolean));

  nodesToProcess.forEach(nodeId => {
    const neighbors = getNeighbors(nodeId, model.selection.neighborDepth);
    neighbors.forEach(neighbor => focusNodes.add(neighbor));
  });

  model.graph.links.forEach(link => {
    if (focusNodes.has(getLinkNodeId(link.source)) && focusNodes.has(getLinkNodeId(link.target))) {
      focusLinks.add(link);
    }
  });

  model.focusNodes = focusNodes;
  model.focusLinks = focusLinks;

@@ -200,7 +297,13 @@ function initDataviz(channel) {
    .linkHoverPrecision(8)
    .d3Force('x', d3.forceX())
    .d3Force('y', d3.forceY())
    .d3Force('collide', d3.forceCollide(graph.nodeRelSize()))
    .d3Force(
      'collide',
      d3.forceCollide(graph.nodeRelSize() * model.forces.collide)
    )
    .d3Force('charge', d3.forceManyBody().strength(model.forces.charge))
    .d3Force('link', d3.forceLink(model.data.links).distance(model.forces.link))
    .d3VelocityDecay(1 - model.forces.velocityDecay)
    .linkWidth(() => model.style.lineWidth)
    .linkDirectionalParticles(1)
    .linkDirectionalParticleWidth(link =>
@@ -216,7 +319,7 @@ function initDataviz(channel) {
      }
      const size = getNodeSize(info.neighbors.length);
      const { fill, border } = getNodeColor(node.id, model);
      const fontSize = model.style.fontSize / globalScale;
      const fontSize = (model.style.fontSize * model.nodeFontSizeMultiplier) / globalScale;
      const nodeState = getNodeState(node.id, model);
      const textColor = fill.copy({
        opacity:
@@ -364,6 +467,7 @@ function updateForceGraphDataFromModel(m) {

  // annoying we need to call this function, but I haven't found a good workaround
  graph.graphData(m.data);
  graph.d3Force('link').links(m.data.links);
}

const getNodeSize = d3
@@ -569,7 +673,12 @@ try {
        const noteId = message.payload;
        const node = graph.graphData().nodes.find(node => node.id === noteId);
        if (node) {
          graph.centerAt(node.x, node.y, 300).zoom(3, 300);
          if (model.selection.enableRefocus) {
            graph.centerAt(node.x, node.y, 300);
          }
          if (model.selection.enableZoom) {
            graph.zoom(3, 300);
          }
          Actions.selectNode(noteId);
        }
        break;
@@ -577,6 +686,15 @@
        const style = message.payload;
        Actions.updateStyle(style);
        break;
      case 'didUpdateNodeFontSizeMultiplier':
        const multiplier = message.payload;
        if (typeof multiplier === 'number') {
          model.nodeFontSizeMultiplier = multiplier;
          if (nodeFontSizeController) {
            nodeFontSizeController.updateDisplay();
          }
        }
        break;
    }
  });
} catch {
@@ -3,19 +3,41 @@
  "injectionSelector": "L:meta.paragraph.markdown, L:markup.heading.markdown, L:markup.list.unnumbered.markdown",
  "patterns": [
    {
      "contentName": "string.other.link.title.markdown.foam",
      "name": "meta.link.wikilink.markdown.foam",
      "begin": "\\[\\[",
      "end": "\\]\\]",
      "beginCaptures": {
        "0": {
          "name": "punctuation.definition.metadata.markdown.foam"
        }
      },
      "end": "\\]\\]",
      "endCaptures": {
        "0": {
          "name": "punctuation.definition.metadata.markdown.foam"
        }
      }
      },
      "patterns": [
        {
          "comment": "Wikilink with alias: [[target|alias]]",
          "match": "([^|\\]]+)(\\|)([^\\]]+)",
          "captures": {
            "1": {
              "name": "comment.line.wikilink.target.markdown.foam"
            },
            "2": {
              "name": "punctuation.definition.metadata.markdown.foam"
            },
            "3": {
              "name": "string.other.link.title.markdown.foam"
            }
          }
        },
        {
          "comment": "Wikilink without alias: [[target]]",
          "match": "[^|\\]]+",
          "name": "string.other.link.title.markdown.foam"
        }
      ]
    }
  ]
}
}
readme.md
@@ -5,12 +5,13 @@
👀*This is an early stage project under rapid development. For updates join the [Foam community Discord](https://foambubble.github.io/join-discord/g)! 💬*

<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->

[![All Contributors](https://img.shields.io/badge/all_contributors-146-orange.svg?style=flat-square)](#contributors-)

[![All Contributors](https://img.shields.io/badge/all_contributors-147-orange.svg?style=flat-square)](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->

[![Visual Studio Marketplace Version](https://img.shields.io/visual-studio-marketplace/v/foam.foam-vscode)](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
[![Visual Studio Marketplace Version](https://img.shields.io/visual-studio-marketplace/v/foam.foam-vscode)](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
[![Visual Studio Marketplace Installs](https://img.shields.io/visual-studio-marketplace/i/foam.foam-vscode)](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
[![Visual Studio Marketplace Downloads](https://img.shields.io/visual-studio-marketplace/d/foam.foam-vscode)](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
[![Visual Studio Marketplace Rating (Stars)](https://img.shields.io/visual-studio-marketplace/stars/foam.foam-vscode)](https://marketplace.visualstudio.com/items?itemName=foam.foam-vscode)
[![Discord Chat](https://img.shields.io/discord/729975036148056075.svg)](https://foambubble.github.io/join-discord/g)

**Foam** is a personal knowledge management and sharing system inspired by [Roam Research](https://roamresearch.com/), built on [Visual Studio Code](https://code.visualstudio.com/) and [GitHub](https://github.com/).
@@ -373,6 +374,7 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/s-jacob-powell"><img src="https://avatars.githubusercontent.com/u/109111499?v=4?s=60" width="60px;" alt="S. Jacob Powell"/><br /><sub><b>S. Jacob Powell</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=s-jacob-powell" title="Code">💻</a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/figdavi"><img src="https://avatars.githubusercontent.com/u/99026991?v=4?s=60" width="60px;" alt="Davi Figueiredo"/><br /><sub><b>Davi Figueiredo</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=figdavi" title="Documentation">📖</a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/ChThH"><img src="https://avatars.githubusercontent.com/u/9499483?v=4?s=60" width="60px;" alt="CT Hall"/><br /><sub><b>CT Hall</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=ChThH" title="Code">💻</a></td>
      <td align="center" valign="top" width="14.28%"><a href="https://github.com/meestahp"><img src="https://avatars.githubusercontent.com/u/177708514?v=4?s=60" width="60px;" alt="meestahp"/><br /><sub><b>meestahp</b></sub></a><br /><a href="https://github.com/foambubble/foam/commits?author=meestahp" title="Code">💻</a></td>
    </tr>
  </tbody>
</table>
@@ -385,3 +387,6 @@ Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/d
This project follows the [all-contributors](https://github.com/all-contributors/all-contributors) specification. Contributions of any kind welcome!

[Backlinking]: docs/user/features/backlinking.md 'Backlinks'
[//begin]: # 'Autogenerated link references for markdown compatibility'
[Backlinking]: docs/user/features/backlinking.md 'Backlinks'
[//end]: # 'Autogenerated link references'