Initial implementation of Qdrant Semantic Search plugin
- Complete plugin architecture with modular design
- Qdrant client with HTTP integration using requestUrl
- Ollama and OpenAI embedding providers with batching
- Hybrid chunking (semantic + size-based fallback)
- Content extractors for markdown, code, PDFs, and images
- Real-time indexing with file watcher and queue
- Search modal with keyboard navigation
- Comprehensive settings UI with connection testing
- Graph visualization framework (basic implementation)
- Full TypeScript types and error handling
- Desktop-only plugin with status bar integration
- Complete documentation and setup guide

Features implemented:
✅ Semantic search with vector embeddings
✅ Multiple embedding providers (Ollama/OpenAI)
✅ Rich content extraction (markdown, code, PDFs, images)
✅ Smart chunking with heading-based splits
✅ Real-time file indexing with progress tracking
✅ Standalone search interface
✅ Comprehensive settings and configuration
✅ Graph view foundation for document relationships
✅ Full error handling and logging
✅ Complete documentation and troubleshooting guide

Ready for testing with a Qdrant instance and embedding provider setup.
This commit is contained in:
parent 9818d6637c
commit 38889c1d65

268 README.md
@@ -1,94 +1,226 @@
# Qdrant Semantic Search for Obsidian

A powerful Obsidian plugin that indexes your entire vault into Qdrant for semantic search, using Ollama or OpenAI for text embeddings, with support for PDF and image text extraction via the Text Extractor plugin.

## Features

- **Semantic Search**: Find content by meaning, not just keywords
- **Multiple Embedding Providers**:
  - Ollama (local, free) - default
  - OpenAI (cloud, paid)
- **Rich Content Support**:
  - Markdown files with frontmatter parsing
  - Code files with syntax highlighting
  - PDFs via the Text Extractor plugin
  - Images with OCR via the Text Extractor plugin
- **Hybrid Chunking**: Smart text splitting on headings with size-based fallback
- **Real-time Indexing**: Automatic indexing of file changes
- **Graph Visualization**: View document relationships (planned)
- **Comprehensive Settings**: Full control over indexing and search behavior
## Installation

### Prerequisites

1. **Qdrant**: You need a Qdrant instance running
   - Local: `docker run -p 6333:6333 qdrant/qdrant`
   - Cloud: Sign up at [Qdrant Cloud](https://cloud.qdrant.io/)

2. **Ollama** (recommended for local embeddings):

   ```bash
   # Install Ollama
   curl -fsSL https://ollama.ai/install.sh | sh

   # Pull an embedding model
   ollama pull nomic-embed-text
   ```
3. **Text Extractor Plugin** (optional, for PDF/image support):
   - Install from Community Plugins
   - Enables PDF text extraction and image OCR

### Plugin Installation

1. Download the latest release from GitHub
2. Extract `main.js`, `manifest.json`, and `styles.css` to your vault's `.obsidian/plugins/obsidian-qdrant/` folder
3. Enable the plugin in **Settings → Community plugins**
## Configuration

### Basic Setup

1. Open **Settings → Community plugins → Qdrant Semantic Search**
2. Configure your Qdrant connection:
   - **URL**: `http://localhost:6333` (local) or your Qdrant Cloud URL
   - **API Key**: Leave empty for local, add your key for cloud
3. Choose your embedding provider:
   - **Ollama**: Set the model name (e.g., `nomic-embed-text`)
   - **OpenAI**: Add your API key and select a model
### Advanced Settings

#### Indexing Configuration

- **Include Patterns**: File types to index (default: `*.md`, `*.txt`, `*.pdf`, `*.png`, `*.jpg`)
- **Exclude Patterns**: File patterns to skip
- **Max File Size**: Skip files larger than this (default: 10MB)
- **Ignored Folders**: Folders to skip (default: `.obsidian`, `.git`, `node_modules`)
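The include/exclude patterns above are simple filename globs. A minimal sketch of how such patterns can be matched follows; the helper names (`globToRegExp`, `shouldIndex`) are illustrative, not the plugin's actual implementation:

```typescript
// Sketch only: convert a simple glob like "*.md" into a RegExp.
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, '\\$&') // escape regex metacharacters
    .replace(/\*/g, '.*')                 // "*" matches any run of characters
    .replace(/\?/g, '.');                 // "?" matches a single character
  return new RegExp(`^${escaped}$`);
}

// Exclude patterns win over include patterns, matching only the filename.
function shouldIndex(path: string, include: string[], exclude: string[]): boolean {
  const name = path.split('/').pop() ?? path;
  if (exclude.some((p) => globToRegExp(p).test(name))) return false;
  return include.some((p) => globToRegExp(p).test(name));
}
```

For example, `shouldIndex('notes/daily.md', ['*.md'], ['*.tmp.md'])` indexes the file, while `notes/a.tmp.md` is skipped.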
#### Chunking Settings

- **Target Tokens**: Ideal chunk size (default: 500)
- **Overlap Tokens**: Overlap between chunks (default: 100)
- **Max Tokens**: Hard limit per chunk (default: 800)
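To make the interplay of these settings concrete, here is a minimal sketch of size-based chunking with overlap. It approximates tokens with the common 4-characters-per-token heuristic; the plugin's `SimpleTokenizer` may count tokens differently:

```typescript
interface Chunk { text: string; start: number; end: number; }

// Sketch only: emit windows of targetTokens, with overlapTokens of text
// shared between neighboring chunks.
function chunkBySize(text: string, targetTokens = 500, overlapTokens = 100): Chunk[] {
  const charsPerToken = 4; // rough heuristic, not a real tokenizer
  const windowChars = targetTokens * charsPerToken;
  const stepChars = (targetTokens - overlapTokens) * charsPerToken;
  const chunks: Chunk[] = [];

  for (let start = 0; start < text.length; start += stepChars) {
    const end = Math.min(start + windowChars, text.length);
    chunks.push({ text: text.slice(start, end), start, end });
    if (end === text.length) break;
  }
  return chunks;
}
```

With the defaults, each chunk spans roughly 2000 characters and starts 1600 characters after the previous one, so neighbors share about 400 characters (100 tokens).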
#### Graph Visualization

- **Enable Graph View**: Show document relationships
- **Similarity Threshold**: Minimum similarity for edges (default: 0.7)
- **Max Nodes**: Maximum nodes to display (default: 100)
## Usage

### Commands

- **Semantic search**: Open the search modal
- **Index current file**: Index the currently open file
- **Full reindex vault**: Reindex all files
- **Clear index**: Remove all indexed data
- **Open graph view**: Show document relationships (when implemented)
### Search Interface

1. Use **Ctrl+P** (or **Cmd+P** on Mac) to open the Command Palette
2. Type "Semantic search" and press Enter
3. Enter your search query
4. Browse results with keyboard navigation:
   - **Arrow keys**: Navigate results
   - **Enter**: Open the selected result
   - **Escape**: Close search
### Status Bar

The plugin shows indexing progress in the status bar:

- **Ready**: System is ready
- **Indexing X%**: Shows progress during a full reindex
- **Error**: Click to see error details
## Architecture

### Components

- **Extractors**: Parse different file types (markdown, code, PDFs, images)
- **Chunkers**: Split text into semantic chunks
- **Embedding Providers**: Generate vector embeddings
- **Qdrant Client**: Store and search vectors
- **Indexing Queue**: Manage background indexing
- **Search UI**: Provide the search interface
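The Qdrant client component talks to Qdrant's REST API over HTTP. As a sketch, a search request could be assembled like this; the endpoint and body shape follow Qdrant's documented `points/search` API, but the helper itself is illustrative, and inside Obsidian the plugin issues the request with `requestUrl` rather than plain `fetch`:

```typescript
interface SearchRequest {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
}

// Sketch: build (but do not send) a Qdrant vector-search request.
function buildSearchRequest(
  baseUrl: string,
  collection: string,
  vector: number[],
  limit: number,
  apiKey?: string
): SearchRequest {
  const headers: Record<string, string> = { 'Content-Type': 'application/json' };
  if (apiKey) headers['api-key'] = apiKey; // required for Qdrant Cloud
  return {
    url: `${baseUrl}/collections/${collection}/points/search`,
    method: 'POST',
    headers,
    body: JSON.stringify({ vector, limit, with_payload: true }),
  };
}
```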
### Data Flow

1. **File Change** → File Watcher → Indexing Queue
2. **Extract** → Chunk → Embed → Store in Qdrant
3. **Search Query** → Embed → Search Qdrant → Display Results
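The indexing path in step 2 can be sketched as a small pipeline of pluggable stages. The stage signatures below are stand-ins for the plugin's real extractor, chunker, embedding provider, and Qdrant client, not its actual API:

```typescript
type Extract = (path: string) => string;
type ChunkFn = (text: string) => string[];
type Embed = (chunk: string) => number[];
type Store = (point: { vector: number[]; payload: { path: string; chunk: number } }) => void;

// Sketch: run one file through extract → chunk → embed → store and
// return the number of points written.
function indexOneFile(path: string, extract: Extract, chunk: ChunkFn, embed: Embed, store: Store): number {
  const text = extract(path);
  const chunks = chunk(text);
  chunks.forEach((c, i) => store({ vector: embed(c), payload: { path, chunk: i } }));
  return chunks.length;
}
```

Keeping each stage behind a small function type is what lets the plugin swap embedding providers (Ollama vs. OpenAI) without touching the rest of the pipeline.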
### Collection Schema

Each vault gets a collection named `vault_<sanitized_name>_<model>`. Points contain:

- **Vector**: Embedding from your chosen model
- **Payload**: Rich metadata (path, title, tags, chunk info, etc.)
## Troubleshooting

### Common Issues

#### "Indexing system not ready"

- Check the Qdrant connection in settings
- Verify the embedding provider configuration
- Check the console for error messages

#### "No results found"

- Ensure files are indexed (check the status bar)
- Try a full reindex
- Verify your search query isn't too specific

#### "Ollama connection failed"

- Ensure Ollama is running: `ollama serve`
- Check the model is installed: `ollama list`
- Verify the URL in settings (default: `http://localhost:11434`)

#### "OpenAI connection failed"

- Verify the API key is correct
- Check you have credits/quota
- Ensure the model name is valid
### Performance Tips

1. **Batch Size**: Increase for faster indexing (if you have the memory)
2. **Concurrency**: Higher values mean faster processing (but may overwhelm services)
3. **File Filters**: Exclude unnecessary files to speed up indexing
4. **Chunk Size**: Larger chunks mean fewer vectors but less precise search
### Debugging

Enable the developer console (**Ctrl+Shift+I**) to see detailed logs:

- Indexing progress and errors
- Search query processing
- Qdrant API calls
- Embedding generation
## Development

### Building from Source

```bash
git clone <repository>
cd obsidian-qdrant
npm install
npm run dev    # Watch mode
npm run build  # Production build
```
### Project Structure

```
src/
├── types.ts       # TypeScript interfaces
├── settings.ts    # Settings and defaults
├── main.ts        # Plugin entry point
├── qdrant/        # Qdrant client and collection management
├── embeddings/    # Embedding providers (Ollama, OpenAI)
├── extractors/    # Content extractors
├── chunking/      # Text chunking logic
├── indexing/      # Indexing orchestration
├── search/        # Search UI components
├── graph/         # Graph visualization
└── ui/            # Settings UI
```
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request
## License

MIT License - see the LICENSE file for details.
## Acknowledgments

- [Qdrant](https://qdrant.tech/) for the vector database
- [Ollama](https://ollama.ai/) for local embeddings
- [OpenAI](https://openai.com/) for cloud embeddings
- [Text Extractor](https://github.com/scambier/obsidian-text-extractor) for PDF/image support
- [Obsidian](https://obsidian.md/) for the amazing note-taking platform
## Roadmap

- [ ] Graph visualization with D3.js
- [ ] Hybrid search (dense + sparse vectors)
- [ ] More embedding providers (Cohere, Mistral, etc.)
- [ ] Advanced filtering in search
- [ ] Search result ranking improvements
- [ ] Mobile support
- [ ] Plugin API for other plugins
- [ ] Export/import index data
- [ ] Search analytics and insights
261 main.ts
@@ -1,85 +1,53 @@
```typescript
import { App, Notice, Plugin, TFile } from 'obsidian';
import { PluginSettings, IndexingProgress } from './src/types';
import { DEFAULT_SETTINGS, validateSettings } from './src/settings';
import { IndexingOrchestrator } from './src/indexing/orchestrator';
import { SearchModal } from './src/search/searchModal';
import { QdrantSettingsTab } from './src/ui/settingsTab';

export default class QdrantPlugin extends Plugin {
	settings: PluginSettings;
	private indexingOrchestrator: IndexingOrchestrator | null = null;
	private statusBarItem: HTMLElement | null = null;

	async onload() {
		await this.loadSettings();

		// Validate settings
		const errors = validateSettings(this.settings);
		if (errors.length > 0) {
			new Notice('Settings validation failed: ' + errors.join(', '));
		}

		// Initialize indexing orchestrator
		try {
			this.indexingOrchestrator = new IndexingOrchestrator(this.app, this.settings);
			await this.indexingOrchestrator.initialize();
		} catch (error) {
			console.error('Failed to initialize indexing orchestrator:', error);
			new Notice('Failed to initialize indexing system: ' + error.message);
		}

		// Add status bar item
		this.setupStatusBar();

		// Add commands
		this.addCommands();

		// Add settings tab
		this.addSettingTab(new QdrantSettingsTab(this.app, this));

		// Set up progress tracking
		this.setupProgressTracking();

		console.log('Qdrant Semantic Search plugin loaded');
	}

	onunload() {
		// Shut down the indexing orchestrator
		if (this.indexingOrchestrator) {
			this.indexingOrchestrator.shutdown();
		}
	}

	async loadSettings() {
```

@@ -89,46 +57,141 @@
```typescript
	async saveSettings() {
		await this.saveData(this.settings);
	}

	private setupStatusBar() {
		this.statusBarItem = this.addStatusBarItem();
		this.updateStatusBar('Ready');
	}

	private addCommands() {
		// Semantic search command
		this.addCommand({
			id: 'semantic-search',
			name: 'Semantic search',
			callback: () => {
				if (!this.indexingOrchestrator?.isReady()) {
					new Notice('Indexing system not ready. Please check your settings.');
					return;
				}
				new SearchModal(this.app, this.settings).open();
			}
		});

		// Index current file command
		this.addCommand({
			id: 'index-current-file',
			name: 'Index current file',
			checkCallback: (checking: boolean) => {
				const activeFile = this.app.workspace.getActiveFile();
				if (activeFile instanceof TFile) {
					if (!checking) {
						this.indexFile(activeFile);
					}
					return true;
				}
				return false;
			}
		});

		// Full reindex command
		this.addCommand({
			id: 'full-reindex',
			name: 'Full reindex vault',
			callback: () => {
				this.indexFullVault();
			}
		});

		// Clear index command
		this.addCommand({
			id: 'clear-index',
			name: 'Clear index',
			callback: () => {
				this.clearIndex();
			}
		});

		// Open graph view command
		if (this.settings.enableGraphView) {
			this.addCommand({
				id: 'open-graph-view',
				name: 'Open graph view',
				callback: () => {
					// TODO: Implement graph view
					new Notice('Graph view not yet implemented');
				}
			});
		}
	}

	private setupProgressTracking() {
		if (!this.indexingOrchestrator) return;

		this.indexingOrchestrator.setProgressCallback((progress: IndexingProgress) => {
			this.updateStatusBar(progress);
		});

		this.indexingOrchestrator.setErrorCallback((error: string) => {
			new Notice('Indexing error: ' + error);
		});
	}

	private updateStatusBar(progress: IndexingProgress | string) {
		if (!this.statusBarItem) return;

		if (typeof progress === 'string') {
			this.statusBarItem.setText(`Qdrant: ${progress}`);
			return;
		}

		if (progress.isRunning) {
			const percentage = progress.totalFiles > 0
				? Math.round((progress.processedFiles / progress.totalFiles) * 100)
				: 0;
			this.statusBarItem.setText(`Qdrant: Indexing ${percentage}% (${progress.processedFiles}/${progress.totalFiles})`);
		} else {
			this.statusBarItem.setText('Qdrant: Ready');
		}
	}

	// Public methods for the settings tab
	async testQdrantConnection(): Promise<boolean> {
		if (!this.indexingOrchestrator) return false;
		const connections = await this.indexingOrchestrator.testConnections();
		return connections.qdrant;
	}

	async testOllamaConnection(): Promise<boolean> {
		if (!this.indexingOrchestrator) return false;
		const connections = await this.indexingOrchestrator.testConnections();
		return connections.embedding;
	}

	async indexFullVault(): Promise<void> {
		if (!this.indexingOrchestrator?.isReady()) {
			throw new Error('Indexing system not ready');
		}
		await this.indexingOrchestrator.indexFullVault();
	}

	async indexFile(file: TFile): Promise<void> {
		if (!this.indexingOrchestrator?.isReady()) {
			throw new Error('Indexing system not ready');
		}
		await this.indexingOrchestrator.indexFile(file);
	}

	async clearIndex(): Promise<void> {
		if (!this.indexingOrchestrator?.isReady()) {
			throw new Error('Indexing system not ready');
		}
		await this.indexingOrchestrator.clearIndex();
	}

	async getIndexStats(): Promise<any> {
		if (!this.indexingOrchestrator?.isReady()) {
			throw new Error('Indexing system not ready');
		}
		return await this.indexingOrchestrator.getIndexStats();
	}
}
```
manifest.json
@@ -1,11 +1,11 @@

```json
{
	"id": "obsidian-qdrant",
	"name": "Qdrant Semantic Search",
	"version": "0.1.0",
	"minAppVersion": "0.15.0",
	"description": "Index your vault into Qdrant for semantic search with Ollama/OpenAI embeddings and graph visualization.",
	"author": "Nicholai",
	"authorUrl": "https://github.com/nicholai",
	"fundingUrl": "https://github.com/sponsors/nicholai",
	"isDesktopOnly": true
}
```
13 package.json
@@ -1,16 +1,19 @@
```json
{
	"name": "obsidian-qdrant",
	"version": "0.1.0",
	"description": "Index your vault into Qdrant for semantic search with Ollama/OpenAI embeddings and graph visualization.",
	"main": "main.js",
	"scripts": {
		"dev": "node esbuild.config.mjs",
		"build": "tsc -noEmit -skipLibCheck && node esbuild.config.mjs production",
		"version": "node version-bump.mjs && git add manifest.json versions.json"
	},
	"keywords": ["obsidian", "qdrant", "semantic-search", "embeddings", "ollama", "openai"],
	"author": "Nicholai",
	"license": "MIT",
	"dependencies": {
		"@qdrant/js-client-rest": "^1.7.0"
	},
	"devDependencies": {
		"@types/node": "^16.11.6",
		"@typescript-eslint/eslint-plugin": "5.29.0",
```
177 src/chunking/chunker.ts (new file)
@@ -0,0 +1,177 @@
```typescript
import { ChunkMetadata, ExtractedContent, ChunkingSettings } from '../types';
import { SimpleTokenizer } from './tokenizer';

export class HybridChunker {
	private settings: ChunkingSettings;

	constructor(settings: ChunkingSettings) {
		this.settings = settings;
	}

	async chunk(content: ExtractedContent): Promise<ChunkMetadata[]> {
		const chunks: ChunkMetadata[] = [];

		if (content.text.trim().length === 0) {
			return chunks;
		}

		// For markdown files, try semantic chunking first
		if (content.metadata.ext === 'md') {
			const semanticChunks = this.chunkByHeadings(content);
			if (semanticChunks.length > 0) {
				return semanticChunks;
			}
		}

		// Fall back to size-based chunking
		return this.chunkBySize(content);
	}

	private chunkByHeadings(content: ExtractedContent): ChunkMetadata[] {
		const chunks: ChunkMetadata[] = [];
		const text = content.text;

		// Split by headings (h1, h2, h3). Collect all matches first, then
		// derive each section's end from the start of the next heading
		// (or the end of the text).
		const headingRegex = /^(#{1,3})\s+(.+)$/gm;
		const sections: Array<{ level: number; title: string; start: number; end: number }> = [];

		const matches = Array.from(text.matchAll(headingRegex));
		for (let i = 0; i < matches.length; i++) {
			const match = matches[i];
			sections.push({
				level: match[1].length,
				title: match[2].trim(),
				start: match.index!,
				end: i + 1 < matches.length ? matches[i + 1].index! : text.length
			});
		}

		// If no headings were found, fall back to size-based chunking
		if (sections.length === 0) {
			return this.chunkBySize(content);
		}

		// Create chunks for each section
		for (let i = 0; i < sections.length; i++) {
			const section = sections[i];
			const sectionText = text.substring(section.start, section.end).trim();

			if (sectionText.length === 0) continue;

			const estimatedTokens = SimpleTokenizer.estimateTokens(sectionText);

			if (estimatedTokens <= this.settings.maxTokens) {
				// Section fits in one chunk
				chunks.push(this.createChunkMetadata(content, sectionText, section.start, section.end, i));
			} else {
				// Section is too large, split it further
				const subChunks = this.splitLargeSection(content, sectionText, section.start, i);
				chunks.push(...subChunks);
			}
		}

		return chunks;
	}

	private splitLargeSection(
		content: ExtractedContent,
		sectionText: string,
		sectionStart: number,
		baseChunkIndex: number
	): ChunkMetadata[] {
		const chunks: ChunkMetadata[] = [];
		const subChunks = SimpleTokenizer.createOverlappingChunks(
			sectionText,
			this.settings.targetTokens,
			this.settings.overlapTokens
		);

		for (let i = 0; i < subChunks.length; i++) {
			const subChunk = subChunks[i];
			const start = sectionStart + subChunk.start;
			const end = sectionStart + subChunk.end;

			chunks.push(this.createChunkMetadata(
				content,
				subChunk.text,
				start,
				end,
				baseChunkIndex * 1000 + i // Ensure unique chunk indices
			));
		}

		return chunks;
	}

	private chunkBySize(content: ExtractedContent): ChunkMetadata[] {
		const chunks: ChunkMetadata[] = [];
		const text = content.text;

		const textChunks = SimpleTokenizer.createOverlappingChunks(
			text,
			this.settings.targetTokens,
			this.settings.overlapTokens
		);

		for (let i = 0; i < textChunks.length; i++) {
			const chunk = textChunks[i];
			chunks.push(this.createChunkMetadata(
				content,
				chunk.text,
				chunk.start,
				chunk.end,
				i
			));
		}

		return chunks;
	}

	private createChunkMetadata(
		content: ExtractedContent,
		chunkText: string,
		start: number,
		end: number,
		chunkIndex: number
	): ChunkMetadata {
```
||||||
|
return {
|
||||||
|
...content.metadata,
|
||||||
|
chunk_index: chunkIndex,
|
||||||
|
chunk_start: start,
|
||||||
|
chunk_end: end
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Estimate the total number of chunks that would be created for a given text
|
||||||
|
*/
|
||||||
|
estimateChunkCount(text: string): number {
|
||||||
|
const estimatedTokens = SimpleTokenizer.estimateTokens(text);
|
||||||
|
const chunksPerTarget = Math.ceil(estimatedTokens / this.settings.targetTokens);
|
||||||
|
return Math.max(1, chunksPerTarget);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get chunk statistics for a text
|
||||||
|
*/
|
||||||
|
getChunkStats(text: string): {
|
||||||
|
estimatedTokens: number;
|
||||||
|
estimatedChunks: number;
|
||||||
|
averageChunkSize: number;
|
||||||
|
} {
|
||||||
|
const estimatedTokens = SimpleTokenizer.estimateTokens(text);
|
||||||
|
const estimatedChunks = this.estimateChunkCount(text);
|
||||||
|
const averageChunkSize = estimatedTokens / estimatedChunks;
|
||||||
|
|
||||||
|
return {
|
||||||
|
estimatedTokens,
|
||||||
|
estimatedChunks,
|
||||||
|
averageChunkSize
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
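The heading scan in `chunkByHeadings` uses a slightly unusual regex idiom: a second `exec` call peeks ahead to find where the current section ends, and `lastIndex` is then rewound so the peeked heading is not skipped. A self-contained sketch of that idiom, over a hypothetical three-heading document (the sample text is ours, not from the plugin):

```typescript
// Record each h1-h3 heading with the [start, end) span of text it owns.
const text = '# A\naaa\n## B\nbbb\n# C\nccc';
const headingRegex = /^(#{1,3})\s+(.+)$/gm;
const sections: Array<{ title: string; start: number; end: number }> = [];

let match: RegExpExecArray | null;
while ((match = headingRegex.exec(text)) !== null) {
  const start = match.index;
  const next = headingRegex.exec(text);              // peek at the following heading
  const end = next ? next.index : text.length;       // section runs up to it (or to EOF)
  sections.push({ title: match[2].trim(), start, end });
  headingRegex.lastIndex = start + match[0].length;  // rewind so the peeked heading is revisited
}

console.log(sections); // A owns [0,8), B owns [8,17), C owns [17,24)
```

Each heading owns the text up to the next heading; the rewind is what keeps the `while` loop from swallowing every other heading.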
src/chunking/tokenizer.ts (new file, 142 lines)
export class SimpleTokenizer {
  private static readonly CHARS_PER_TOKEN = 4; // Rough approximation

  /**
   * Estimate the number of tokens in a text using character count.
   * This is a simple approximation - for production, consider using a proper tokenizer.
   */
  static estimateTokens(text: string): number {
    if (!text || text.length === 0) return 0;
    return Math.ceil(text.length / this.CHARS_PER_TOKEN);
  }

  /**
   * Truncate text to approximately fit within the token limit
   */
  static truncateToTokens(text: string, maxTokens: number): string {
    const estimatedTokens = this.estimateTokens(text);
    if (estimatedTokens <= maxTokens) {
      return text;
    }

    const maxChars = maxTokens * this.CHARS_PER_TOKEN;
    return text.substring(0, maxChars);
  }

  /**
   * Split text into sentences for better chunking boundaries
   */
  static splitIntoSentences(text: string): string[] {
    // Simple sentence splitting - could be improved with a more sophisticated regex
    return text
      .split(/[.!?]+/)
      .map(s => s.trim())
      .filter(s => s.length > 0);
  }

  /**
   * Split text into paragraphs
   */
  static splitIntoParagraphs(text: string): string[] {
    return text
      .split(/\n\s*\n/)
      .map(p => p.trim())
      .filter(p => p.length > 0);
  }

  /**
   * Split text into lines
   */
  static splitIntoLines(text: string): string[] {
    return text
      .split(/\n/)
      .map(l => l.trim())
      .filter(l => l.length > 0);
  }

  /**
   * Find the best split point within a text to stay under the token limit.
   * Prefers sentence boundaries, then paragraph boundaries, then line boundaries.
   * Never returns 0: if the first unit at one granularity already exceeds the
   * limit, it falls through to the next granularity (and ultimately to a hard cut),
   * so callers cannot end up with an empty chunk.
   */
  static findBestSplitPoint(text: string, maxTokens: number): number {
    const maxChars = maxTokens * this.CHARS_PER_TOKEN;

    if (text.length <= maxChars) {
      return text.length;
    }

    // Try to split at sentence boundaries
    const sentences = this.splitIntoSentences(text);
    let currentLength = 0;

    for (let i = 0; i < sentences.length; i++) {
      const sentenceLength = sentences[i].length + (i > 0 ? 1 : 0); // +1 for punctuation
      if (currentLength + sentenceLength > maxChars) {
        if (currentLength > 0) return currentLength;
        break; // first sentence alone exceeds the limit; try coarser boundaries
      }
      currentLength += sentenceLength;
    }

    // Fall back to paragraph boundaries
    const paragraphs = this.splitIntoParagraphs(text);
    currentLength = 0;

    for (let i = 0; i < paragraphs.length; i++) {
      const paragraphLength = paragraphs[i].length + (i > 0 ? 2 : 0); // +2 for paragraph break
      if (currentLength + paragraphLength > maxChars) {
        if (currentLength > 0) return currentLength;
        break;
      }
      currentLength += paragraphLength;
    }

    // Fall back to line boundaries
    const lines = this.splitIntoLines(text);
    currentLength = 0;

    for (let i = 0; i < lines.length; i++) {
      const lineLength = lines[i].length + (i > 0 ? 1 : 0); // +1 for newline
      if (currentLength + lineLength > maxChars) {
        if (currentLength > 0) return currentLength;
        break;
      }
      currentLength += lineLength;
    }

    // Last resort: hard truncation
    return maxChars;
  }

  /**
   * Create overlapping chunks with proper boundaries
   */
  static createOverlappingChunks(
    text: string,
    targetTokens: number,
    overlapTokens: number
  ): Array<{ text: string; start: number; end: number }> {
    const chunks: Array<{ text: string; start: number; end: number }> = [];
    let start = 0;

    while (start < text.length) {
      const end = this.findBestSplitPoint(text.substring(start), targetTokens) + start;
      const chunkText = text.substring(start, end).trim();

      if (chunkText.length > 0) {
        chunks.push({ text: chunkText, start, end });
      }

      if (end >= text.length) break;

      // Move the start position back by the overlap, but always make progress
      const overlapChars = overlapTokens * this.CHARS_PER_TOKEN;
      start = Math.max(start + 1, end - overlapChars);
    }

    return chunks;
  }
}
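The stepping arithmetic in `createOverlappingChunks` is easiest to see with the boundary search stripped away. This sketch (function name ours) applies the same 4-characters-per-token approximation and the same `start = max(start + 1, end - overlap)` rule, without the sentence/paragraph preference:

```typescript
const CHARS_PER_TOKEN = 4;

function sketchSpans(textLength: number, targetTokens: number, overlapTokens: number): Array<{ start: number; end: number }> {
  const spans: Array<{ start: number; end: number }> = [];
  const step = targetTokens * CHARS_PER_TOKEN;     // max chunk width in characters
  const overlap = overlapTokens * CHARS_PER_TOKEN; // characters shared between neighbours
  let start = 0;
  while (start < textLength) {
    const end = Math.min(start + step, textLength);
    spans.push({ start, end });
    if (end >= textLength) break;
    start = Math.max(start + 1, end - overlap);    // rewind by the overlap, but always advance
  }
  return spans;
}

const spans = sketchSpans(100, 10, 2); // 40-char chunks with an 8-char overlap
console.log(spans); // [0,40), [32,72), [64,100)
```

Each chunk starts `overlap` characters before the previous one ended, so neighbouring embeddings share context; the `start + 1` floor guarantees termination even with a pathological overlap setting.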
src/embeddings/index.ts (new file, 18 lines)
import { EmbeddingProvider, EmbeddingProviderInterface, PluginSettings } from '../types';
import { OllamaEmbeddingProvider } from './ollama';
import { OpenAIEmbeddingProvider } from './openai';

export function createEmbeddingProvider(settings: PluginSettings): EmbeddingProviderInterface {
  switch (settings.embedding.provider) {
    case EmbeddingProvider.OLLAMA:
      return new OllamaEmbeddingProvider(settings.embedding.ollama);
    case EmbeddingProvider.OPENAI:
      return new OpenAIEmbeddingProvider(settings.embedding.openai);
    default:
      throw new Error(`Unsupported embedding provider: ${settings.embedding.provider}`);
  }
}

export { BaseEmbeddingProvider } from './provider';
export { OllamaEmbeddingProvider } from './ollama';
export { OpenAIEmbeddingProvider } from './openai';
src/embeddings/ollama.ts (new file, 119 lines)
import { requestUrl, RequestUrlParam } from 'obsidian';
import { BaseEmbeddingProvider } from './provider';
import { OllamaSettings } from '../types';

export interface OllamaEmbeddingRequest {
  model: string;
  prompt: string;
}

export interface OllamaEmbeddingResponse {
  embedding: number[];
}

export class OllamaEmbeddingProvider extends BaseEmbeddingProvider {
  private settings: OllamaSettings;
  private dimension: number | null = null;

  constructor(settings: OllamaSettings) {
    super();
    this.settings = settings;
  }

  getName(): string {
    return `Ollama (${this.settings.model})`;
  }

  async getDimension(): Promise<number> {
    if (this.dimension === null) {
      // Embed a small test text once to discover the dimension
      const testEmbedding = await this.embed(['test']);
      this.dimension = testEmbedding[0].length;
    }
    return this.dimension;
  }

  async embed(texts: string[]): Promise<number[][]> {
    this.validateTexts(texts);

    const results: number[][] = [];
    const batchSize = this.settings.batchSize;
    const maxConcurrency = this.settings.maxConcurrency;

    // Process texts in batches, keeping at most maxConcurrency requests in flight
    for (let i = 0; i < texts.length; i += batchSize) {
      const batch = texts.slice(i, i + batchSize);

      for (let j = 0; j < batch.length; j += maxConcurrency) {
        const concurrentBatch = batch.slice(j, j + maxConcurrency);
        // Await each group before starting the next so the concurrency limit holds
        const groupResults = await Promise.all(concurrentBatch.map(text => this.embedSingle(text)));
        results.push(...groupResults);
      }
    }

    return results;
  }

  private async embedSingle(text: string): Promise<number[]> {
    return this.retryWithBackoff(async () => {
      const request: OllamaEmbeddingRequest = {
        model: this.settings.model,
        prompt: text
      };

      const requestParams: RequestUrlParam = {
        url: `${this.settings.url}/api/embeddings`,
        method: 'POST',
        headers: {
          'Content-Type': 'application/json'
        },
        body: JSON.stringify(request)
      };

      const response = await requestUrl(requestParams);

      if (response.status !== 200) {
        throw new Error(`Ollama API error: ${response.status} ${response.text}`);
      }

      const data: OllamaEmbeddingResponse = JSON.parse(response.text);
      return data.embedding;
    });
  }

  async testConnection(): Promise<boolean> {
    try {
      await this.embedSingle('test');
      return true;
    } catch (error) {
      console.error('Ollama connection test failed:', error);
      return false;
    }
  }

  async getAvailableModels(): Promise<string[]> {
    try {
      const requestParams: RequestUrlParam = {
        url: `${this.settings.url}/api/tags`,
        method: 'GET'
      };

      const response = await requestUrl(requestParams);

      if (response.status !== 200) {
        throw new Error(`Ollama API error: ${response.status} ${response.text}`);
      }

      const data = JSON.parse(response.text);
      return data.models?.map((model: any) => model.name) || [];
    } catch (error) {
      console.error('Failed to get Ollama models:', error);
      return [];
    }
  }
}
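The intent of the nested loops in `embed` is a generic pattern: process items in groups, awaiting each group before starting the next so at most `maxConcurrency` requests are ever in flight. A standalone sketch (the helper name is ours, not the plugin's):

```typescript
async function mapWithConcurrency<T, R>(
  items: T[],
  maxConcurrency: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += maxConcurrency) {
    const group = items.slice(i, i + maxConcurrency);
    // Awaiting here, inside the loop, is what enforces the concurrency limit
    results.push(...await Promise.all(group.map(fn)));
  }
  return results;
}

mapWithConcurrency([1, 2, 3, 4, 5], 2, async n => n * n)
  .then(squares => console.log(squares)); // [1, 4, 9, 16, 25]
```

Collecting the group promises into an outer array and awaiting them all at the end would start every group at once, which is exactly the failure mode this pattern avoids.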
src/embeddings/openai.ts (new file, 127 lines)
import { requestUrl, RequestUrlParam } from 'obsidian';
import { BaseEmbeddingProvider } from './provider';
import { OpenAISettings } from '../types';

export interface OpenAIEmbeddingRequest {
  input: string[];
  model: string;
}

export interface OpenAIEmbeddingResponse {
  data: Array<{
    embedding: number[];
    index: number;
  }>;
  model: string;
  usage: {
    prompt_tokens: number;
    total_tokens: number;
  };
}

export class OpenAIEmbeddingProvider extends BaseEmbeddingProvider {
  private settings: OpenAISettings;
  private dimension: number | null = null;

  constructor(settings: OpenAISettings) {
    super();
    this.settings = settings;
  }

  getName(): string {
    return `OpenAI (${this.settings.model})`;
  }

  async getDimension(): Promise<number> {
    if (this.dimension === null) {
      // Embed a small test text once to discover the dimension
      const testEmbedding = await this.embed(['test']);
      this.dimension = testEmbedding[0].length;
    }
    return this.dimension;
  }

  async embed(texts: string[]): Promise<number[][]> {
    this.validateTexts(texts);

    const results: number[][] = [];
    const batchSize = this.settings.batchSize;
    const maxConcurrency = this.settings.maxConcurrency;

    // Process texts in batches, keeping at most maxConcurrency requests in flight
    for (let i = 0; i < texts.length; i += batchSize) {
      const batch = texts.slice(i, i + batchSize);

      for (let j = 0; j < batch.length; j += maxConcurrency) {
        const concurrentBatch = batch.slice(j, j + maxConcurrency);
        const concurrentPromises = concurrentBatch.map(text => this.embedSingle([text]));
        const batchResult = await Promise.all(concurrentPromises);
        results.push(...batchResult.flat());
      }
    }

    return results;
  }

  private async embedSingle(texts: string[]): Promise<number[][]> {
    return this.retryWithBackoff(async () => {
      const request: OpenAIEmbeddingRequest = {
        input: texts,
        model: this.settings.model
      };

      const requestParams: RequestUrlParam = {
        url: 'https://api.openai.com/v1/embeddings',
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
          'Authorization': `Bearer ${this.settings.apiKey}`
        },
        body: JSON.stringify(request)
      };

      const response = await requestUrl(requestParams);

      if (response.status !== 200) {
        const errorText = response.text;
        let errorMessage = `OpenAI API error: ${response.status}`;

        try {
          const errorData = JSON.parse(errorText);
          errorMessage += ` - ${errorData.error?.message || errorText}`;
        } catch {
          errorMessage += ` - ${errorText}`;
        }

        throw new Error(errorMessage);
      }

      const data: OpenAIEmbeddingResponse = JSON.parse(response.text);

      // Sort by index to maintain input order
      const sortedData = data.data.sort((a, b) => a.index - b.index);
      return sortedData.map(item => item.embedding);
    });
  }

  async testConnection(): Promise<boolean> {
    try {
      await this.embedSingle(['test']);
      return true;
    } catch (error) {
      console.error('OpenAI connection test failed:', error);
      return false;
    }
  }

  async getUsage(): Promise<{
    promptTokens: number;
    totalTokens: number;
  } | null> {
    // This would require storing usage data from previous requests.
    // For now, return null - could be implemented with a usage tracker.
    return null;
  }
}
src/embeddings/provider.ts (new file, 69 lines)
import { EmbeddingProviderInterface } from '../types';

export abstract class BaseEmbeddingProvider implements EmbeddingProviderInterface {
  abstract embed(texts: string[]): Promise<number[][]>;
  abstract getDimension(): Promise<number>;
  abstract getName(): string;

  protected async delay(ms: number): Promise<void> {
    return new Promise(resolve => setTimeout(resolve, ms));
  }

  protected async retryWithBackoff<T>(
    operation: () => Promise<T>,
    maxRetries: number = 3,
    baseDelay: number = 1000
  ): Promise<T> {
    let lastError: Error;

    for (let attempt = 0; attempt <= maxRetries; attempt++) {
      try {
        return await operation();
      } catch (error) {
        lastError = error as Error;

        if (attempt === maxRetries) {
          throw lastError;
        }

        const delay = baseDelay * Math.pow(2, attempt) + Math.random() * 1000;
        console.log(`Embedding request failed, retrying in ${delay}ms (attempt ${attempt + 1}/${maxRetries + 1})`);
        await this.delay(delay);
      }
    }

    throw lastError!;
  }

  protected estimateTokens(text: string): number {
    // Simple token estimation: roughly 4 characters per token.
    // This is a rough approximation - for production, consider using a proper tokenizer.
    return Math.ceil(text.length / 4);
  }

  protected validateTexts(texts: string[]): void {
    if (!texts || texts.length === 0) {
      throw new Error('No texts provided for embedding');
    }

    for (const text of texts) {
      if (typeof text !== 'string') {
        throw new Error('All texts must be strings');
      }
      if (text.trim().length === 0) {
        throw new Error('Empty texts are not allowed');
      }
    }
  }

  protected truncateText(text: string, maxTokens: number): string {
    const estimatedTokens = this.estimateTokens(text);
    if (estimatedTokens <= maxTokens) {
      return text;
    }

    // Simple truncation - in production, you'd want to truncate at word boundaries
    const maxChars = maxTokens * 4;
    return text.substring(0, maxChars) + '...';
  }
}
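`retryWithBackoff` produces the classic exponential schedule with jitter: `baseDelay * 2^attempt` plus up to a second of random noise. Pinning the jitter makes the schedule easy to check (function name ours):

```typescript
// delay = baseDelay * 2^attempt + jitter, mirroring retryWithBackoff above
function backoffDelay(attempt: number, baseDelay = 1000, jitter: () => number = () => Math.random() * 1000): number {
  return baseDelay * Math.pow(2, attempt) + jitter();
}

// With jitter pinned to 0, attempts 0..2 wait 1s, 2s, 4s.
const delays = [0, 1, 2].map(a => backoffDelay(a, 1000, () => 0));
console.log(delays); // [1000, 2000, 4000]
```

The random jitter matters in practice: it spreads out retries from many chunks failing at once, so the embedding server is not hit by a synchronized thundering herd.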
src/extractors/base.ts (new file, 170 lines)
import { TFile, App } from 'obsidian';
import { ExtractedContent, ChunkMetadata } from '../types';

export abstract class BaseExtractor {
  protected app: App;

  constructor(app: App) {
    this.app = app;
  }

  abstract canHandle(file: TFile): boolean;
  abstract extract(file: TFile): Promise<ExtractedContent>;

  protected async getFileContent(file: TFile): Promise<string> {
    try {
      return await this.app.vault.read(file);
    } catch (error) {
      console.error(`Failed to read file ${file.path}:`, error);
      throw error;
    }
  }

  protected getFileStats(file: TFile): { modified: number; created: number } {
    const stat = file.stat;
    return {
      modified: stat.mtime,
      created: stat.ctime
    };
  }

  protected createBaseMetadata(file: TFile): ChunkMetadata {
    const stats = this.getFileStats(file);
    const ext = file.extension || '';

    return {
      path: file.path,
      ext,
      mime: this.getMimeType(ext),
      title: file.basename,
      h1: [],
      tags: [],
      aliases: [],
      links: [],
      modified: stats.modified,
      created: stats.created,
      model: '', // Will be set by the embedding provider
      chunk_index: 0,
      chunk_start: 0,
      chunk_end: 0,
      fm: {}
    };
  }

  private getMimeType(ext: string): string {
    const mimeTypes: Record<string, string> = {
      'md': 'text/markdown',
      'txt': 'text/plain',
      'pdf': 'application/pdf',
      'png': 'image/png',
      'jpg': 'image/jpeg',
      'jpeg': 'image/jpeg',
      'gif': 'image/gif',
      'svg': 'image/svg+xml',
      'js': 'application/javascript',
      'ts': 'application/typescript',
      'json': 'application/json',
      'html': 'text/html',
      'css': 'text/css',
      'py': 'text/x-python',
      'java': 'text/x-java',
      'cpp': 'text/x-c++',
      'c': 'text/x-c',
      'go': 'text/x-go',
      'rs': 'text/x-rust',
      'php': 'text/x-php',
      'rb': 'text/x-ruby',
      'sh': 'text/x-shellscript',
      'yml': 'text/x-yaml',
      'yaml': 'text/x-yaml',
      'xml': 'text/xml'
    };

    return mimeTypes[ext.toLowerCase()] || 'application/octet-stream';
  }

  protected extractFrontmatter(content: string): { frontmatter: Record<string, any>; body: string } {
    const frontmatterRegex = /^---\s*\n([\s\S]*?)\n---\s*\n([\s\S]*)$/;
    const match = content.match(frontmatterRegex);

    if (!match) {
      return { frontmatter: {}, body: content };
    }

    const frontmatterText = match[1];
    const body = match[2];

    try {
      // Simple YAML-like parsing for basic frontmatter
      const frontmatter: Record<string, any> = {};
      const lines = frontmatterText.split('\n');

      for (const line of lines) {
        const trimmed = line.trim();
        if (trimmed && !trimmed.startsWith('#')) {
          const colonIndex = trimmed.indexOf(':');
          if (colonIndex > 0) {
            const key = trimmed.substring(0, colonIndex).trim();
            let value = trimmed.substring(colonIndex + 1).trim();

            // Remove quotes if present
            if ((value.startsWith('"') && value.endsWith('"')) ||
                (value.startsWith("'") && value.endsWith("'"))) {
              value = value.slice(1, -1);
            }

            // Parse arrays (simple format: [item1, item2, item3])
            if (value.startsWith('[') && value.endsWith(']')) {
              const arrayContent = value.slice(1, -1);
              frontmatter[key] = arrayContent.split(',').map(item => item.trim().replace(/^["']|["']$/g, ''));
            } else {
              frontmatter[key] = value;
            }
          }
        }
      }

      return { frontmatter, body };
    } catch (error) {
      console.warn('Failed to parse frontmatter:', error);
      return { frontmatter: {}, body: content };
    }
  }

  protected extractMarkdownElements(content: string): {
    headings: string[];
    tags: string[];
    links: string[];
  } {
    const headings: string[] = [];
    const tags: string[] = [];
    const links: string[] = [];

    // Extract headings
    const headingRegex = /^#{1,6}\s+(.+)$/gm;
    let match;
    while ((match = headingRegex.exec(content)) !== null) {
      headings.push(match[1].trim());
    }

    // Extract tags
    const tagRegex = /#([a-zA-Z0-9_-]+)/g;
    while ((match = tagRegex.exec(content)) !== null) {
      tags.push(match[1]);
    }

    // Extract markdown links
    const linkRegex = /\[([^\]]+)\]\(([^)]+)\)/g;
    while ((match = linkRegex.exec(content)) !== null) {
      links.push(match[2]);
    }

    // Extract wiki links
    const wikiLinkRegex = /\[\[([^\]]+)\]\]/g;
    while ((match = wikiLinkRegex.exec(content)) !== null) {
      links.push(match[1]);
    }

    return { headings, tags, links };
  }
}
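The frontmatter split in `extractFrontmatter` hinges on one regex: a lazy group between the two `---` fences and a greedy group for the body. Run against a hypothetical note (the sample text is ours):

```typescript
const frontmatterRegex = /^---\s*\n([\s\S]*?)\n---\s*\n([\s\S]*)$/;

const doc = '---\ntitle: "My Note"\ntags: [a, b]\n---\n# Heading\nBody text';
const match = doc.match(frontmatterRegex);

console.log(match?.[1]); // the raw frontmatter block between the fences
console.log(match?.[2]); // the body, starting at '# Heading'
```

The lazy `([\s\S]*?)` stops at the first closing `---` fence, so a stray `---` later in the body cannot swallow half the note into the frontmatter.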
src/extractors/index.ts (new file, 94 lines)
import { App, TFile } from 'obsidian';
import { ExtractorInterface, ExtractedContent } from '../types';
import { MarkdownExtractor } from './markdown';
import { TextExtractor } from './text';
import { TextExtractorPlugin } from './textExtractor';

export class ExtractorManager implements ExtractorInterface {
  private extractors: ExtractorInterface[] = [];
  private app: App;

  constructor(app: App) {
    this.app = app;
    this.initializeExtractors();
  }

  private initializeExtractors(): void {
    // Add extractors in order of preference
    this.extractors.push(new MarkdownExtractor(this.app));
    this.extractors.push(new TextExtractor(this.app));
    this.extractors.push(new TextExtractorPlugin(this.app));
  }

  canHandle(file: TFile): boolean {
    return this.extractors.some(extractor => extractor.canHandle(file));
  }

  async extract(file: TFile): Promise<ExtractedContent> {
    const extractor = this.extractors.find(ext => ext.canHandle(file));

    if (!extractor) {
      throw new Error(`No extractor found for file: ${file.path}`);
    }

    return await extractor.extract(file);
  }

  /**
   * Get the appropriate extractor for a file
   */
  getExtractor(file: TFile): ExtractorInterface | null {
    return this.extractors.find(ext => ext.canHandle(file)) || null;
  }

  /**
   * Get the status of all extractors
   */
  getExtractorStatus(): Array<{
    name: string;
    available: boolean;
    supportedExtensions: string[];
  }> {
    return this.extractors.map(extractor => {
      if (extractor instanceof MarkdownExtractor) {
        return {
          name: 'Markdown Extractor',
          available: true,
          supportedExtensions: ['md']
        };
      } else if (extractor instanceof TextExtractor) {
        return {
          name: 'Text Extractor',
          available: true,
          supportedExtensions: ['txt', 'js', 'ts', 'json', 'html', 'css', 'py', 'java', 'cpp', 'c', 'go', 'rs', 'php', 'rb', 'sh', 'yml', 'yaml', 'xml']
        };
      } else if (extractor instanceof TextExtractorPlugin) {
        const status = (extractor as any).getStatus();
        return {
          name: 'Text Extractor Plugin',
          available: status.available,
          supportedExtensions: status.supportedFormats
        };
      }

      return {
        name: 'Unknown Extractor',
        available: false,
        supportedExtensions: []
      };
    });
  }

  /**
   * Check if the Text Extractor plugin is available
   */
  isTextExtractorAvailable(): boolean {
    const textExtractor = this.extractors.find(ext => ext instanceof TextExtractorPlugin);
    return textExtractor ? (textExtractor as any).isAvailable() : false;
  }
}

export { MarkdownExtractor } from './markdown';
export { TextExtractor } from './text';
export { TextExtractorPlugin } from './textExtractor';
src/extractors/markdown.ts (new file, 44 lines)
import { TFile } from 'obsidian';
import { BaseExtractor } from './base';
import { ExtractedContent } from '../types';

export class MarkdownExtractor extends BaseExtractor {
  canHandle(file: TFile): boolean {
    return file.extension === 'md';
  }

  async extract(file: TFile): Promise<ExtractedContent> {
    const content = await this.getFileContent(file);
    const metadata = this.createBaseMetadata(file);

    // Extract frontmatter and body
    const { frontmatter, body } = this.extractFrontmatter(content);

    // Extract markdown elements
    const { headings, tags, links } = this.extractMarkdownElements(body);

    // Update metadata with extracted information
    metadata.h1 = headings.filter(h => h.length > 0);
    metadata.tags = [...new Set(tags)]; // Remove duplicates
    metadata.links = [...new Set(links)]; // Remove duplicates
    metadata.fm = frontmatter;

    // Extract aliases from frontmatter
    if (frontmatter.aliases) {
      metadata.aliases = Array.isArray(frontmatter.aliases)
        ? frontmatter.aliases
        : [frontmatter.aliases];
    }

    // Extract title from frontmatter if present
    if (frontmatter.title) {
      metadata.title = frontmatter.title;
    }

    return {
      text: body,
      metadata,
      pageNumbers: undefined // Markdown files don't have page numbers
    };
  }
}
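The alias handling in `MarkdownExtractor.extract` accepts both YAML forms of `aliases` (a bare string or a list) and always stores a string array. A minimal standalone sketch of that branch (the `normalizeAliases` helper name is ours; the plugin inlines this logic):

```typescript
// Sketch of the Array.isArray normalization MarkdownExtractor applies to
// frontmatter aliases. Sample values are hypothetical.
function normalizeAliases(aliases: string | string[]): string[] {
  return Array.isArray(aliases) ? aliases : [aliases];
}

console.log(normalizeAliases('Qdrant'));         // ['Qdrant']
console.log(normalizeAliases(['Qdrant', 'QS'])); // ['Qdrant', 'QS']
```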
src/extractors/text.ts (Normal file, 255 lines)
@@ -0,0 +1,255 @@
import { TFile } from 'obsidian';
import { BaseExtractor } from './base';
import { ExtractedContent } from '../types';

export class TextExtractor extends BaseExtractor {
  private supportedExtensions = ['txt', 'js', 'ts', 'json', 'html', 'css', 'py', 'java', 'cpp', 'c', 'go', 'rs', 'php', 'rb', 'sh', 'yml', 'yaml', 'xml'];

  canHandle(file: TFile): boolean {
    return this.supportedExtensions.includes(file.extension || '');
  }

  async extract(file: TFile): Promise<ExtractedContent> {
    const content = await this.getFileContent(file);
    const metadata = this.createBaseMetadata(file);

    // For code files, we might want to extract some basic structure
    if (this.isCodeFile(file)) {
      const codeElements = this.extractCodeElements(content, file.extension || '');
      metadata.fm = {
        ...metadata.fm,
        language: file.extension,
        functions: codeElements.functions,
        classes: codeElements.classes,
        imports: codeElements.imports
      };
    }

    return {
      text: content,
      metadata,
      pageNumbers: undefined
    };
  }

  private isCodeFile(file: TFile): boolean {
    const codeExtensions = ['js', 'ts', 'py', 'java', 'cpp', 'c', 'go', 'rs', 'php', 'rb'];
    return codeExtensions.includes(file.extension || '');
  }

  private extractCodeElements(content: string, extension: string): {
    functions: string[];
    classes: string[];
    imports: string[];
  } {
    const functions: string[] = [];
    const classes: string[] = [];
    const imports: string[] = [];

    switch (extension) {
      case 'js':
      case 'ts':
        this.extractJSElements(content, functions, classes, imports);
        break;
      case 'py':
        this.extractPythonElements(content, functions, classes, imports);
        break;
      case 'java':
        this.extractJavaElements(content, functions, classes, imports);
        break;
      case 'cpp':
      case 'c':
        this.extractCElements(content, functions, classes, imports);
        break;
      case 'go':
        this.extractGoElements(content, functions, classes, imports);
        break;
      case 'rs':
        this.extractRustElements(content, functions, classes, imports);
        break;
      case 'php':
        this.extractPhpElements(content, functions, classes, imports);
        break;
      case 'rb':
        this.extractRubyElements(content, functions, classes, imports);
        break;
    }

    return { functions, classes, imports };
  }

  private extractJSElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function declarations
    const functionRegex = /(?:function\s+(\w+)|const\s+(\w+)\s*=\s*(?:async\s+)?\(|(\w+)\s*:\s*(?:async\s+)?\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      const funcName = match[1] || match[2] || match[3];
      if (funcName) functions.push(funcName);
    }

    // Extract class declarations
    const classRegex = /class\s+(\w+)/g;
    while ((match = classRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract imports
    const importRegex = /import\s+(?:.*\s+from\s+)?['"]([^'"]+)['"]/g;
    while ((match = importRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }

  private extractPythonElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function definitions
    const functionRegex = /def\s+(\w+)\s*\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract class definitions
    const classRegex = /class\s+(\w+)/g;
    while ((match = classRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract imports
    const importRegex = /(?:from\s+(\S+)\s+import|import\s+(\S+))/g;
    while ((match = importRegex.exec(content)) !== null) {
      imports.push(match[1] || match[2]);
    }
  }

  private extractJavaElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract method declarations
    const methodRegex = /(?:public|private|protected)?\s*(?:static\s+)?\s*(?:void|\w+)\s+(\w+)\s*\(/g;
    let match;
    while ((match = methodRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract class declarations
    const classRegex = /(?:public\s+)?class\s+(\w+)/g;
    while ((match = classRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract imports
    const importRegex = /import\s+([^;]+);/g;
    while ((match = importRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }

  private extractCElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function declarations
    const functionRegex = /(?:static\s+)?\s*(?:void|\w+)\s+(\w+)\s*\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract struct declarations
    const structRegex = /struct\s+(\w+)/g;
    while ((match = structRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract includes
    const includeRegex = /#include\s*[<"]([^>"]+)[>"]/g;
    while ((match = includeRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }

  private extractGoElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function declarations
    const functionRegex = /func\s+(?:\(\w+\s+\*?\w+\)\s+)?(\w+)\s*\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract type declarations
    const typeRegex = /type\s+(\w+)\s+(?:struct|interface)/g;
    while ((match = typeRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract imports
    const importRegex = /import\s+(?:\(([^)]+)\)|['"]([^'"]+)['"])/g;
    while ((match = importRegex.exec(content)) !== null) {
      if (match[1]) {
        // Multi-line import
        const importsList = match[1].split('\n').map(imp => imp.trim().replace(/['"]/g, ''));
        imports.push(...importsList);
      } else if (match[2]) {
        imports.push(match[2]);
      }
    }
  }

  private extractRustElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function declarations
    const functionRegex = /fn\s+(\w+)\s*\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract struct and enum declarations
    const structRegex = /(?:struct|enum)\s+(\w+)/g;
    while ((match = structRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract use statements
    const useRegex = /use\s+([^;]+);/g;
    while ((match = useRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }

  private extractPhpElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract function declarations
    const functionRegex = /function\s+(\w+)\s*\(/g;
    let match;
    while ((match = functionRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract class declarations
    const classRegex = /class\s+(\w+)/g;
    while ((match = classRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract require/include statements
    const requireRegex = /(?:require|include)(?:_once)?\s*['"]([^'"]+)['"]/g;
    while ((match = requireRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }

  private extractRubyElements(content: string, functions: string[], classes: string[], imports: string[]): void {
    // Extract method definitions
    const methodRegex = /def\s+(\w+)/g;
    let match;
    while ((match = methodRegex.exec(content)) !== null) {
      functions.push(match[1]);
    }

    // Extract class definitions
    const classRegex = /class\s+(\w+)/g;
    while ((match = classRegex.exec(content)) !== null) {
      classes.push(match[1]);
    }

    // Extract require statements
    const requireRegex = /require\s+['"]([^'"]+)['"]/g;
    while ((match = requireRegex.exec(content)) !== null) {
      imports.push(match[1]);
    }
  }
}
src/extractors/textExtractor.ts (Normal file, 178 lines)
@@ -0,0 +1,178 @@
import { TFile, App } from 'obsidian';
import { BaseExtractor } from './base';
import { ExtractedContent } from '../types';

export class TextExtractorPlugin extends BaseExtractor {
  private textExtractorPlugin: any = null;

  constructor(app: App) {
    super(app);
    this.initializeTextExtractor();
  }

  private initializeTextExtractor(): void {
    // Check if Text Extractor plugin is installed and enabled
    const plugins = (this.app as any).plugins;
    if (plugins && plugins.plugins && plugins.plugins['text-extractor']) {
      this.textExtractorPlugin = plugins.plugins['text-extractor'];
    }
  }

  canHandle(file: TFile): boolean {
    if (!this.textExtractorPlugin) {
      return false;
    }

    // Check if Text Extractor can handle this file type
    const supportedExtensions = ['pdf', 'png', 'jpg', 'jpeg', 'gif', 'svg'];
    return supportedExtensions.includes(file.extension || '');
  }

  async extract(file: TFile): Promise<ExtractedContent> {
    if (!this.textExtractorPlugin) {
      throw new Error('Text Extractor plugin not available');
    }

    const metadata = this.createBaseMetadata(file);

    try {
      // Try to get cached text from Text Extractor
      const cachedText = await this.getCachedText(file);

      if (cachedText) {
        metadata.ocr = this.isImageFile(file);

        return {
          text: cachedText,
          metadata,
          pageNumbers: this.isPdfFile(file) ? [1] : undefined // Simple assumption for PDFs
        };
      }

      // If no cached text, try to extract directly
      const extractedText = await this.extractTextDirectly(file);

      if (extractedText) {
        metadata.ocr = this.isImageFile(file);

        return {
          text: extractedText,
          metadata,
          pageNumbers: this.isPdfFile(file) ? [1] : undefined
        };
      }

      // Fallback: return empty content
      return {
        text: '',
        metadata,
        pageNumbers: undefined
      };
    } catch (error) {
      console.error(`Failed to extract text from ${file.path}:`, error);

      // Return empty content on error
      return {
        text: '',
        metadata,
        pageNumbers: undefined
      };
    }
  }

  private async getCachedText(file: TFile): Promise<string | null> {
    try {
      // Access Text Extractor's cache
      if (this.textExtractorPlugin.api && this.textExtractorPlugin.api.getCachedText) {
        return await this.textExtractorPlugin.api.getCachedText(file);
      }

      // Alternative: try to access the cache directly
      if (this.textExtractorPlugin.cache) {
        const cacheKey = file.path;
        return this.textExtractorPlugin.cache[cacheKey] || null;
      }

      return null;
    } catch (error) {
      console.warn('Failed to get cached text from Text Extractor:', error);
      return null;
    }
  }

  private async extractTextDirectly(file: TFile): Promise<string | null> {
    try {
      // Try to use Text Extractor's extraction API
      if (this.textExtractorPlugin.api && this.textExtractorPlugin.api.extractText) {
        return await this.textExtractorPlugin.api.extractText(file);
      }

      // Alternative: trigger extraction and wait for completion
      if (this.textExtractorPlugin.extractText) {
        return await this.textExtractorPlugin.extractText(file);
      }

      return null;
    } catch (error) {
      console.warn('Failed to extract text directly from Text Extractor:', error);
      return null;
    }
  }

  private isPdfFile(file: TFile): boolean {
    return file.extension === 'pdf';
  }

  private isImageFile(file: TFile): boolean {
    const imageExtensions = ['png', 'jpg', 'jpeg', 'gif', 'svg'];
    return imageExtensions.includes(file.extension || '');
  }

  /**
   * Check if Text Extractor plugin is available and working
   */
  isAvailable(): boolean {
    return this.textExtractorPlugin !== null;
  }

  /**
   * Get the status of Text Extractor plugin
   */
  getStatus(): {
    available: boolean;
    version?: string;
    supportedFormats: string[];
  } {
    const supportedFormats = ['pdf', 'png', 'jpg', 'jpeg', 'gif', 'svg'];

    if (!this.textExtractorPlugin) {
      return {
        available: false,
        supportedFormats
      };
    }

    return {
      available: true,
      version: this.textExtractorPlugin.manifest?.version,
      supportedFormats
    };
  }

  /**
   * Force refresh of cached text for a file
   */
  async refreshCache(file: TFile): Promise<boolean> {
    try {
      // Optional chaining guards the case where the plugin is not installed
      if (this.textExtractorPlugin?.api?.refreshCache) {
        await this.textExtractorPlugin.api.refreshCache(file);
        return true;
      }
      return false;
    } catch (error) {
      console.error('Failed to refresh Text Extractor cache:', error);
      return false;
    }
  }
}
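The extraction path probes the host plugin defensively: prefer a published `api` surface, fall back to a direct method, otherwise report nothing. A self-contained sketch of that pattern (the `MaybeTextExtractor` shape and `extractVia` name are assumptions for illustration, not the real Text Extractor API):

```typescript
// Sketch of defensive cross-plugin API probing, as used by
// TextExtractorPlugin. The plugin shape here is hypothetical.
type MaybeTextExtractor = {
  api?: { extractText?: (path: string) => Promise<string> };
  extractText?: (path: string) => Promise<string>;
} | null;

async function extractVia(plugin: MaybeTextExtractor, path: string): Promise<string | null> {
  if (!plugin) return null;                                   // plugin not installed
  if (plugin.api?.extractText) return plugin.api.extractText(path);
  if (plugin.extractText) return plugin.extractText(path);    // legacy entry point
  return null;                                                // no usable entry point
}

// Stubbed host plugin for demonstration.
const stub: MaybeTextExtractor = { api: { extractText: async (p) => `text of ${p}` } };
extractVia(stub, 'scan.pdf').then(r => console.log(r)); // 'text of scan.pdf'
```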
src/graph/graphView.ts (Normal file, 166 lines)
@@ -0,0 +1,166 @@
import { ItemView, WorkspaceLeaf } from 'obsidian';
import { GraphData, GraphNode, GraphEdge } from '../types';

export const GRAPH_VIEW_TYPE = 'qdrant-graph-view';

export class GraphView extends ItemView {
  private graphData: GraphData | null = null;
  private svg: SVGElement | null = null;

  constructor(leaf: WorkspaceLeaf) {
    super(leaf);
  }

  getViewType(): string {
    return GRAPH_VIEW_TYPE;
  }

  getDisplayText(): string {
    return 'Qdrant Graph';
  }

  getIcon(): string {
    return 'graph';
  }

  async onOpen() {
    const container = this.contentEl;
    container.empty();

    // Create graph container
    const graphContainer = container.createEl('div', { cls: 'qdrant-graph-container' });

    // Create SVG element
    this.svg = document.createElementNS('http://www.w3.org/2000/svg', 'svg');
    this.svg.addClass('qdrant-graph-svg');
    graphContainer.appendChild(this.svg);

    // Create placeholder content
    this.renderPlaceholder();
  }

  async onClose() {
    // Cleanup if needed
  }

  private renderPlaceholder(): void {
    if (!this.svg) return;

    this.svg.empty();

    // Add placeholder text
    const text = document.createElementNS('http://www.w3.org/2000/svg', 'text');
    text.setAttribute('x', '50%');
    text.setAttribute('y', '50%');
    text.setAttribute('text-anchor', 'middle');
    text.setAttribute('dominant-baseline', 'middle');
    text.setAttribute('fill', 'var(--text-muted)');
    text.setAttribute('font-size', '16');
    text.textContent = 'Graph visualization will be implemented here';
    this.svg.appendChild(text);

    // Add instructions
    const instructions = document.createElementNS('http://www.w3.org/2000/svg', 'text');
    instructions.setAttribute('x', '50%');
    instructions.setAttribute('y', '60%');
    instructions.setAttribute('text-anchor', 'middle');
    instructions.setAttribute('dominant-baseline', 'middle');
    instructions.setAttribute('fill', 'var(--text-muted)');
    instructions.setAttribute('font-size', '12');
    instructions.textContent = 'This will show document relationships based on semantic similarity';
    this.svg.appendChild(instructions);
  }

  setGraphData(data: GraphData): void {
    this.graphData = data;
    this.renderGraph();
  }

  private renderGraph(): void {
    if (!this.svg || !this.graphData) return;

    this.svg.empty();

    // Simple graph rendering - this would be enhanced with D3.js or similar
    const nodes = this.graphData.nodes;
    const edges = this.graphData.edges;

    // Render edges first (so they appear behind nodes).
    // Edge endpoints are node ids, so resolve them to node coordinates.
    edges.forEach(edge => {
      const source = nodes.find(n => n.id === edge.source);
      const target = nodes.find(n => n.id === edge.target);
      if (!source || !target) return;

      const line = document.createElementNS('http://www.w3.org/2000/svg', 'line');
      line.setAttribute('x1', (source.x ?? 0).toString());
      line.setAttribute('y1', (source.y ?? 0).toString());
      line.setAttribute('x2', (target.x ?? 0).toString());
      line.setAttribute('y2', (target.y ?? 0).toString());
      line.setAttribute('stroke', 'var(--text-muted)');
      line.setAttribute('stroke-width', '1');
      line.setAttribute('opacity', '0.6');
      line.addClass('qdrant-graph-edge');
      this.svg!.appendChild(line);
    });

    // Render nodes
    nodes.forEach(node => {
      const circle = document.createElementNS('http://www.w3.org/2000/svg', 'circle');
      circle.setAttribute('cx', node.x?.toString() || '0');
      circle.setAttribute('cy', node.y?.toString() || '0');
      circle.setAttribute('r', '10');
      circle.setAttribute('fill', node.color || 'var(--interactive-accent)');
      circle.setAttribute('stroke', 'var(--background-primary)');
      circle.setAttribute('stroke-width', '2');
      circle.addClass('qdrant-graph-node');

      // Add click handler
      circle.addEventListener('click', () => {
        this.onNodeClick(node);
      });

      // Add tooltip
      circle.addEventListener('mouseenter', (e: MouseEvent) => {
        this.showTooltip(e, node);
      });

      circle.addEventListener('mouseleave', () => {
        this.hideTooltip();
      });

      this.svg!.appendChild(circle);
    });
  }

  private onNodeClick(node: GraphNode): void {
    // Open the file associated with this node
    if (node.path) {
      this.app.workspace.openLinkText(node.path, '');
    }
  }

  private showTooltip(event: MouseEvent, node: GraphNode): void {
    // Create tooltip element
    const tooltip = document.createElement('div');
    tooltip.className = 'qdrant-graph-tooltip';
    tooltip.innerHTML = `
      <div><strong>${node.title}</strong></div>
      <div>${node.path}</div>
      <div>Type: ${node.type}</div>
    `;

    document.body.appendChild(tooltip);

    // Position tooltip
    tooltip.style.left = (event.pageX + 10) + 'px';
    tooltip.style.top = (event.pageY - 10) + 'px';
  }

  private hideTooltip(): void {
    const tooltip = document.querySelector('.qdrant-graph-tooltip');
    if (tooltip) {
      tooltip.remove();
    }
  }

  clearGraph(): void {
    this.graphData = null;
    this.renderPlaceholder();
  }
}
src/indexing/fileWatcher.ts (Normal file, 158 lines)
@@ -0,0 +1,158 @@
import { TFile, Vault } from 'obsidian';
import { IndexingQueue } from './indexQueue';

export class FileWatcher {
  private vault: Vault;
  private indexingQueue: IndexingQueue;
  private debounceTimeout: number | null = null;
  private debounceDelay = 300; // 300ms debounce
  private pendingFiles: Set<string> = new Set();

  constructor(vault: Vault, indexingQueue: IndexingQueue) {
    this.vault = vault;
    this.indexingQueue = indexingQueue;
  }

  /**
   * Start watching for file changes
   */
  startWatching(): void {
    // Watch for file creation
    this.vault.on('create', (file) => {
      if (file instanceof TFile) {
        this.handleFileChange(file, 'create');
      }
    });

    // Watch for file modification
    this.vault.on('modify', (file) => {
      if (file instanceof TFile) {
        this.handleFileChange(file, 'update');
      }
    });

    // Watch for file deletion
    this.vault.on('delete', (file) => {
      if (file instanceof TFile) {
        this.handleFileChange(file, 'delete');
      }
    });

    // Watch for file renaming
    this.vault.on('rename', (file, oldPath) => {
      if (file instanceof TFile) {
        // Handle the old file as deleted
        this.handleFileChange(file, 'delete', oldPath);
        // Handle the new file as created
        this.handleFileChange(file, 'create');
      }
    });
  }

  /**
   * Stop watching for file changes
   */
  stopWatching(): void {
    if (this.debounceTimeout) {
      clearTimeout(this.debounceTimeout);
      this.debounceTimeout = null;
    }
    this.pendingFiles.clear();
  }

  /**
   * Handle file change with debouncing
   */
  private handleFileChange(file: TFile, action: 'create' | 'update' | 'delete', oldPath?: string): void {
    const filePath = oldPath || file.path;

    // Add to pending files
    this.pendingFiles.add(filePath);

    // Clear existing timeout
    if (this.debounceTimeout) {
      clearTimeout(this.debounceTimeout);
    }

    // Set new timeout
    this.debounceTimeout = window.setTimeout(() => {
      this.processPendingFiles();
    }, this.debounceDelay);
  }

  /**
   * Process all pending file changes
   */
  private processPendingFiles(): void {
    if (this.pendingFiles.size === 0) {
      return;
    }

    const filesToProcess: TFile[] = [];

    for (const filePath of this.pendingFiles) {
      const file = this.vault.getAbstractFileByPath(filePath);
      if (file instanceof TFile) {
        filesToProcess.push(file);
      }
    }

    if (filesToProcess.length > 0) {
      // Add files to indexing queue
      this.indexingQueue.addFiles(filesToProcess, 'update');

      // Start processing if not already running
      this.indexingQueue.startProcessing();
    }

    // Clear pending files
    this.pendingFiles.clear();
  }

  /**
   * Manually trigger indexing for specific files
   */
  triggerIndexing(files: TFile[], action: 'create' | 'update' | 'delete' = 'update'): void {
    this.indexingQueue.addFiles(files, action);
    this.indexingQueue.startProcessing();
  }

  /**
   * Get pending files count
   */
  getPendingFilesCount(): number {
    return this.pendingFiles.size;
  }

  /**
   * Get pending files list
   */
  getPendingFiles(): string[] {
    return Array.from(this.pendingFiles);
  }

  /**
   * Clear pending files
   */
  clearPendingFiles(): void {
    this.pendingFiles.clear();
    if (this.debounceTimeout) {
      clearTimeout(this.debounceTimeout);
      this.debounceTimeout = null;
    }
  }

  /**
   * Set debounce delay
   */
  setDebounceDelay(delay: number): void {
    this.debounceDelay = delay;
  }

  /**
   * Get current debounce delay
   */
  getDebounceDelay(): number {
    return this.debounceDelay;
  }
}
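The watcher's core trick is debounce-and-dedupe: every vault event lands in a `Set`, so a burst of `modify` events for one file becomes a single indexing job once the 300ms window closes. A minimal sketch, with a manual `flush()` standing in for the `setTimeout` callback (class and variable names here are ours):

```typescript
// Sketch of FileWatcher's debounce-and-dedupe behavior.
class PendingBatch {
  private pending = new Set<string>();

  add(path: string): void {
    this.pending.add(path); // repeated events for one file collapse into one entry
  }

  flush(): string[] {
    const batch = Array.from(this.pending); // Set preserves insertion order
    this.pending.clear();
    return batch;
  }
}

const batch = new PendingBatch();
batch.add('notes/a.md');
batch.add('notes/a.md'); // modify fired twice within the debounce window
batch.add('notes/b.md');
const toIndex = batch.flush();
console.log(toIndex); // ['notes/a.md', 'notes/b.md']
```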
src/indexing/indexQueue.ts (Normal file, 273 lines)
@@ -0,0 +1,273 @@
import { TFile } from 'obsidian';
import { IndexingQueueItem, IndexingProgress } from '../types';
import { ExtractorManager } from '../extractors';
import { HybridChunker } from '../chunking/chunker';
import { EmbeddingProviderInterface } from '../types';
import { CollectionManager as QdrantCollectionManager } from '../qdrant/collection';

export class IndexingQueue {
  private queue: IndexingQueueItem[] = [];
  private isProcessing = false;
  private progress: IndexingProgress;
  private extractorManager: ExtractorManager;
  private chunker: HybridChunker;
  private embeddingProvider: EmbeddingProviderInterface;
  private collectionManager: QdrantCollectionManager;
  private onProgressUpdate?: (progress: IndexingProgress) => void;
  private onError?: (error: string) => void;
  private maxConcurrency = 3;
  private batchSize = 10;

  constructor(
    extractorManager: ExtractorManager,
    chunker: HybridChunker,
    embeddingProvider: EmbeddingProviderInterface,
    collectionManager: QdrantCollectionManager
  ) {
    this.extractorManager = extractorManager;
    this.chunker = chunker;
    this.embeddingProvider = embeddingProvider;
    this.collectionManager = collectionManager;

    this.progress = {
      totalFiles: 0,
      processedFiles: 0,
      totalChunks: 0,
      processedChunks: 0,
      errors: [],
      isRunning: false
    };
  }

  /**
   * Add files to the indexing queue
   */
  addFiles(files: TFile[], action: 'create' | 'update' | 'delete' = 'update'): void {
    for (const file of files) {
      this.addFile(file, action);
    }
  }

  /**
   * Add a single file to the indexing queue
   */
  addFile(file: TFile, action: 'create' | 'update' | 'delete' = 'update'): void {
    const priority = action === 'delete' ? 0 : (action === 'create' ? 2 : 1);

    const item: IndexingQueueItem = {
      file,
      action,
      priority
    };

    // Remove existing entry for this file
    this.queue = this.queue.filter(queued => queued.file.path !== file.path);

    // Add new entry
    this.queue.push(item);

    // Sort by priority (higher priority first)
    this.queue.sort((a, b) => b.priority - a.priority);
  }

  /**
   * Start processing the queue
   */
  async startProcessing(): Promise<void> {
    if (this.isProcessing) {
      return;
    }

    this.isProcessing = true;
    this.progress.isRunning = true;
    this.progress.totalFiles = this.queue.length;
    this.progress.processedFiles = 0;
    this.progress.errors = [];

    this.updateProgress();

    try {
      await this.processQueue();
    } finally {
      this.isProcessing = false;
      this.progress.isRunning = false;
      this.updateProgress();
    }
  }

  /**
   * Stop processing the queue
   */
  stopProcessing(): void {
    this.isProcessing = false;
    this.progress.isRunning = false;
    this.updateProgress();
  }

  /**
   * Clear the queue
   */
  clearQueue(): void {
    this.queue = [];
    this.progress.totalFiles = 0;
    this.progress.processedFiles = 0;
    this.updateProgress();
  }

  /**
   * Get current progress
   */
  getProgress(): IndexingProgress {
    return { ...this.progress };
  }

  /**
   * Set progress update callback
   */
  setProgressCallback(callback: (progress: IndexingProgress) => void): void {
    this.onProgressUpdate = callback;
  }

  /**
   * Set error callback
   */
  setErrorCallback(callback: (error: string) => void): void {
    this.onError = callback;
  }

  private async processQueue(): Promise<void> {
    while (this.queue.length > 0 && this.isProcessing) {
      const batch = this.queue.splice(0, this.batchSize);
      await this.processBatch(batch);
    }
  }

  private async processBatch(batch: IndexingQueueItem[]): Promise<void> {
    const promises = batch.map(item => this.processItem(item));
    await Promise.allSettled(promises);
  }

  private async processItem(item: IndexingQueueItem): Promise<void> {
    try {
      this.progress.currentFile = item.file.path;
      this.updateProgress();

      if (item.action === 'delete') {
        await this.deleteFile(item.file);
      } else {
        await this.indexFile(item.file);
      }

      this.progress.processedFiles++;
      this.updateProgress();
    } catch (error) {
      const errorMessage = `Failed to process ${item.file.path}: ${error}`;
|
||||||
|
this.progress.errors.push(errorMessage);
|
||||||
|
this.onError?.(errorMessage);
|
||||||
|
console.error(errorMessage, error);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private async indexFile(file: TFile): Promise<void> {
|
||||||
|
try {
|
||||||
|
// Check if file can be handled
|
||||||
|
if (!this.extractorManager.canHandle(file)) {
|
||||||
|
console.log(`Skipping file ${file.path} - no suitable extractor`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Extract content
|
||||||
|
const extractedContent = await this.extractorManager.extract(file);
|
||||||
|
|
||||||
|
if (!extractedContent.text.trim()) {
|
||||||
|
console.log(`Skipping file ${file.path} - no text content`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Chunk content
|
||||||
|
const chunks = await this.chunker.chunk(extractedContent);
|
||||||
|
|
||||||
|
if (chunks.length === 0) {
|
||||||
|
console.log(`Skipping file ${file.path} - no chunks created`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
this.progress.totalChunks += chunks.length;
|
||||||
|
|
||||||
|
// Generate embeddings
|
||||||
|
const texts = chunks.map(chunk => extractedContent.text.substring(chunk.chunk_start, chunk.chunk_end));
|
||||||
|
const embeddings = await this.embeddingProvider.embed(texts);
|
||||||
|
|
||||||
|
// Prepare points for Qdrant
|
||||||
|
const points = chunks.map((chunk, index) => ({
|
||||||
|
id: this.generatePointId(file, chunk.chunk_index),
|
||||||
|
vector: embeddings[index],
|
||||||
|
metadata: chunk
|
||||||
|
}));
|
||||||
|
|
||||||
|
// Index in Qdrant
|
||||||
|
await this.collectionManager.indexChunks(points);
|
||||||
|
|
||||||
|
this.progress.processedChunks += chunks.length;
|
||||||
|
this.updateProgress();
|
||||||
|
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to index file ${file.path}: ${error}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private async deleteFile(file: TFile): Promise<void> {
|
||||||
|
try {
|
||||||
|
// Delete all chunks for this file from Qdrant
|
||||||
|
await this.collectionManager.deleteFileChunks(file.path);
|
||||||
|
} catch (error) {
|
||||||
|
throw new Error(`Failed to delete file ${file.path}: ${error}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private generatePointId(file: TFile, chunkIndex: number): string {
|
||||||
|
// Generate a consistent ID for the point
|
||||||
|
return `${file.path}:${chunkIndex}`;
|
||||||
|
}
|
||||||
|
|
||||||
|
private updateProgress(): void {
|
||||||
|
this.onProgressUpdate?.(this.getProgress());
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get queue statistics
|
||||||
|
*/
|
||||||
|
getQueueStats(): {
|
||||||
|
queueLength: number;
|
||||||
|
isProcessing: boolean;
|
||||||
|
estimatedTimeRemaining: number;
|
||||||
|
} {
|
||||||
|
const averageTimePerFile = 2000; // 2 seconds per file (rough estimate)
|
||||||
|
const estimatedTimeRemaining = this.queue.length * averageTimePerFile;
|
||||||
|
|
||||||
|
return {
|
||||||
|
queueLength: this.queue.length,
|
||||||
|
isProcessing: this.isProcessing,
|
||||||
|
estimatedTimeRemaining
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Get files in queue by action type
|
||||||
|
*/
|
||||||
|
getFilesByAction(action: 'create' | 'update' | 'delete'): TFile[] {
|
||||||
|
return this.queue
|
||||||
|
.filter(item => item.action === action)
|
||||||
|
.map(item => item.file);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Remove files from queue
|
||||||
|
*/
|
||||||
|
removeFiles(filePaths: string[]): void {
|
||||||
|
this.queue = this.queue.filter(item => !filePaths.includes(item.file.path));
|
||||||
|
this.progress.totalFiles = this.queue.length;
|
||||||
|
this.updateProgress();
|
||||||
|
}
|
||||||
|
}
|
||||||
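The priority scheme in `addFile` above (delete = 0, update = 1, create = 2, sorted descending) processes newly created files first and drains deletions last. A minimal sketch of that ordering, using hypothetical standalone helpers (`priorityFor`, `sortByPriority`) rather than the plugin's class:

```typescript
type Action = 'create' | 'update' | 'delete';

interface QueueItem { path: string; action: Action; priority: number; }

// Mirrors the plugin's mapping: creates first, updates next, deletes last.
function priorityFor(action: Action): number {
  return action === 'delete' ? 0 : action === 'create' ? 2 : 1;
}

// Same descending-priority sort the queue applies after each insert.
function sortByPriority(items: QueueItem[]): QueueItem[] {
  return [...items].sort((a, b) => b.priority - a.priority);
}

const queue: QueueItem[] = (['update', 'delete', 'create'] as Action[]).map(
  (action, i) => ({ path: `note-${i}.md`, action, priority: priorityFor(action) })
);

const ordered = sortByPriority(queue).map(item => item.action);
// ordered: ['create', 'update', 'delete']
```

Note that with this mapping a burst of deletions waits behind every pending create and update.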
204 src/indexing/manifest.ts Normal file
@@ -0,0 +1,204 @@
import { TFile } from 'obsidian';
import { FileManifestEntry } from '../types';

export class FileManifest {
  private manifest: Map<string, FileManifestEntry> = new Map();
  private app: any;

  constructor(app: any) {
    this.app = app;
  }

  /**
   * Load manifest from plugin data
   */
  async load(): Promise<void> {
    try {
      const data = await this.app.plugins.plugins['obsidian-qdrant']?.loadData();
      if (data && data.manifest) {
        this.manifest = new Map(Object.entries(data.manifest));
      }
    } catch (error) {
      console.warn('Failed to load file manifest:', error);
      this.manifest = new Map();
    }
  }

  /**
   * Save manifest to plugin data
   */
  async save(): Promise<void> {
    try {
      const data = {
        manifest: Object.fromEntries(this.manifest)
      };
      await this.app.plugins.plugins['obsidian-qdrant']?.saveData(data);
    } catch (error) {
      console.error('Failed to save file manifest:', error);
    }
  }

  /**
   * Get manifest entry for a file
   */
  getEntry(filePath: string): FileManifestEntry | null {
    return this.manifest.get(filePath) || null;
  }

  /**
   * Update manifest entry for a file
   */
  updateEntry(filePath: string, entry: FileManifestEntry): void {
    this.manifest.set(filePath, entry);
  }

  /**
   * Remove manifest entry for a file
   */
  removeEntry(filePath: string): void {
    this.manifest.delete(filePath);
  }

  /**
   * Check if a file needs to be re-indexed
   */
  needsReindexing(file: TFile): boolean {
    const entry = this.getEntry(file.path);
    if (!entry) {
      return true; // File not in manifest, needs indexing
    }

    const stat = file.stat;
    return (
      entry.mtime !== stat.mtime ||
      entry.size !== stat.size ||
      entry.hash !== this.calculateHash(file)
    );
  }

  /**
   * Calculate a simple hash for file content
   */
  private calculateHash(file: TFile): string {
    // Cheap fingerprint based on file stats only; a real content hash
    // (e.g. SHA-256) would also catch edits that preserve mtime and size.
    const stat = file.stat;
    return `${stat.mtime}-${stat.size}-${stat.ctime}`;
  }

  /**
   * Get all files that need re-indexing
   */
  getFilesNeedingReindexing(files: TFile[]): TFile[] {
    return files.filter(file => this.needsReindexing(file));
  }

  /**
   * Get all tracked files
   */
  getAllTrackedFiles(): string[] {
    return Array.from(this.manifest.keys());
  }

  /**
   * Get files that are no longer in the vault
   */
  getOrphanedFiles(vaultFiles: TFile[]): string[] {
    const vaultPaths = new Set(vaultFiles.map(f => f.path));
    return this.getAllTrackedFiles().filter(path => !vaultPaths.has(path));
  }

  /**
   * Clear all manifest entries
   */
  clear(): void {
    this.manifest.clear();
  }

  /**
   * Get manifest statistics
   */
  getStats(): {
    totalFiles: number;
    totalChunks: number;
    totalSize: number;
    lastIndexed: number;
  } {
    let totalChunks = 0;
    let totalSize = 0;
    let lastIndexed = 0;

    for (const entry of this.manifest.values()) {
      totalChunks += entry.chunkCount;
      totalSize += entry.size;
      lastIndexed = Math.max(lastIndexed, entry.lastIndexed);
    }

    return {
      totalFiles: this.manifest.size,
      totalChunks,
      totalSize,
      lastIndexed
    };
  }

  /**
   * Update file entry after successful indexing
   */
  updateAfterIndexing(file: TFile, chunkCount: number): void {
    const entry: FileManifestEntry = {
      mtime: file.stat.mtime,
      size: file.stat.size,
      hash: this.calculateHash(file),
      chunkCount,
      lastIndexed: Date.now()
    };

    this.updateEntry(file.path, entry);
  }

  /**
   * Get files indexed since a timestamp
   */
  getFilesModifiedSince(timestamp: number): string[] {
    const result: string[] = [];

    for (const [path, entry] of this.manifest.entries()) {
      if (entry.lastIndexed > timestamp) {
        result.push(path);
      }
    }

    return result;
  }

  /**
   * Get files by extension
   */
  getFilesByExtension(extension: string): string[] {
    const result: string[] = [];

    for (const path of this.manifest.keys()) {
      if (path.endsWith(`.${extension}`)) {
        result.push(path);
      }
    }

    return result;
  }

  /**
   * Get files by size range
   */
  getFilesBySizeRange(minSize: number, maxSize: number): string[] {
    const result: string[] = [];

    for (const [path, entry] of this.manifest.entries()) {
      if (entry.size >= minSize && entry.size <= maxSize) {
        result.push(path);
      }
    }

    return result;
  }
}
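`needsReindexing` above compares a manifest entry against the file's current stats, and `calculateHash` is only a stat fingerprint, not a content hash. A standalone sketch of that change-detection logic (the `ManifestEntry` and `FileStat` shapes here are simplified assumptions for illustration):

```typescript
interface ManifestEntry { mtime: number; size: number; hash: string; }
interface FileStat { mtime: number; size: number; ctime: number; }

// Same stat-based "hash" the manifest uses; cheap, but it misses edits
// that preserve both mtime and size.
function statHash(stat: FileStat): string {
  return `${stat.mtime}-${stat.size}-${stat.ctime}`;
}

function needsReindexing(entry: ManifestEntry | null, stat: FileStat): boolean {
  if (!entry) return true; // never indexed before
  return (
    entry.mtime !== stat.mtime ||
    entry.size !== stat.size ||
    entry.hash !== statHash(stat)
  );
}

const stat: FileStat = { mtime: 1000, size: 42, ctime: 500 };
const entry: ManifestEntry = { mtime: 1000, size: 42, hash: statHash(stat) };

needsReindexing(entry, stat);                     // false: unchanged
needsReindexing(entry, { ...stat, mtime: 2000 }); // true: modified
needsReindexing(null, stat);                      // true: not tracked yet
```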
307 src/indexing/orchestrator.ts Normal file
@@ -0,0 +1,307 @@
import { App, TFile } from 'obsidian';
import { PluginSettings, IndexingProgress } from '../types';
import { ExtractorManager } from '../extractors';
import { HybridChunker } from '../chunking/chunker';
import { createEmbeddingProvider } from '../embeddings';
import { QdrantClient } from '../qdrant/client';
import { CollectionManager } from '../qdrant/collection';
import { IndexingQueue } from './indexQueue';
import { FileWatcher } from './fileWatcher';
import { FileManifest } from './manifest';

export class IndexingOrchestrator {
  private app: App;
  private settings: PluginSettings;
  private extractorManager: ExtractorManager;
  private chunker: HybridChunker;
  private embeddingProvider: any;
  private qdrantClient: QdrantClient;
  private collectionManager: CollectionManager;
  private indexingQueue: IndexingQueue;
  private fileWatcher: FileWatcher;
  private fileManifest: FileManifest;
  private isInitialized = false;

  constructor(app: App, settings: PluginSettings) {
    this.app = app;
    this.settings = settings;

    // Initialize components
    this.extractorManager = new ExtractorManager(app);
    this.chunker = new HybridChunker(settings.chunking);
    this.embeddingProvider = createEmbeddingProvider(settings);
    this.qdrantClient = new QdrantClient(settings.qdrant);
    this.collectionManager = new CollectionManager(this.qdrantClient, settings, this.getVaultName());
    this.indexingQueue = new IndexingQueue(
      this.extractorManager,
      this.chunker,
      this.embeddingProvider,
      this.collectionManager
    );
    this.fileWatcher = new FileWatcher(app.vault, this.indexingQueue);
    this.fileManifest = new FileManifest(app);
  }

  /**
   * Initialize the indexing system
   */
  async initialize(): Promise<void> {
    if (this.isInitialized) {
      return;
    }

    try {
      // Load file manifest
      await this.fileManifest.load();

      // Initialize embedding provider and get its vector dimension
      const embeddingDimension = await this.embeddingProvider.getDimension();

      // Initialize Qdrant collection
      await this.collectionManager.initialize(embeddingDimension);

      // Start file watching
      this.fileWatcher.startWatching();

      this.isInitialized = true;
      console.log('Indexing orchestrator initialized successfully');
    } catch (error) {
      console.error('Failed to initialize indexing orchestrator:', error);
      throw error;
    }
  }

  /**
   * Shut down the indexing system
   */
  async shutdown(): Promise<void> {
    if (!this.isInitialized) {
      return;
    }

    try {
      // Stop file watching
      this.fileWatcher.stopWatching();

      // Stop the indexing queue
      this.indexingQueue.stopProcessing();

      // Save file manifest
      await this.fileManifest.save();

      this.isInitialized = false;
      console.log('Indexing orchestrator shut down successfully');
    } catch (error) {
      console.error('Failed to shut down indexing orchestrator:', error);
    }
  }

  /**
   * Perform full vault indexing
   */
  async indexFullVault(): Promise<void> {
    if (!this.isInitialized) {
      throw new Error('Indexing orchestrator not initialized');
    }

    try {
      // Get all files that can be indexed
      const allFiles = this.getIndexableFiles();

      // Get files that need re-indexing
      const filesToIndex = this.fileManifest.getFilesNeedingReindexing(allFiles);

      // Get orphaned files (files in the manifest but no longer in the vault)
      const orphanedFiles = this.fileManifest.getOrphanedFiles(allFiles);

      // Add files to the indexing queue
      this.indexingQueue.addFiles(filesToIndex, 'update');

      // Queue orphaned files for deletion. Note: getAbstractFileByPath returns
      // null for files already removed from disk, so those chunks are only
      // reclaimed on a full rebuild.
      for (const orphanedPath of orphanedFiles) {
        const orphanedFile = this.app.vault.getAbstractFileByPath(orphanedPath);
        if (orphanedFile instanceof TFile) {
          this.indexingQueue.addFile(orphanedFile, 'delete');
        }
      }

      // Start processing
      await this.indexingQueue.startProcessing();
    } catch (error) {
      console.error('Failed to index full vault:', error);
      throw error;
    }
  }

  /**
   * Index a specific file
   */
  async indexFile(file: TFile): Promise<void> {
    if (!this.isInitialized) {
      throw new Error('Indexing orchestrator not initialized');
    }

    this.indexingQueue.addFile(file, 'update');
    await this.indexingQueue.startProcessing();
  }

  /**
   * Delete a file from the index
   */
  async deleteFile(file: TFile): Promise<void> {
    if (!this.isInitialized) {
      throw new Error('Indexing orchestrator not initialized');
    }

    this.indexingQueue.addFile(file, 'delete');
    await this.indexingQueue.startProcessing();
  }

  /**
   * Get indexing progress
   */
  getProgress(): IndexingProgress {
    return this.indexingQueue.getProgress();
  }

  /**
   * Set progress callback
   */
  setProgressCallback(callback: (progress: IndexingProgress) => void): void {
    this.indexingQueue.setProgressCallback(callback);
  }

  /**
   * Set error callback
   */
  setErrorCallback(callback: (error: string) => void): void {
    this.indexingQueue.setErrorCallback(callback);
  }

  /**
   * Get index statistics
   */
  async getIndexStats(): Promise<{
    collectionStats: any;
    manifestStats: any;
    queueStats: any;
  }> {
    const collectionStats = await this.collectionManager.getStats();
    const manifestStats = this.fileManifest.getStats();
    const queueStats = this.indexingQueue.getQueueStats();

    return {
      collectionStats,
      manifestStats,
      queueStats
    };
  }

  /**
   * Clear the entire index
   */
  async clearIndex(): Promise<void> {
    if (!this.isInitialized) {
      throw new Error('Indexing orchestrator not initialized');
    }

    try {
      // Clear Qdrant collection
      await this.collectionManager.clearCollection();

      // Clear file manifest
      this.fileManifest.clear();
      await this.fileManifest.save();

      // Clear indexing queue
      this.indexingQueue.clearQueue();

      console.log('Index cleared successfully');
    } catch (error) {
      console.error('Failed to clear index:', error);
      throw error;
    }
  }

  /**
   * Get files that can be indexed
   */
  private getIndexableFiles(): TFile[] {
    const files = this.app.vault.getFiles();

    return files.filter(file => {
      // Check file size
      if (file.stat.size > this.settings.indexing.maxFileSize) {
        return false;
      }

      // Check ignored folders
      const pathParts = file.path.split('/');
      for (const part of pathParts) {
        if (this.settings.indexing.ignoredFolders.includes(part)) {
          return false;
        }
      }

      // Check include patterns
      const matchesInclude = this.settings.indexing.includePatterns.some(pattern => {
        const regex = new RegExp(pattern.replace(/\*/g, '.*'));
        return regex.test(file.path);
      });

      if (!matchesInclude) {
        return false;
      }

      // Check exclude patterns
      const matchesExclude = this.settings.indexing.excludePatterns.some(pattern => {
        const regex = new RegExp(pattern.replace(/\*/g, '.*'));
        return regex.test(file.path);
      });

      if (matchesExclude) {
        return false;
      }

      // Check whether an extractor can handle the file
      return this.extractorManager.canHandle(file);
    });
  }

  /**
   * Get vault name
   */
  private getVaultName(): string {
    return this.app.vault.getName();
  }

  /**
   * Check if the system is initialized
   */
  isReady(): boolean {
    return this.isInitialized;
  }

  /**
   * Get extractor status
   */
  getExtractorStatus(): any[] {
    return this.extractorManager.getExtractorStatus();
  }

  /**
   * Test connections
   */
  async testConnections(): Promise<{
    qdrant: boolean;
    embedding: boolean;
  }> {
    const qdrantTest = await this.qdrantClient.testConnection();
    const embeddingTest = await this.embeddingProvider.testConnection();

    return {
      qdrant: qdrantTest,
      embedding: embeddingTest
    };
  }
}
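`getIndexableFiles` above turns include/exclude globs into regexes with `pattern.replace(/\*/g, '.*')`. Because dots are not escaped and the regex is unanchored, a pattern like `*.md` would also match `notes.mdx`. A sketch of a stricter conversion, as a hypothetical `globToRegex` helper (an illustration of the fix, not the plugin's current code):

```typescript
// Escape regex metacharacters except '*', then expand '*' and anchor the
// whole pattern, so "*.md" matches only paths ending in ".md".
function globToRegex(pattern: string): RegExp {
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  return new RegExp(`^${escaped.replace(/\*/g, '.*')}$`);
}

globToRegex('*.md').test('daily/2024-01-01.md'); // true
globToRegex('*.md').test('notes.mdx');           // false
```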
220 src/qdrant/client.ts Normal file
@@ -0,0 +1,220 @@
import { requestUrl, RequestUrlParam, RequestUrlResponse } from 'obsidian';
import { QdrantSettings, SearchResult, SearchOptions, ChunkMetadata } from '../types';

export interface QdrantPoint {
  id: string;
  vector: number[];
  payload: ChunkMetadata;
}

export interface QdrantSearchRequest {
  vector: number[];
  limit: number;
  score_threshold?: number;
  filter?: Record<string, any>;
  with_payload?: boolean;
  with_vector?: boolean;
}

export interface QdrantSearchResponse {
  result: Array<{
    id: string;
    score: number;
    payload: ChunkMetadata;
    vector?: number[];
  }>;
  status: string;
  time: number;
}

export interface QdrantUpsertRequest {
  points: QdrantPoint[];
}

export interface QdrantCollectionInfo {
  vectors_count: number;
  indexed_vectors_count: number;
  points_count: number;
  segments_count: number;
  disk_data_size: number;
  ram_data_size: number;
}

export class QdrantClient {
  private settings: QdrantSettings;
  private baseUrl: string;

  constructor(settings: QdrantSettings) {
    this.settings = settings;
    this.baseUrl = settings.url.replace(/\/$/, ''); // Strip trailing slash
  }

  private async makeRequest<T>(
    endpoint: string,
    method: 'GET' | 'POST' | 'PUT' | 'DELETE' = 'GET',
    body?: any
  ): Promise<T> {
    const url = `${this.baseUrl}${endpoint}`;

    const headers: Record<string, string> = {
      'Content-Type': 'application/json'
    };

    if (this.settings.apiKey) {
      const headerName = this.settings.apiKeyHeader || 'api-key';
      headers[headerName] = this.settings.apiKey;
    }

    const requestParams: RequestUrlParam = {
      url,
      method,
      headers,
      body: body ? JSON.stringify(body) : undefined
    };

    try {
      const response: RequestUrlResponse = await requestUrl(requestParams);

      if (response.status >= 400) {
        throw new Error(`Qdrant API error: ${response.status} ${response.text}`);
      }

      return JSON.parse(response.text) as T;
    } catch (error) {
      console.error('Qdrant request failed:', error);
      throw error;
    }
  }

  async testConnection(): Promise<boolean> {
    try {
      await this.makeRequest('/collections');
      return true;
    } catch (error) {
      console.error('Qdrant connection test failed:', error);
      return false;
    }
  }

  async ensureCollection(collectionName: string, vectorSize: number): Promise<void> {
    try {
      // Check whether the collection exists
      await this.makeRequest(`/collections/${collectionName}`);
    } catch (error) {
      // Collection doesn't exist; create it
      await this.createCollection(collectionName, vectorSize);
    }
  }

  private async createCollection(collectionName: string, vectorSize: number): Promise<void> {
    const config = {
      vectors: {
        size: vectorSize,
        distance: 'Cosine'
      },
      sparse_vectors: {
        bm25: {}
      },
      optimizers_config: {
        default_segment_number: 2
      },
      replication_factor: 1,
      write_consistency_factor: 1
    };

    await this.makeRequest(`/collections/${collectionName}`, 'PUT', config);
  }

  async getCollectionInfo(collectionName: string): Promise<QdrantCollectionInfo> {
    const response = await this.makeRequest<{ result: QdrantCollectionInfo }>(`/collections/${collectionName}`);
    return response.result;
  }

  async upsertPoints(collectionName: string, points: QdrantPoint[]): Promise<void> {
    if (points.length === 0) return;

    const request: QdrantUpsertRequest = { points };
    await this.makeRequest(`/collections/${collectionName}/points`, 'PUT', request);
  }

  async search(
    collectionName: string,
    options: SearchOptions,
    queryVector: number[]
  ): Promise<SearchResult[]> {
    const request: QdrantSearchRequest = {
      vector: queryVector,
      limit: options.top_k,
      score_threshold: options.score_threshold,
      filter: options.filters,
      with_payload: true,
      with_vector: false
    };

    const response = await this.makeRequest<QdrantSearchResponse>(
      `/collections/${collectionName}/points/search`,
      'POST',
      request
    );

    return response.result.map(result => ({
      id: result.id,
      score: result.score,
      payload: result.payload,
      vector: result.vector
    }));
  }

  async deletePoints(collectionName: string, pointIds: string[]): Promise<void> {
    if (pointIds.length === 0) return;

    const request = {
      points: pointIds
    };

    await this.makeRequest(`/collections/${collectionName}/points/delete`, 'POST', request);
  }

  async deleteCollection(collectionName: string): Promise<void> {
    await this.makeRequest(`/collections/${collectionName}`, 'DELETE');
  }

  async getSimilarPoints(
    collectionName: string,
    pointId: string,
    limit: number = 10,
    scoreThreshold: number = 0.7
  ): Promise<SearchResult[]> {
    const request = {
      positive: [pointId], // the recommend API expects arrays of example point IDs
      limit,
      score_threshold: scoreThreshold,
      with_payload: true,
      with_vector: false
    };

    const response = await this.makeRequest<QdrantSearchResponse>(
      `/collections/${collectionName}/points/recommend`,
      'POST',
      request
    );

    return response.result.map(result => ({
      id: result.id,
      score: result.score,
      payload: result.payload,
      vector: result.vector
    }));
  }

  async batchUpsert(
    collectionName: string,
    points: QdrantPoint[],
    batchSize: number = 100
  ): Promise<void> {
    for (let i = 0; i < points.length; i += batchSize) {
      const batch = points.slice(i, i + batchSize);
      await this.upsertPoints(collectionName, batch);
    }
  }
}
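`batchUpsert` above slices the point array into fixed-size batches so each HTTP request to Qdrant stays small. The slicing on its own, as a hypothetical generic `toBatches` helper:

```typescript
// Same slicing batchUpsert performs before each upsert call.
function toBatches<T>(items: T[], batchSize: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

const ids = Array.from({ length: 250 }, (_, i) => i);
const batches = toBatches(ids, 100);
// batches.map(b => b.length): [100, 100, 50]
```

Sequential awaiting of each batch (as in the client) trades throughput for predictable memory and server load.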
110 src/qdrant/collection.ts Normal file
@@ -0,0 +1,110 @@
import { QdrantClient } from './client';
import { PluginSettings, ChunkMetadata } from '../types';
import { getCollectionName } from '../settings';

export class CollectionManager {
  private client: QdrantClient;
  private settings: PluginSettings;
  private vaultName: string;
  private collectionName: string;
  private vectorSize: number | null = null;

  constructor(client: QdrantClient, settings: PluginSettings, vaultName: string) {
    this.client = client;
    this.settings = settings;
    this.vaultName = vaultName;
    this.collectionName = getCollectionName(settings, vaultName);
  }

  async initialize(embeddingDimension: number): Promise<void> {
    this.vectorSize = embeddingDimension;
    await this.client.ensureCollection(this.collectionName, embeddingDimension);
  }

  getCollectionName(): string {
    return this.collectionName;
  }

  async getStats(): Promise<{
    pointsCount: number;
    vectorsCount: number;
    diskSize: number;
    ramSize: number;
  }> {
    const info = await this.client.getCollectionInfo(this.collectionName);
    return {
      pointsCount: info.points_count,
      vectorsCount: info.vectors_count,
      diskSize: info.disk_data_size,
      ramSize: info.ram_data_size
    };
  }

  async indexChunks(chunks: Array<{ id: string; vector: number[]; metadata: ChunkMetadata }>): Promise<void> {
    const points = chunks.map(chunk => ({
      id: chunk.id,
      vector: chunk.vector,
      payload: chunk.metadata
    }));

    await this.client.batchUpsert(this.collectionName, points, 100);
  }

  async deleteFileChunks(filePath: string): Promise<void> {
    // Finding all points for a file requires a scroll operation or a
    // filter-based delete; for now we only log the intent.
    console.log(`Deleting chunks for file: ${filePath}`);
    // TODO: Implement proper deletion using Qdrant filters
  }

  async search(
    queryVector: number[],
    options: {
      topK: number;
      scoreThreshold?: number;
      filters?: Record<string, any>;
    }
  ): Promise<Array<{ id: string; score: number; payload: ChunkMetadata }>> {
    return await this.client.search(this.collectionName, {
      query: '', // unused for pure vector search
      top_k: options.topK,
      score_threshold: options.scoreThreshold,
      filters: options.filters
    }, queryVector);
  }

  async getSimilarDocuments(
    documentId: string,
    limit: number = 10,
    scoreThreshold: number = 0.7
  ): Promise<Array<{ id: string; score: number; payload: ChunkMetadata }>> {
    return await this.client.getSimilarPoints(this.collectionName, documentId, limit, scoreThreshold);
  }

  async clearCollection(): Promise<void> {
    await this.client.deleteCollection(this.collectionName);
    if (this.vectorSize) {
      await this.client.ensureCollection(this.collectionName, this.vectorSize);
    }
  }

  async rebuildCollection(): Promise<void> {
    await this.clearCollection();
  }

  // Helper method to generate consistent point IDs
  static generatePointId(vaultName: string, filePath: string, chunkIndex: number, pageNo?: number): string {
|
||||||
|
const baseId = `${vaultName}:${filePath}:${chunkIndex}`;
|
||||||
|
return pageNo !== undefined ? `${baseId}:${pageNo}` : baseId;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Helper method to extract file path from point ID
|
||||||
|
static extractFilePathFromId(pointId: string): string | null {
|
||||||
|
const parts = pointId.split(':');
|
||||||
|
if (parts.length >= 3) {
|
||||||
|
return parts.slice(1, -1).join(':'); // Remove vault name and chunk index
|
||||||
|
}
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
}
|
||||||
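The `deleteFileChunks` TODO above can be closed with Qdrant's delete-by-filter endpoint (`POST /collections/{name}/points/delete`), matching on the `path` payload field. A minimal sketch; the `post` parameter is a hypothetical stand-in for whatever HTTP helper the plugin's client exposes:

```typescript
// Build a Qdrant payload filter matching every point whose `path`
// payload field equals the given file path.
function buildDeleteByPathFilter(filePath: string) {
  return {
    filter: {
      must: [{ key: 'path', match: { value: filePath } }],
    },
  };
}

// Hypothetical wiring: POST the filter to Qdrant's delete-by-filter endpoint.
async function deleteChunksByPath(
  post: (url: string, body: unknown) => Promise<unknown>,
  baseUrl: string,
  collection: string,
  filePath: string
): Promise<void> {
  await post(
    `${baseUrl}/collections/${collection}/points/delete?wait=true`,
    buildDeleteByPathFilter(filePath)
  );
}
```

Because the filter construction is pure, it can be unit-tested without a running Qdrant instance.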
src/search/resultRenderer.ts (new file, 242 lines)
@@ -0,0 +1,242 @@
import { App, TFile } from 'obsidian';
import { SearchResult } from '../types';

export class ResultRenderer {
	private app: App;

	constructor(app: App) {
		this.app = app;
	}

	/**
	 * Render search results in a container
	 */
	renderResults(container: HTMLElement, results: SearchResult[]): void {
		container.empty();

		if (results.length === 0) {
			this.renderNoResults(container);
			return;
		}

		results.forEach((result, index) => {
			const resultEl = this.renderResult(result, index);
			container.appendChild(resultEl);
		});
	}

	/**
	 * Render a single search result
	 */
	private renderResult(result: SearchResult, index: number): HTMLElement {
		const resultEl = document.createElement('div');
		resultEl.className = 'qdrant-search-result';
		resultEl.dataset.index = index.toString();

		// File path
		const pathEl = document.createElement('div');
		pathEl.className = 'qdrant-search-result-path';
		pathEl.textContent = result.payload.path;
		resultEl.appendChild(pathEl);

		// Title
		if (result.payload.title) {
			const titleEl = document.createElement('div');
			titleEl.className = 'qdrant-search-result-title';
			titleEl.textContent = result.payload.title;
			resultEl.appendChild(titleEl);
		}

		// Snippet
		const snippetEl = document.createElement('div');
		snippetEl.className = 'qdrant-search-result-snippet';
		snippetEl.textContent = this.generateSnippet(result);
		resultEl.appendChild(snippetEl);

		// Metadata
		const metadataEl = document.createElement('div');
		metadataEl.className = 'qdrant-search-result-metadata';
		this.renderMetadata(metadataEl, result);
		resultEl.appendChild(metadataEl);

		// Score
		const scoreEl = document.createElement('div');
		scoreEl.className = 'qdrant-search-result-score';
		scoreEl.textContent = `Score: ${result.score.toFixed(3)}`;
		resultEl.appendChild(scoreEl);

		// Click handler
		resultEl.addEventListener('click', () => {
			this.openResult(result);
		});

		return resultEl;
	}

	/**
	 * Render metadata for a result
	 */
	private renderMetadata(container: HTMLElement, result: SearchResult): void {
		const metadata = result.payload;
		const metadataItems: string[] = [];

		// File type
		if (metadata.ext) {
			metadataItems.push(`Type: ${metadata.ext.toUpperCase()}`);
		}

		// Modified date
		if (metadata.modified) {
			const date = new Date(metadata.modified);
			metadataItems.push(`Modified: ${date.toLocaleDateString()}`);
		}

		// Tags
		if (metadata.tags && metadata.tags.length > 0) {
			metadataItems.push(`Tags: ${metadata.tags.join(', ')}`);
		}

		// Chunk info
		if (metadata.chunk_index !== undefined) {
			metadataItems.push(`Chunk: ${metadata.chunk_index + 1}`);
		}

		// Page number (for PDFs)
		if (metadata.page_no !== undefined) {
			metadataItems.push(`Page: ${metadata.page_no}`);
		}

		// OCR indicator
		if (metadata.ocr) {
			metadataItems.push('OCR');
		}

		container.textContent = metadataItems.join(' • ');
	}

	/**
	 * Generate a snippet from the result
	 */
	private generateSnippet(result: SearchResult): string {
		const text = this.getChunkText(result);
		const maxLength = 200;

		if (text.length <= maxLength) {
			return text;
		}

		// Try to find a good break point
		const halfLength = Math.floor(maxLength / 2);
		const start = Math.max(0, result.payload.chunk_start - halfLength);
		const end = Math.min(text.length, result.payload.chunk_end + halfLength);

		let snippet = text.substring(start, end);

		if (start > 0) {
			snippet = '...' + snippet;
		}
		if (end < text.length) {
			snippet = snippet + '...';
		}

		return snippet;
	}

	/**
	 * Get the text content for a chunk
	 */
	private getChunkText(result: SearchResult): string {
		// This would need to be implemented to get the actual text content.
		// For now, return a placeholder.
		return `Chunk ${result.payload.chunk_index} from ${result.payload.path}`;
	}

	/**
	 * Open a search result
	 */
	private openResult(result: SearchResult): void {
		const file = this.app.vault.getAbstractFileByPath(result.payload.path);

		if (file instanceof TFile) {
			// Open file
			this.app.workspace.openLinkText(result.payload.path, '');

			// Try to scroll to the chunk location
			setTimeout(() => {
				this.scrollToChunk(result);
			}, 500);
		}
	}

	/**
	 * Scroll to a specific chunk in the file
	 */
	private scrollToChunk(result: SearchResult): void {
		// This would need to be implemented to scroll to the specific chunk.
		// For now, just open the file.
		console.log('Would scroll to chunk:', result.payload.chunk_start, result.payload.chunk_end);
	}

	/**
	 * Render no results message
	 */
	private renderNoResults(container: HTMLElement): void {
		const noResultsEl = document.createElement('div');
		noResultsEl.className = 'qdrant-search-no-results';
		noResultsEl.textContent = 'No results found';
		container.appendChild(noResultsEl);
	}

	/**
	 * Highlight search terms in text
	 */
	highlightSearchTerms(text: string, searchTerms: string[]): string {
		if (searchTerms.length === 0) {
			return text;
		}

		let highlightedText = text;

		for (const term of searchTerms) {
			// Escape regex metacharacters so terms like "c++" don't throw
			const escaped = term.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
			const regex = new RegExp(`(${escaped})`, 'gi');
			highlightedText = highlightedText.replace(regex, '<mark>$1</mark>');
		}

		return highlightedText;
	}

	/**
	 * Get file icon based on extension
	 */
	private getFileIcon(extension: string): string {
		const iconMap: Record<string, string> = {
			'md': '📝',
			'txt': '📄',
			'pdf': '📕',
			'png': '🖼️',
			'jpg': '🖼️',
			'jpeg': '🖼️',
			'gif': '🖼️',
			'svg': '🖼️',
			'js': '📜',
			'ts': '📜',
			'json': '📜',
			'html': '🌐',
			'css': '🎨',
			'py': '🐍',
			'java': '☕',
			'cpp': '⚙️',
			'c': '⚙️',
			'go': '🐹',
			'rs': '🦀',
			'php': '🐘',
			'rb': '💎',
			'sh': '🐚',
			'yml': '⚙️',
			'yaml': '⚙️',
			'xml': '📄'
		};

		return iconMap[extension.toLowerCase()] || '📄';
	}
}
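Building a `RegExp` from user-supplied search terms is only safe if regex metacharacters are escaped first; a term like `c++` would otherwise throw at `RegExp` construction. A minimal standalone sketch of the escape-and-highlight step:

```typescript
// Escape regex metacharacters in a user-supplied term.
function escapeRegExp(term: string): string {
  return term.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');
}

// Wrap every case-insensitive occurrence of each term in <mark> tags.
function highlight(text: string, terms: string[]): string {
  let out = text;
  for (const term of terms) {
    if (!term) continue;
    out = out.replace(new RegExp(`(${escapeRegExp(term)})`, 'gi'), '<mark>$1</mark>');
  }
  return out;
}
```

Note the returned string contains raw HTML, so it should only ever be assigned somewhere that the `<mark>` tags are intended markup, never echoed back as user text.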
src/search/searchModal.ts (new file, 300 lines)
@@ -0,0 +1,300 @@
import { App, Modal, TFile, Notice } from 'obsidian';
import { SearchResult, PluginSettings, EmbeddingProviderInterface } from '../types';
import { QdrantClient } from '../qdrant/client';
import { CollectionManager } from '../qdrant/collection';
import { createEmbeddingProvider } from '../embeddings';

export class SearchModal extends Modal {
	private settings: PluginSettings;
	private qdrantClient: QdrantClient;
	private collectionManager: CollectionManager;
	private embeddingProvider: EmbeddingProviderInterface;
	private searchResults: SearchResult[] = [];
	private selectedIndex = 0;
	private searchInput: HTMLInputElement;
	private resultsContainer: HTMLDivElement;
	private statusBar: HTMLDivElement;
	private debounceTimeout: number | null = null;
	private isSearching = false;

	constructor(app: App, settings: PluginSettings) {
		super(app);
		this.settings = settings;

		// Initialize components
		this.qdrantClient = new QdrantClient(settings.qdrant);
		this.collectionManager = new CollectionManager(this.qdrantClient, settings, this.getVaultName());
		this.embeddingProvider = createEmbeddingProvider(settings);
	}

	onOpen() {
		const { contentEl } = this;
		contentEl.empty();

		// Create modal header
		const header = contentEl.createEl('div', { cls: 'qdrant-search-header' });
		header.createEl('h2', { text: 'Semantic Search' });

		// Create search input
		this.searchInput = contentEl.createEl('input', {
			type: 'text',
			placeholder: 'Enter your search query...',
			cls: 'qdrant-search-input'
		});

		// Create results container
		this.resultsContainer = contentEl.createEl('div', { cls: 'qdrant-search-results' });

		// Create status bar
		this.statusBar = contentEl.createEl('div', { cls: 'qdrant-search-status' });

		// Set up event listeners
		this.setupEventListeners();

		// Focus search input
		this.searchInput.focus();
	}

	onClose() {
		const { contentEl } = this;
		contentEl.empty();

		if (this.debounceTimeout) {
			clearTimeout(this.debounceTimeout);
		}
	}

	private setupEventListeners(): void {
		// Search input events
		this.searchInput.addEventListener('input', () => {
			this.handleSearchInput();
		});

		this.searchInput.addEventListener('keydown', (e) => {
			this.handleKeydown(e);
		});

		// Click outside to close
		this.contentEl.addEventListener('click', (e) => {
			if (e.target === this.contentEl) {
				this.close();
			}
		});
	}

	private handleSearchInput(): void {
		const query = this.searchInput.value.trim();

		if (query.length === 0) {
			this.clearResults();
			return;
		}

		// Debounce search
		if (this.debounceTimeout) {
			clearTimeout(this.debounceTimeout);
		}

		this.debounceTimeout = window.setTimeout(() => {
			this.performSearch(query);
		}, 300);
	}

	private handleKeydown(e: KeyboardEvent): void {
		switch (e.key) {
			case 'Escape':
				this.close();
				break;
			case 'Enter':
				if (this.searchResults.length > 0) {
					this.openSelectedResult();
				}
				break;
			case 'ArrowDown':
				e.preventDefault();
				this.selectNext();
				break;
			case 'ArrowUp':
				e.preventDefault();
				this.selectPrevious();
				break;
		}
	}

	private async performSearch(query: string): Promise<void> {
		if (this.isSearching) {
			return;
		}

		this.isSearching = true;
		this.updateStatus('Searching...');

		try {
			// Generate embedding for query
			const queryEmbedding = await this.embeddingProvider.embed([query]);

			// Search in Qdrant
			const results = await this.collectionManager.search(
				queryEmbedding[0],
				{
					topK: 20,
					scoreThreshold: 0.5
				}
			);

			this.searchResults = results;
			this.selectedIndex = 0;
			this.renderResults();

			if (results.length === 0) {
				this.updateStatus('No results found');
			} else {
				this.updateStatus(`Found ${results.length} results`);
			}

		} catch (error) {
			console.error('Search failed:', error);
			this.updateStatus('Search failed');
			new Notice('Search failed: ' + error.message);
		} finally {
			this.isSearching = false;
		}
	}

	private renderResults(): void {
		this.resultsContainer.empty();

		if (this.searchResults.length === 0) {
			this.resultsContainer.createEl('div', {
				cls: 'qdrant-search-no-results',
				text: 'No results found'
			});
			return;
		}

		this.searchResults.forEach((result, index) => {
			const resultEl = this.resultsContainer.createEl('div', {
				cls: `qdrant-search-result ${index === this.selectedIndex ? 'selected' : ''}`
			});

			// File path
			const pathEl = resultEl.createEl('div', { cls: 'qdrant-search-result-path' });
			pathEl.textContent = result.payload.path;

			// Title
			if (result.payload.title) {
				const titleEl = resultEl.createEl('div', { cls: 'qdrant-search-result-title' });
				titleEl.textContent = result.payload.title;
			}

			// Snippet
			const snippetEl = resultEl.createEl('div', { cls: 'qdrant-search-result-snippet' });
			snippetEl.textContent = this.generateSnippet(result);

			// Score
			const scoreEl = resultEl.createEl('div', { cls: 'qdrant-search-result-score' });
			scoreEl.textContent = `Score: ${result.score.toFixed(3)}`;

			// Click handler
			resultEl.addEventListener('click', () => {
				this.selectedIndex = index;
				this.renderResults();
				this.openSelectedResult();
			});

			// Hover handler
			resultEl.addEventListener('mouseenter', () => {
				this.selectedIndex = index;
				this.renderResults();
			});
		});
	}

	private generateSnippet(result: SearchResult): string {
		const text = this.getChunkText(result);
		const maxLength = 200;

		if (text.length <= maxLength) {
			return text;
		}

		// Try to find a good break point
		const halfLength = Math.floor(maxLength / 2);
		const start = Math.max(0, result.payload.chunk_start - halfLength);
		const end = Math.min(text.length, result.payload.chunk_end + halfLength);

		let snippet = text.substring(start, end);

		if (start > 0) {
			snippet = '...' + snippet;
		}
		if (end < text.length) {
			snippet = snippet + '...';
		}

		return snippet;
	}

	private getChunkText(result: SearchResult): string {
		// This would need to be implemented to get the actual text content.
		// For now, return a placeholder.
		return `Chunk ${result.payload.chunk_index} from ${result.payload.path}`;
	}

	private selectNext(): void {
		if (this.selectedIndex < this.searchResults.length - 1) {
			this.selectedIndex++;
			this.renderResults();
		}
	}

	private selectPrevious(): void {
		if (this.selectedIndex > 0) {
			this.selectedIndex--;
			this.renderResults();
		}
	}

	private openSelectedResult(): void {
		if (this.searchResults.length === 0) {
			return;
		}

		const selectedResult = this.searchResults[this.selectedIndex];
		const file = this.app.vault.getAbstractFileByPath(selectedResult.payload.path);

		if (file instanceof TFile) {
			// Open file
			this.app.workspace.openLinkText(selectedResult.payload.path, '');

			// Try to scroll to the chunk location
			setTimeout(() => {
				this.scrollToChunk(selectedResult);
			}, 500);
		}

		this.close();
	}

	private scrollToChunk(result: SearchResult): void {
		// This would need to be implemented to scroll to the specific chunk.
		// For now, just open the file.
		console.log('Would scroll to chunk:', result.payload.chunk_start, result.payload.chunk_end);
	}

	private clearResults(): void {
		this.searchResults = [];
		this.selectedIndex = 0;
		this.resultsContainer.empty();
		this.updateStatus('');
	}

	private updateStatus(message: string): void {
		this.statusBar.textContent = message;
	}

	private getVaultName(): string {
		return this.app.vault.getName();
	}
}
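The ArrowUp/ArrowDown handling in the modal reduces to a pure index update, which is easy to verify in isolation. A sketch (the helper name is illustrative, not part of the plugin):

```typescript
// Compute the next selected index for keyboard navigation,
// clamped to [0, resultCount - 1]; -1 means nothing can be selected.
function moveSelection(current: number, delta: -1 | 1, resultCount: number): number {
  if (resultCount === 0) return -1;
  return Math.min(resultCount - 1, Math.max(0, current + delta));
}
```

Extracting the clamping like this keeps the event handler free of edge-case logic (empty result lists, pressing ArrowUp on the first result).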
src/settings.ts (new file, 85 lines)
@@ -0,0 +1,85 @@
import { PluginSettings, EmbeddingProvider } from './types';

export const DEFAULT_SETTINGS: PluginSettings = {
	qdrant: {
		url: 'http://localhost:6333',
		apiKey: '',
		apiKeyHeader: 'api-key',
		collectionPrefix: 'vault'
	},
	embedding: {
		provider: EmbeddingProvider.OLLAMA,
		ollama: {
			url: 'http://localhost:11434',
			model: 'nomic-embed-text',
			batchSize: 10,
			maxConcurrency: 3
		},
		openai: {
			apiKey: '',
			model: 'text-embedding-3-small',
			batchSize: 100,
			maxConcurrency: 5
		}
	},
	indexing: {
		includePatterns: ['*.md', '*.txt', '*.pdf', '*.png', '*.jpg', '*.jpeg'],
		excludePatterns: ['*.tmp', '*.log'],
		maxFileSize: 10 * 1024 * 1024, // 10MB
		ignoredFolders: ['.obsidian', '.git', 'node_modules'],
		useTextExtractor: true,
		enableImageOCR: true
	},
	chunking: {
		targetTokens: 500,
		overlapTokens: 100,
		maxTokens: 800
	},
	enableGraphView: true,
	graphSimilarityThreshold: 0.7,
	graphMaxNodes: 100
};

export function validateSettings(settings: PluginSettings): string[] {
	const errors: string[] = [];

	// Qdrant validation
	if (!settings.qdrant.url) {
		errors.push('Qdrant URL is required');
	}

	// Embedding provider validation
	if (settings.embedding.provider === EmbeddingProvider.OPENAI) {
		if (!settings.embedding.openai.apiKey) {
			errors.push('OpenAI API key is required when using OpenAI provider');
		}
	}

	// Chunking validation
	if (settings.chunking.targetTokens <= 0) {
		errors.push('Target tokens must be positive');
	}
	if (settings.chunking.overlapTokens < 0) {
		errors.push('Overlap tokens cannot be negative');
	}
	if (settings.chunking.overlapTokens >= settings.chunking.targetTokens) {
		errors.push('Overlap tokens must be less than target tokens');
	}

	return errors;
}

export function sanitizeCollectionName(vaultName: string, modelName: string): string {
	// Remove special characters and replace with underscores
	const sanitizedVault = vaultName.replace(/[^a-zA-Z0-9]/g, '_');
	const sanitizedModel = modelName.replace(/[^a-zA-Z0-9]/g, '_');
	return `${sanitizedVault}_${sanitizedModel}`;
}

export function getCollectionName(settings: PluginSettings, vaultName: string): string {
	const modelName = settings.embedding.provider === EmbeddingProvider.OLLAMA
		? settings.embedding.ollama.model
		: settings.embedding.openai.model;

	return sanitizeCollectionName(vaultName, modelName);
}
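The collection-naming rule above (vault name plus model name, with every character outside `[a-zA-Z0-9]` replaced by an underscore) is easy to sanity-check against concrete inputs. A standalone re-implementation for illustration:

```typescript
// Mirror of the plugin's sanitisation rule: non-alphanumeric characters
// become underscores, then vault and model names are joined with '_'.
function collectionNameFor(vaultName: string, modelName: string): string {
  const clean = (s: string) => s.replace(/[^a-zA-Z0-9]/g, '_');
  return `${clean(vaultName)}_${clean(modelName)}`;
}
```

Because the model name is part of the collection name, switching embedding models effectively targets a different collection rather than mixing vectors of different dimensions in one.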
src/types.ts (new file, 160 lines)
@@ -0,0 +1,160 @@
import { TFile } from 'obsidian';

export enum EmbeddingProvider {
	OLLAMA = 'ollama',
	OPENAI = 'openai'
}

export interface QdrantSettings {
	url: string;
	apiKey?: string;
	apiKeyHeader?: string;
	collectionPrefix: string;
}

export interface OllamaSettings {
	url: string;
	model: string;
	batchSize: number;
	maxConcurrency: number;
}

export interface OpenAISettings {
	apiKey: string;
	model: string;
	batchSize: number;
	maxConcurrency: number;
}

export interface EmbeddingSettings {
	provider: EmbeddingProvider;
	ollama: OllamaSettings;
	openai: OpenAISettings;
}

export interface IndexingSettings {
	includePatterns: string[];
	excludePatterns: string[];
	maxFileSize: number; // in bytes
	ignoredFolders: string[];
	useTextExtractor: boolean;
	enableImageOCR: boolean;
}

export interface ChunkingSettings {
	targetTokens: number;
	overlapTokens: number;
	maxTokens: number;
}

export interface PluginSettings {
	qdrant: QdrantSettings;
	embedding: EmbeddingSettings;
	indexing: IndexingSettings;
	chunking: ChunkingSettings;
	enableGraphView: boolean;
	graphSimilarityThreshold: number;
	graphMaxNodes: number;
}

export interface ExtractedContent {
	text: string;
	metadata: ChunkMetadata;
	pageNumbers?: number[];
}

export interface ChunkMetadata {
	path: string;
	ext: string;
	mime: string;
	title: string;
	h1: string[];
	tags: string[];
	aliases: string[];
	links: string[];
	modified: number;
	created: number;
	model: string;
	chunk_index: number;
	chunk_start: number;
	chunk_end: number;
	page_no?: number;
	ocr?: boolean;
	fm: Record<string, any>; // frontmatter fields
}

export interface SearchResult {
	id: string;
	score: number;
	payload: ChunkMetadata;
	vector?: number[];
}

export interface SearchOptions {
	query: string;
	filters?: Record<string, any>;
	top_k: number;
	score_threshold?: number;
}

export interface IndexingProgress {
	totalFiles: number;
	processedFiles: number;
	totalChunks: number;
	processedChunks: number;
	currentFile?: string;
	errors: string[];
	isRunning: boolean;
}

export interface FileManifestEntry {
	mtime: number;
	size: number;
	hash: string;
	chunkCount: number;
	lastIndexed: number;
}

export interface GraphNode {
	id: string;
	path: string;
	title: string;
	type: string;
	x?: number;
	y?: number;
	size?: number;
	color?: string;
}

export interface GraphEdge {
	source: string;
	target: string;
	weight: number;
	similarity: number;
}

export interface GraphData {
	nodes: GraphNode[];
	edges: GraphEdge[];
}

export interface EmbeddingProviderInterface {
	embed(texts: string[]): Promise<number[][]>;
	getDimension(): Promise<number>;
	getName(): string;
}

export interface ExtractorInterface {
	canHandle(file: TFile): boolean;
	extract(file: TFile): Promise<ExtractedContent>;
}

export interface ChunkerInterface {
	chunk(content: ExtractedContent): Promise<ChunkMetadata[]>;
}

export interface IndexingQueueItem {
	file: TFile;
	action: 'create' | 'update' | 'delete';
	priority: number;
}
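`IndexingSettings.includePatterns` and `excludePatterns` use simple `*.ext` globs. A minimal matcher of the kind the indexer needs (a sketch, not the plugin's actual implementation) converts each glob to an anchored regex:

```typescript
// Convert a simple glob like "*.md" into an anchored, case-insensitive RegExp:
// escape regex metacharacters (except '*'), then let '*' match any run of characters.
function globToRegExp(glob: string): RegExp {
  const escaped = glob
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  return new RegExp(`^${escaped}$`, 'i');
}

// True if the file name matches at least one pattern.
function matchesAny(fileName: string, patterns: string[]): boolean {
  return patterns.some(p => globToRegExp(p).test(fileName));
}
```

With the default settings, a file would be indexed when it matches `includePatterns`, does not match `excludePatterns`, and sits outside `ignoredFolders`.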
src/ui/settingsTab.ts (new file, 468 lines)
@@ -0,0 +1,468 @@
import { App, PluginSettingTab, Setting, Notice } from 'obsidian';
import { PluginSettings, EmbeddingProvider } from '../types';
import { validateSettings } from '../settings';

export class QdrantSettingsTab extends PluginSettingTab {
	plugin: any;
	settings: PluginSettings;

	constructor(app: App, plugin: any) {
		super(app, plugin);
		this.plugin = plugin;
		this.settings = plugin.settings;
	}

	display(): void {
		const { containerEl } = this;
		containerEl.empty();

		containerEl.createEl('h2', { text: 'Qdrant Semantic Search Settings' });

		// Qdrant Settings
		this.createQdrantSettings(containerEl);

		// Embedding Settings
		this.createEmbeddingSettings(containerEl);

		// Indexing Settings
		this.createIndexingSettings(containerEl);

		// Chunking Settings
		this.createChunkingSettings(containerEl);

		// Graph Settings
		this.createGraphSettings(containerEl);

		// Actions
		this.createActionButtons(containerEl);
	}

	private createQdrantSettings(containerEl: HTMLElement): void {
		containerEl.createEl('h3', { text: 'Qdrant Configuration' });

		new Setting(containerEl)
			.setName('Qdrant URL')
			.setDesc('URL of your Qdrant instance (e.g., http://localhost:6333)')
			.addText(text => text
				.setPlaceholder('http://localhost:6333')
				.setValue(this.settings.qdrant.url)
				.onChange(async (value) => {
					this.settings.qdrant.url = value;
					await this.plugin.saveSettings();
				}));

		new Setting(containerEl)
			.setName('API Key')
			.setDesc('API key for Qdrant (leave empty for local instances)')
			.addText(text => text
				.setPlaceholder('Your API key')
				.setValue(this.settings.qdrant.apiKey || '')
				.onChange(async (value) => {
					this.settings.qdrant.apiKey = value;
					await this.plugin.saveSettings();
				}));

		new Setting(containerEl)
			.setName('API Key Header')
			.setDesc('Header name for API key authentication')
			.addText(text => text
				.setPlaceholder('api-key')
				.setValue(this.settings.qdrant.apiKeyHeader || 'api-key')
				.onChange(async (value) => {
					this.settings.qdrant.apiKeyHeader = value;
					await this.plugin.saveSettings();
				}));

		new Setting(containerEl)
			.setName('Collection Prefix')
			.setDesc('Prefix for collection names')
			.addText(text => text
				.setPlaceholder('vault')
				.setValue(this.settings.qdrant.collectionPrefix)
				.onChange(async (value) => {
					this.settings.qdrant.collectionPrefix = value;
					await this.plugin.saveSettings();
				}));

		new Setting(containerEl)
			.setName('Test Connection')
			.setDesc('Test connection to Qdrant')
			.addButton(button => button
				.setButtonText('Test')
				.onClick(async () => {
					try {
						const success = await this.plugin.testQdrantConnection();
						if (success) {
							new Notice('✅ Qdrant connection successful');
						} else {
							new Notice('❌ Qdrant connection failed');
						}
					} catch (error) {
						new Notice('❌ Qdrant connection failed: ' + error.message);
					}
				}));
	}

	private createEmbeddingSettings(containerEl: HTMLElement): void {
		containerEl.createEl('h3', { text: 'Embedding Configuration' });

		new Setting(containerEl)
			.setName('Embedding Provider')
			.setDesc('Choose your embedding provider')
			.addDropdown(dropdown => dropdown
				.addOption(EmbeddingProvider.OLLAMA, 'Ollama (Local)')
				.addOption(EmbeddingProvider.OPENAI, 'OpenAI (API)')
				.setValue(this.settings.embedding.provider)
				.onChange(async (value) => {
					this.settings.embedding.provider = value as EmbeddingProvider;
					await this.plugin.saveSettings();
					this.display(); // Refresh to show provider-specific settings
				}));

		if (this.settings.embedding.provider === EmbeddingProvider.OLLAMA) {
			this.createOllamaSettings(containerEl);
|
||||||
|
} else if (this.settings.embedding.provider === EmbeddingProvider.OPENAI) {
|
||||||
|
this.createOpenAISettings(containerEl);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
private createOllamaSettings(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h4', { text: 'Ollama Settings' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Ollama URL')
|
||||||
|
.setDesc('URL of your Ollama instance')
|
||||||
|
.addText(text => text
|
||||||
|
.setPlaceholder('http://localhost:11434')
|
||||||
|
.setValue(this.settings.embedding.ollama.url)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.ollama.url = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Model')
|
||||||
|
.setDesc('Ollama embedding model to use')
|
||||||
|
.addText(text => text
|
||||||
|
.setPlaceholder('nomic-embed-text')
|
||||||
|
.setValue(this.settings.embedding.ollama.model)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.ollama.model = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Batch Size')
|
||||||
|
.setDesc('Number of texts to process in each batch')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(1, 50, 1)
|
||||||
|
.setValue(this.settings.embedding.ollama.batchSize)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.ollama.batchSize = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Max Concurrency')
|
||||||
|
.setDesc('Maximum number of concurrent requests')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(1, 10, 1)
|
||||||
|
.setValue(this.settings.embedding.ollama.maxConcurrency)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.ollama.maxConcurrency = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Test Ollama Connection')
|
||||||
|
.setDesc('Test connection to Ollama')
|
||||||
|
.addButton(button => button
|
||||||
|
.setButtonText('Test')
|
||||||
|
.onClick(async () => {
|
||||||
|
try {
|
||||||
|
const success = await this.plugin.testOllamaConnection();
|
||||||
|
if (success) {
|
||||||
|
new Notice('✅ Ollama connection successful');
|
||||||
|
} else {
|
||||||
|
new Notice('❌ Ollama connection failed');
|
||||||
|
}
|
||||||
|
} catch (error) {
|
||||||
|
new Notice('❌ Ollama connection failed: ' + error.message);
|
||||||
|
}
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
private createOpenAISettings(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h4', { text: 'OpenAI Settings' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('API Key')
|
||||||
|
.setDesc('Your OpenAI API key')
|
||||||
|
.addText(text => text
|
||||||
|
.setPlaceholder('sk-...')
|
||||||
|
.setValue(this.settings.embedding.openai.apiKey)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.openai.apiKey = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Model')
|
||||||
|
.setDesc('OpenAI embedding model to use')
|
||||||
|
.addDropdown(dropdown => dropdown
|
||||||
|
.addOption('text-embedding-3-small', 'text-embedding-3-small')
|
||||||
|
.addOption('text-embedding-3-large', 'text-embedding-3-large')
|
||||||
|
.addOption('text-embedding-ada-002', 'text-embedding-ada-002')
|
||||||
|
.setValue(this.settings.embedding.openai.model)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.openai.model = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Batch Size')
|
||||||
|
.setDesc('Number of texts to process in each batch')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(1, 2048, 1)
|
||||||
|
.setValue(this.settings.embedding.openai.batchSize)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.openai.batchSize = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Max Concurrency')
|
||||||
|
.setDesc('Maximum number of concurrent requests')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(1, 10, 1)
|
||||||
|
.setValue(this.settings.embedding.openai.maxConcurrency)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.embedding.openai.maxConcurrency = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
private createIndexingSettings(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h3', { text: 'Indexing Configuration' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Include Patterns')
|
||||||
|
.setDesc('File patterns to include (one per line)')
|
||||||
|
.addTextArea(text => text
|
||||||
|
.setPlaceholder('*.md\n*.txt\n*.pdf')
|
||||||
|
.setValue(this.settings.indexing.includePatterns.join('\n'))
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.includePatterns = value.split('\n').filter(p => p.trim());
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Exclude Patterns')
|
||||||
|
.setDesc('File patterns to exclude (one per line)')
|
||||||
|
.addTextArea(text => text
|
||||||
|
.setPlaceholder('*.tmp\n*.log')
|
||||||
|
.setValue(this.settings.indexing.excludePatterns.join('\n'))
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.excludePatterns = value.split('\n').filter(p => p.trim());
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Max File Size (MB)')
|
||||||
|
.setDesc('Maximum file size to index')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(1, 100, 1)
|
||||||
|
.setValue(this.settings.indexing.maxFileSize / (1024 * 1024))
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.maxFileSize = value * 1024 * 1024;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Ignored Folders')
|
||||||
|
.setDesc('Folders to ignore (one per line)')
|
||||||
|
.addTextArea(text => text
|
||||||
|
.setPlaceholder('.obsidian\n.git\nnode_modules')
|
||||||
|
.setValue(this.settings.indexing.ignoredFolders.join('\n'))
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.ignoredFolders = value.split('\n').filter(p => p.trim());
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Use Text Extractor')
|
||||||
|
.setDesc('Use Text Extractor plugin for PDF and image text extraction')
|
||||||
|
.addToggle(toggle => toggle
|
||||||
|
.setValue(this.settings.indexing.useTextExtractor)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.useTextExtractor = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Enable Image OCR')
|
||||||
|
.setDesc('Enable OCR for images (requires Text Extractor plugin)')
|
||||||
|
.addToggle(toggle => toggle
|
||||||
|
.setValue(this.settings.indexing.enableImageOCR)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.indexing.enableImageOCR = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
private createChunkingSettings(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h3', { text: 'Chunking Configuration' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Target Tokens')
|
||||||
|
.setDesc('Target number of tokens per chunk')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(100, 1000, 50)
|
||||||
|
.setValue(this.settings.chunking.targetTokens)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.chunking.targetTokens = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Overlap Tokens')
|
||||||
|
.setDesc('Number of tokens to overlap between chunks')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(0, 200, 10)
|
||||||
|
.setValue(this.settings.chunking.overlapTokens)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.chunking.overlapTokens = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Max Tokens')
|
||||||
|
.setDesc('Maximum tokens per chunk (hard limit)')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(200, 2000, 100)
|
||||||
|
.setValue(this.settings.chunking.maxTokens)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.chunking.maxTokens = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
private createGraphSettings(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h3', { text: 'Graph Visualization' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Enable Graph View')
|
||||||
|
.setDesc('Enable graph visualization of document relationships')
|
||||||
|
.addToggle(toggle => toggle
|
||||||
|
.setValue(this.settings.enableGraphView)
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.enableGraphView = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Similarity Threshold')
|
||||||
|
.setDesc('Minimum similarity score for graph edges')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(0.1, 1.0, 0.1)
|
||||||
|
.setValue(this.settings.graphSimilarityThreshold)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.graphSimilarityThreshold = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Max Nodes')
|
||||||
|
.setDesc('Maximum number of nodes to display in graph')
|
||||||
|
.addSlider(slider => slider
|
||||||
|
.setLimits(10, 500, 10)
|
||||||
|
.setValue(this.settings.graphMaxNodes)
|
||||||
|
.setDynamicTooltip()
|
||||||
|
.onChange(async (value) => {
|
||||||
|
this.settings.graphMaxNodes = value;
|
||||||
|
await this.plugin.saveSettings();
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
private createActionButtons(containerEl: HTMLElement): void {
|
||||||
|
containerEl.createEl('h3', { text: 'Actions' });
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Full Reindex')
|
||||||
|
.setDesc('Reindex the entire vault')
|
||||||
|
.addButton(button => button
|
||||||
|
.setButtonText('Reindex Vault')
|
||||||
|
.setCta()
|
||||||
|
.onClick(async () => {
|
||||||
|
try {
|
||||||
|
new Notice('Starting full reindex...');
|
||||||
|
await this.plugin.indexFullVault();
|
||||||
|
new Notice('✅ Full reindex completed');
|
||||||
|
} catch (error) {
|
||||||
|
new Notice('❌ Full reindex failed: ' + error.message);
|
||||||
|
}
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Clear Index')
|
||||||
|
.setDesc('Clear all indexed data')
|
||||||
|
.addButton(button => button
|
||||||
|
.setButtonText('Clear Index')
|
||||||
|
.setWarning()
|
||||||
|
.onClick(async () => {
|
||||||
|
try {
|
||||||
|
await this.plugin.clearIndex();
|
||||||
|
new Notice('✅ Index cleared');
|
||||||
|
} catch (error) {
|
||||||
|
new Notice('❌ Failed to clear index: ' + error.message);
|
||||||
|
}
|
||||||
|
}));
|
||||||
|
|
||||||
|
new Setting(containerEl)
|
||||||
|
.setName('Index Statistics')
|
||||||
|
.setDesc('View current index statistics')
|
||||||
|
.addButton(button => button
|
||||||
|
.setButtonText('View Stats')
|
||||||
|
.onClick(async () => {
|
||||||
|
try {
|
||||||
|
const stats = await this.plugin.getIndexStats();
|
||||||
|
this.showStatsModal(stats);
|
||||||
|
} catch (error) {
|
||||||
|
new Notice('❌ Failed to get stats: ' + error.message);
|
||||||
|
}
|
||||||
|
}));
|
||||||
|
}
|
||||||
|
|
||||||
|
	private showStatsModal(stats: any): void {
		// Modal is imported from 'obsidian' and takes the App instance directly.
		const modal = new Modal(this.app);
		modal.titleEl.setText('Index Statistics');

		const content = modal.contentEl;
		content.empty();

		content.createEl('h4', { text: 'Collection Stats' });
		content.createEl('p', { text: `Points: ${stats.collectionStats.pointsCount}` });
		content.createEl('p', { text: `Vectors: ${stats.collectionStats.vectorsCount}` });
		content.createEl('p', { text: `Disk Size: ${(stats.collectionStats.diskSize / 1024 / 1024).toFixed(2)} MB` });

		content.createEl('h4', { text: 'Manifest Stats' });
		content.createEl('p', { text: `Files: ${stats.manifestStats.totalFiles}` });
		content.createEl('p', { text: `Chunks: ${stats.manifestStats.totalChunks}` });
		content.createEl('p', { text: `Total Size: ${(stats.manifestStats.totalSize / 1024 / 1024).toFixed(2)} MB` });

		content.createEl('h4', { text: 'Queue Stats' });
		content.createEl('p', { text: `Queue Length: ${stats.queueStats.queueLength}` });
		content.createEl('p', { text: `Processing: ${stats.queueStats.isProcessing ? 'Yes' : 'No'}` });

		modal.open();
	}
}
228
styles.css
@@ -1,8 +1,226 @@
/* Qdrant Semantic Search Plugin Styles */

.qdrant-search-header {
	margin-bottom: 1rem;
}

.qdrant-search-input {
	width: 100%;
	padding: 0.5rem;
	margin-bottom: 1rem;
	border: 1px solid var(--background-modifier-border);
	border-radius: 4px;
	background: var(--background-primary);
	color: var(--text-normal);
	font-size: 1rem;
}

.qdrant-search-input:focus {
	outline: none;
	border-color: var(--interactive-accent);
	box-shadow: 0 0 0 2px var(--interactive-accent-hover);
}

.qdrant-search-results {
	max-height: 400px;
	overflow-y: auto;
}

.qdrant-search-result {
	padding: 0.75rem;
	margin-bottom: 0.5rem;
	border: 1px solid var(--background-modifier-border);
	border-radius: 4px;
	background: var(--background-primary);
	cursor: pointer;
	transition: all 0.2s ease;
}

.qdrant-search-result:hover {
	background: var(--background-secondary);
	border-color: var(--interactive-accent);
}

.qdrant-search-result.selected {
	background: var(--interactive-accent);
	color: var(--text-on-accent);
	border-color: var(--interactive-accent);
}

.qdrant-search-result-path {
	font-size: 0.8rem;
	color: var(--text-muted);
	margin-bottom: 0.25rem;
}

.qdrant-search-result-title {
	font-weight: 600;
	margin-bottom: 0.5rem;
	color: var(--text-normal);
}

.qdrant-search-result.selected .qdrant-search-result-title {
	color: var(--text-on-accent);
}

.qdrant-search-result-snippet {
	font-size: 0.9rem;
	line-height: 1.4;
	margin-bottom: 0.5rem;
	color: var(--text-normal);
}

.qdrant-search-result.selected .qdrant-search-result-snippet {
	color: var(--text-on-accent);
}

.qdrant-search-result-metadata {
	font-size: 0.75rem;
	color: var(--text-muted);
	margin-bottom: 0.25rem;
}

.qdrant-search-result.selected .qdrant-search-result-metadata {
	color: var(--text-on-accent);
}

.qdrant-search-result-score {
	font-size: 0.75rem;
	color: var(--text-muted);
	font-weight: 500;
}

.qdrant-search-result.selected .qdrant-search-result-score {
	color: var(--text-on-accent);
}

.qdrant-search-no-results {
	text-align: center;
	padding: 2rem;
	color: var(--text-muted);
	font-style: italic;
}

.qdrant-search-status {
	font-size: 0.8rem;
	color: var(--text-muted);
	margin-top: 0.5rem;
	text-align: center;
}

/* Graph view styles */
.qdrant-graph-container {
	width: 100%;
	height: 100%;
	position: relative;
}

.qdrant-graph-svg {
	width: 100%;
	height: 100%;
}

.qdrant-graph-node {
	cursor: pointer;
	transition: all 0.2s ease;
}

.qdrant-graph-node:hover {
	stroke-width: 3px;
}

.qdrant-graph-edge {
	stroke: var(--text-muted);
	stroke-width: 1px;
	opacity: 0.6;
}

.qdrant-graph-edge:hover {
	stroke-width: 2px;
	opacity: 1;
}

.qdrant-graph-tooltip {
	position: absolute;
	background: var(--background-primary);
	border: 1px solid var(--background-modifier-border);
	border-radius: 4px;
	padding: 0.5rem;
	font-size: 0.8rem;
	pointer-events: none;
	z-index: 1000;
	box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
}

/* Settings styles */
.qdrant-settings-section {
	margin-bottom: 2rem;
}

.qdrant-settings-section h3 {
	margin-bottom: 1rem;
	color: var(--text-accent);
}

.qdrant-settings-section h4 {
	margin-bottom: 0.75rem;
	color: var(--text-normal);
}

/* Status bar styles */
.qdrant-status-bar {
	font-size: 0.8rem;
	color: var(--text-muted);
}

/* Progress indicator */
.qdrant-progress {
	width: 100%;
	height: 4px;
	background: var(--background-modifier-border);
	border-radius: 2px;
	overflow: hidden;
	margin: 0.5rem 0;
}

.qdrant-progress-bar {
	height: 100%;
	background: var(--interactive-accent);
	transition: width 0.3s ease;
}

/* Error styles */
.qdrant-error {
	color: var(--text-error);
	background: var(--background-modifier-error);
	border: 1px solid var(--text-error);
	border-radius: 4px;
	padding: 0.5rem;
	margin: 0.5rem 0;
}

/* Success styles */
.qdrant-success {
	color: var(--text-success);
	background: var(--background-modifier-success);
	border: 1px solid var(--text-success);
	border-radius: 4px;
	padding: 0.5rem;
	margin: 0.5rem 0;
}

/* Loading spinner */
.qdrant-spinner {
	display: inline-block;
	width: 16px;
	height: 16px;
	border: 2px solid var(--background-modifier-border);
	border-top: 2px solid var(--interactive-accent);
	border-radius: 50%;
	animation: qdrant-spin 1s linear infinite;
}

@keyframes qdrant-spin {
	0% { transform: rotate(0deg); }
	100% { transform: rotate(360deg); }
}