
Searching local system offline #301

Open
Tortoise17 opened this issue Feb 19, 2024 · 5 comments

Comments

@Tortoise17

Is it possible to use this pipeline to search locally and offline inside the system, from specific folders and paths? Is such a setup/deployment possible? Please guide me if you can.

@nico-martin

I'm not sure what you want to achieve.
Could you maybe add an example of what your app/project should do and how WebLLM could help you?

@Tortoise17
Author

I want to run the LLM offline, and I also want it to provide search optimization over internal data and over LLM-vectorized interlinking.

@Tortoise17
Author

Tortoise17 commented Feb 20, 2024

Something like this:
search system optimization
But not only for a code base: for local database search optimization that works together with the LLM and coordinates with local folders.

@nico-martin
Copy link

I'm not fully convinced I get your question. But from the link you provided it seems you want to use an LLM to make sense of all the data in a local folder so you can ask questions or complete tasks based on that data, right?

The idea of WebLLM is that you can run Large Language Models in the browser. So you either provide a model that is fine-tuned for your use case (trained with the data you need), or you put the information into the prompt (context).

Since WebLLM runs as a browser application, it has very limited access to the file and folder structures on your device. You could take a look at the File System Access API; with it you could prepare the needed context and add it to your prompt. But from my understanding it won't be able to just remember all the necessary information and then generate the text you would expect.
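
As a rough illustration of that suggestion, here is a minimal sketch that reads text files from a user-picked local folder via the File System Access API and injects their contents as prompt context. The WebLLM calls assume the current @mlc-ai/web-llm API, the model ID is only an example, and helper names like buildContext/askAboutLocalFiles are hypothetical:

```ts
// Sketch: gather local file contents as context, then ask WebLLM about them.
// Assumes a browser that supports window.showDirectoryPicker() (File System
// Access API) and the @mlc-ai/web-llm package.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Hypothetical helper: let the user grant read access to one folder and
// concatenate its .txt files into a single context string.
async function buildContext(): Promise<string> {
  const dir = await (window as any).showDirectoryPicker();
  const parts: string[] = [];
  for await (const entry of dir.values()) {
    if (entry.kind === "file" && entry.name.endsWith(".txt")) {
      const file = await entry.getFile();
      parts.push(`--- ${entry.name} ---\n${await file.text()}`);
    }
  }
  return parts.join("\n\n");
}

async function askAboutLocalFiles(question: string): Promise<void> {
  const context = await buildContext();
  // Example model ID; pick one of the prebuilt models WebLLM actually ships.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC");
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "system", content: `Answer using only this context:\n${context}` },
      { role: "user", content: question },
    ],
  });
  console.log(reply.choices[0].message.content);
}
```

Note the limitation mentioned above still applies: everything has to fit into the prompt (or be chunked/retrieved per question), since the model itself does not persistently "remember" the folder contents.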

@Tortoise17
Author

Yes, you understood quite closely what I want to implement: make sense of the search input and of the data already available in local folders, and on demand interlink them and build relationships, for search optimization rather than text generation. Something like this.
