[Bug]: map_httpcore_exceptions while invoking query_engine.query in colab #12670
Comments
Hey @sunilnagpal, great to see you back! Hope you're doing well despite this pesky connection error giving you a bit of a headache. Let's see what we can do to sort it out. Based on the error message you've shared, it seems like you're encountering a network-related issue. This error is typically not directly related to the LlamaIndex code, but rather to the network settings in your Google Colab environment. Here are some general steps you can take to address this issue:
Regarding your code snippet, it seems correct. However, a similar issue was reported in the past where the user was not correctly loading their documents. Make sure you're loading them with `SimpleDirectoryReader`:

```python
from llama_index import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader(input_files=["test2.txt"]).load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
query_engine.query("what is my name?")
```

You can find more details about this issue here. Lastly, please note that there are known limitations with using LlamaIndex in a Google Colab environment: the Python version shipped by Colab may not match the version your LlamaIndex release requires, in which case LlamaIndex may not be compatible with the default Colab runtime. You can find more details about this limitation in the BUILD file. I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Sources
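The Python-version caveat above can be checked directly inside the Colab runtime. Below is a minimal sketch; the minimum version tuple is an assumption, so confirm the actual requirement against your llama-index release's package metadata (e.g. `pip show llama-index`):

```python
import sys

# Hypothetical minimum version; check your llama-index release's
# requirements before relying on this exact value.
MIN_PYTHON = (3, 8)

def python_version_ok(min_version=MIN_PYTHON):
    """Return True if the current interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

print(sys.version.split()[0], "OK" if python_version_ok() else "too old")
```

Running this in a fresh Colab cell before `pip install llama-index` saves a confusing failure later if the runtime's Python is too old.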
Thanks Dosu, but this didn't resolve the issue. Also, why is a network library needed while querying?
Based on the error message you provided, it seems like you're encountering a network-related issue when trying to perform a query with LlamaIndex in a Google Colab environment. This error typically occurs when an application tries to bind or connect to an IP address and port that is not available or not allowed. In the context of Google Colab, this could be due to restrictions on outbound connections or the specific way networking is handled within the Colab environment.

As for the role of network libraries in the querying process: during a query, LlamaIndex performs specific network operations such as asynchronous querying of contributors, gathering results, response synthesis, and callback management.

To resolve this issue, you might want to consider the following steps:
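Since `query_engine.query` ultimately issues an HTTP request to whatever LLM endpoint is configured (via httpx/httpcore, which is where `map_httpcore_exceptions` in the traceback comes from), a quick standard-library probe can tell you whether that endpoint is reachable at all. This is only a diagnostic sketch; the URL shown is Ollama's default local address and is an assumption, so substitute whatever backend your setup actually points at:

```python
import urllib.error
import urllib.request

def endpoint_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP request to `url` gets any response at all."""
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except urllib.error.HTTPError:
        # The server answered (even with a 4xx/5xx status), so the
        # network path itself works.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: the same class of
        # failure that httpx surfaces as ConnectError.
        return False

# Default Ollama address; an assumption, adjust for your setup.
print(endpoint_reachable("http://127.0.0.1:11434"))
```

If this prints `False` in the same environment where the query fails, the problem is connectivity to the LLM backend, not LlamaIndex itself.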
I also found a similar issue in the LlamaIndex repository where the user encountered a problem while querying in a Colab environment. The suggested solution was to correct the document-loading argument: since the input is a file, use the `input_files` argument:

```python
from llama_index import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader(input_files=["test2.txt"]).load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
query_engine.query("what is my name?")
```

You can find more details about this issue here. I hope this helps! If you have any other questions or need further clarification, feel free to ask.

Sources
Tried it on my local system and got a similar network error:
Hello, I would appreciate assistance with the above, please. Thanks.
Have you been able to resolve this issue? I am getting the same connection error.
No, @Shahik15, it remains unresolved and without support so far...
@jerryjliu Can you please help here if possible? Thanks.
Experiencing the same issue; it works well in a local notebook but not on Google Colab...
@kumailkhan1, can you share the code for your local notebook? I am getting the connection error in a local PyCharm notebook as well.
Can't share the exact code, but it is almost identical to what you shared...
It's been 2 weeks since the first issue report, but I'm still awaiting the attention of the repo maintainers. I'd appreciate any input on whom to tag to catch their attention.
@logan-markewich, is it possible for you to help bring this to someone's attention?
This LlamaIndex is unbelievable; it has so many problems. [translated from Chinese]
Hi! Any updates on this issue? |
I was facing the same issue running Ollama and LlamaIndex in Docker. I added the base URL in this line to connect to the Ollama Docker instance, and this error did not appear again. I hope it helps.
In my case the base_url was
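For reference, the kind of change described above looks roughly like this. This is a configuration sketch, not the poster's exact code: it assumes the `llama-index-llms-ollama` integration and a Docker Compose service named `ollama`; your service name, port, and model will likely differ.

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# When Ollama runs in a separate Docker container, point base_url at that
# service instead of the default http://localhost:11434.
Settings.llm = Ollama(model="llama2", base_url="http://ollama:11434")
```

With `Settings.llm` configured this way, `query_engine.query(...)` sends its HTTP requests to the container's address rather than to localhost, which is what resolves the connection error in a Dockerized setup.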
Bug Description
New to llama_index. Thank you for all the efforts towards developing and maintaining it. I'm trying to reproduce the starter example in Colab using my own tabulated data. It works fine until I try to query.
The following works fine -
An error is thrown when trying to query:

```python
response = query_engine.query("What did the author write about?")
```

I have shared the error in the logs section.
Version
llama-index-0.10.27
Steps to Reproduce
Trying to reproduce the local starter example from https://docs.llamaindex.ai/en/stable/getting_started/starter_example_local/ in Colab.
Relevant Logs/Tracebacks