
[Bug]: demo for pinecone support broken #13347

Closed
jzhao62 opened this issue May 8, 2024 · 4 comments
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@jzhao62

jzhao62 commented May 8, 2024

Bug Description

Follow the example code for Pinecone support.

Cause:

  • attempting to instantiate the abstract class BaseNode directly
    [screenshot: the ValidationError shown in the traceback below]

Remedy:

  • Changing the node type from BaseNode to TextNode makes the demo work
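The mechanism behind the remedy can be illustrated without llama-index at all: Python's abc machinery refuses to instantiate a class that still has unimplemented abstract methods, which is exactly what the pydantic ValidationError below is surfacing. A minimal sketch, using stand-in class names (AbstractNode and ConcreteTextNode are illustrative, not llama-index's real BaseNode/TextNode):

```python
from abc import ABC, abstractmethod


class AbstractNode(ABC):
    """Stand-in for an abstract base class like BaseNode."""

    @abstractmethod
    def get_content(self) -> str: ...


class ConcreteTextNode(AbstractNode):
    """Stand-in for a concrete subclass like TextNode."""

    def __init__(self, text: str):
        self.text = text

    def get_content(self) -> str:
        return self.text


# Instantiating the abstract base fails, just like in the traceback.
try:
    AbstractNode()
except TypeError as e:
    print("abstract base rejected:", e)

# The concrete subclass works fine.
node = ConcreteTextNode("hello")
print(node.get_content())
```

Switching the demo to a concrete node type sidesteps the error for the same reason the sketch's ConcreteTextNode can be instantiated while AbstractNode cannot.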

Version

0.10.34

Steps to Reproduce

  1. Follow the example code for Pinecone support.
  2. An error is raised when using the query engine.

Relevant Logs/Tracebacks

- Traceback

Traceback (most recent call last):
  File "/home/jingyi/PycharmProjects/llama/llama-pinecone.py", line 37, in <module>
    response = query_engine.query("What did the author do growing up?")
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 274, in wrapper
    result = func(*args, **kwargs)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/base/base_query_engine.py", line 53, in query
    query_result = self._query(str_or_query_bundle)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 274, in wrapper
    result = func(*args, **kwargs)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/query_engine/retriever_query_engine.py", line 189, in _query
    nodes = self.retrieve(query_bundle)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/query_engine/retriever_query_engine.py", line 144, in retrieve
    nodes = self._retriever.retrieve(query_bundle)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 274, in wrapper
    result = func(*args, **kwargs)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/base/base_retriever.py", line 244, in retrieve
    nodes = self._retrieve(query_bundle)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 274, in wrapper
    result = func(*args, **kwargs)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 101, in _retrieve
    return self._get_nodes_with_embeddings(query_bundle)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 180, in _get_nodes_with_embeddings
    return self._build_node_list_from_query_result(query_result)
  File "/home/jingyi/PycharmProjects/llama/.venv/lib/python3.10/site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 171, in _build_node_list_from_query_result
    node_with_scores.append(NodeWithScore(node=node, score=score))
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for NodeWithScore
node
  Can't instantiate abstract class BaseNode with abstract methods get_content, get_metadata_str, get_type, hash, set_content (type=type_error)
@jzhao62 jzhao62 added bug Something isn't working triage Issue needs to be triaged/prioritized labels May 8, 2024

dosubot bot commented May 8, 2024

The error you're encountering with the NodeWithScore object instantiation in the Pinecone support demo is due to the BaseNode class being abstract and not meant to be instantiated directly. The NodeWithScore class requires a concrete implementation of BaseNode that provides implementations for the abstract methods mentioned in the error (get_content, get_metadata_str, get_type, hash, set_content).

To resolve this issue, ensure that when you're creating a NodeWithScore object, the node parameter is an instance of a subclass of BaseNode that implements all the required abstract methods. This subclass could be a custom class you define, which extends BaseNode and provides concrete implementations for the abstract methods.

Here's a brief example of how you might define such a subclass and use it to instantiate a NodeWithScore object:

from llama_index.core.schema import BaseNode, NodeWithScore

class MyConcreteNode(BaseNode):
    def get_content(self):
        # Implement method
        pass

    def get_metadata_str(self):
        # Implement method
        pass

    def get_type(self):
        # Implement method
        pass

    def hash(self):
        # Implement method
        pass

    def set_content(self, content):
        # Implement method
        pass

# Assuming `NodeWithScore` expects an instance of `BaseNode` and a score
node = MyConcreteNode()
score = 0.5  # Example score
node_with_score = NodeWithScore(node=node, score=score)

Make sure that any class you use as a BaseNode has all the necessary methods implemented. This should help you bypass the instantiation error and use the query engine as intended.

If you're following an example or documentation, double-check that you're using the correct class types as required by the framework or library you're working with. This approach ensures that the objects you create are compatible with the operations you intend to perform on them.


@logan-markewich
Collaborator

@jzhao62 not an issue with llama-index, there are no abstract methods that aren't implemented. If that was the cause none of our unit tests would pass.

More likely, it's an env issue. Try with a completely fresh venv, or in Google colab.

Locally, I would do this in a new terminal

# remove any global install
pip freeze | grep llama-index | xargs pip uninstall -y

# install in a fresh env 
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-vector-stores-pinecone
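A quick way to check whether a stale or mixed install is the culprit is to see which file each suspect module actually resolves from. This diagnostic helper is my own addition, not part of the thread's original commands; the llama_index module names passed at the bottom are the ones relevant to this issue:

```python
import importlib


def module_origin(name: str) -> str:
    """Return the file a module resolves from, or a short error message."""
    try:
        return importlib.import_module(name).__file__ or "<builtin>"
    except ImportError as e:
        return f"not importable: {e}"


# If both core and legacy schemas resolve, mixed imports are possible
# and node classes from the two trees will not be interchangeable.
print("llama_index.core.schema   ->", module_origin("llama_index.core.schema"))
print("llama_index.legacy.schema ->", module_origin("llama_index.legacy.schema"))
```

If the two paths point into different site-packages directories (e.g. a global install and a venv), the fresh-venv steps above should clear it up.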

@jzhao62
Author

jzhao62 commented May 9, 2024

@logan-markewich I switched back to 0.9.34 and it works; I might be using some packages from the .legacy module.

@jzhao62 jzhao62 closed this as completed May 9, 2024
@logan-markewich
Collaborator

@jzhao62 yea, mixing legacy with non-legacy imports will also cause this. Need to use all legacy imports or none
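Why mixing legacy and non-legacy imports triggers this exact error: two classes with identical shapes are still distinct types, so a validator expecting one rejects the other. A minimal sketch, where CoreNode and LegacyNode are illustrative stand-ins for the same-named node class imported from llama_index.core versus llama_index.legacy:

```python
class CoreNode:
    """Stand-in for a node class imported from the core package."""

    def __init__(self, text: str):
        self.text = text


class LegacyNode:
    """Structurally identical stand-in imported from the legacy package."""

    def __init__(self, text: str):
        self.text = text


def validate(node):
    """Mimics pydantic's type check on NodeWithScore's `node` field."""
    if not isinstance(node, CoreNode):
        raise TypeError(f"expected CoreNode, got {type(node).__name__}")
    return node


validate(CoreNode("ok"))  # passes: right class, right package

try:
    validate(LegacyNode("oops"))  # rejected, like the ValidationError above
except TypeError as e:
    print(e)
```

Using all-legacy or all-core imports keeps every node on one side of that isinstance check.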
