
Conversation

@sfc-gh-jcarroll

This is a bit awkward since I didn't fully work out the implications of using the Chat Engine / Agent paradigm in LlamaIndex, but it shows the gist, and it works. I'd love to do an example that streams responses too, but it's unclear whether that's easily possible.

Note: it installs the nightly build; it could instead use 1.26 once that's released.
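
For reference, here's a minimal sketch of what that pattern could look like, including one possible way to stream responses: a Streamlit chat UI on top of a LlamaIndex chat engine, iterating the generator returned by `stream_chat`. This is not the code in this PR; it assumes the pre-0.10 `llama_index` import path, an `OPENAI_API_KEY` in the environment, the default chat mode, and an illustrative `data` directory.

```python
import streamlit as st
from llama_index import VectorStoreIndex, SimpleDirectoryReader  # pre-0.10 import path

@st.cache_resource
def load_chat_engine():
    # Index local documents once and wrap the index in a chat engine
    # (default chat mode; swap in whatever mode the example settles on).
    docs = SimpleDirectoryReader("data").load_data()
    index = VectorStoreIndex.from_documents(docs)
    return index.as_chat_engine()

chat_engine = load_chat_engine()

if "messages" not in st.session_state:
    st.session_state.messages = []

# Replay prior turns so the conversation survives Streamlit reruns.
for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask a question about your documents"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)

    with st.chat_message("assistant"):
        # stream_chat returns a response whose response_gen yields tokens;
        # accumulate them into a placeholder for a simple streaming effect.
        placeholder = st.empty()
        streamed = chat_engine.stream_chat(prompt)
        full_response = ""
        for token in streamed.response_gen:
            full_response += token
            placeholder.write(full_response)

    st.session_state.messages.append({"role": "assistant", "content": full_response})
```

Whether this streaming approach plays nicely with every chat mode (e.g. the agent-based ones) is exactly the open question above.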

