File Uploads passing all the data to LLM Node #5897
Unanswered
gururohan29312 asked this question in Q&A
Replies: 1 comment
This is a real issue: once file contents and MCP tool outputs both keep flowing forward, the LLM node ends up paying for context it does not actually need. If the useful result is already extracted upstream, the raw file payload should ideally stop there. A practical fix might be an explicit transformation step between the MCP stage and the LLM node that keeps only the structured result fields. That would give you a predictable context size and make the graph easier to reason about when multiple files are involved.
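As a sketch of what that transformation step could look like, here is a small filter you might drop into a custom function node between the MCP stage and the LLM node. The field names (`summary`, `extractedText`, `metadata`, `data`, `mimeType`) are assumptions for illustration, not Flowise's actual schema; adapt them to whatever your MCP tools actually return:

```javascript
// Hypothetical transformation step: keep only the structured result
// fields from an MCP tool output and drop raw file payloads (base64
// data, MIME types) so they never reach the LLM's context window.
// Field names below are assumed for illustration.

function keepStructuredResult(mcpOutput) {
  const KEEP = ["summary", "extractedText", "metadata"]; // assumed field names
  const result = {};
  for (const key of KEEP) {
    if (key in mcpOutput) result[key] = mcpOutput[key];
  }
  return result;
}

// Example: an MCP tool output that still carries the raw upload.
const imageToolOutput = {
  summary: "Invoice photo, total $42.10",
  data: "iVBORw0KGgoAAAANSUhEUg...", // base64 payload, dropped
  mimeType: "image/png",             // dropped
};

console.log(keepStructuredResult(imageToolOutput));
// { summary: 'Invoice photo, total $42.10' }
```

The same function can be applied to both the image and document tool outputs before their results are concatenated into the LLM prompt, so only the extracted text contributes to the 32k context budget.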
My use case: the user uploads one image and one document file in the initial chat. I have two MCP tool nodes, one that handles the image and another that handles the document. After that I pass the small outputs received from these two MCP tool nodes to the LLM node.
The issue is that Flowise is also sending all of the uploaded image and document data as base64 strings to the LLM node, and it errors out because the model in the LLM node has a 32k context window. I am using the Llama 3.1 8B Instruct model.
How do I stop this from happening? The MCPs have already consumed the image and document and extracted the required info, so I don't want the file upload data to get passed into the LLM node.
I tried disabling Memory in the LLM node, but that didn't help.
I want the user to be able to upload these two files (they can upload any image or document), so I can't use RAG either.
Please help me find a workaround for this.