Understanding source material limits

Learn how the source material gauge helps you provide the AI with the optimal source materials to achieve the best results

Written by Philip Deng
Updated over a week ago

AI is pretty smart, and getting smarter every day, but you'll still achieve the best results if you add your own intelligence to the mix. One of the most impactful ways to get the best generative text outputs in the fewest attempts is to be careful with the source material you provide.


AI context windows

A context window, in AI terminology, can be thought of as the number of words a large language model (LLM) can think about at once when talking to you. It is the limit on how much information the model can work with in a single conversation.

As you interact with an LLM by prompting it, adding context, and receiving responses, you are filling up the window and getting closer to the limit of what the AI can effectively handle.

You may have noticed when using AI chatbots that as conversations get longer, response times tend to lengthen and the quality of the outputs can begin to degrade. This is because the amount of information the AI is processing may have exceeded what is optimal for that usage.

Right-sizing your source material

The source material you provide to the AI assistant in Grantable counts toward the total amount of text in the AI context window. This has a few important implications:

  1. The more source material you provide, the less the AI may be able to generate

  2. The more extraneous source material you provide, the more likely the AI is to include something unwanted

  3. The more extraneous source material you provide, the more likely you are to experience a slower response time

💡 Best practice: Strive to be economical and efficient with the source material you provide to the AI assistant—try to give it just what it needs to fulfill your request.

The LLM cannot exceed its context window limit, so providing it with too much information up front may cause it to throw an error, or prevent it from completing its response.
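Grantable tracks this for you automatically, but the budgeting idea behind the limit can be sketched in a few lines. This is an illustrative example only: the word limit below is a made-up number, the function names are hypothetical, and real LLMs count tokens rather than words, though the principle is the same.

```python
# Illustrative sketch only: the limit is a hypothetical number, and real
# LLMs measure context in tokens rather than words.
WORD_LIMIT = 8000  # assumed context-window budget, in words

def word_count(text: str) -> int:
    """Count whitespace-separated words in a piece of text."""
    return len(text.split())

def fits_in_window(prompt: str, sources: list[str], limit: int = WORD_LIMIT) -> bool:
    """Return True if the prompt plus all selected sources stay under the limit."""
    total = word_count(prompt) + sum(word_count(s) for s in sources)
    return total <= limit

# A short prompt plus two small source documents fits comfortably.
prompt = "Draft a response to the community impact question."
sources = [
    "Our mission is to expand access to the arts for every resident.",
    "Founded in 2010, the organization serves 3,000 residents each year.",
]
print(fits_in_window(prompt, sources))  # → True
```

The point of the sketch is that every selected source document spends part of a fixed budget, which is why de-selecting extraneous sources leaves more room for the AI's response.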

While modern AI systems are extremely good at understanding your requests, especially when they are specific and detailed, one of the best ways to ensure an AI uses the right source material is to minimize the amount of extra context you're giving it. This makes it less likely the AI will incorporate the wrong content in your outputs.

Source material progress meter

The size of the source documents in your content library can be difficult to gauge. To solve this problem, we've created a real-time progress meter that shows you exactly how much source material you're giving to the AI.

You'll see the progress meter throughout the app wherever you are able to interact with the AI.

In-line AI editor progress meter

There is a mini progress meter in the bottom left corner of the in-line AI text editor tool next to the count of source materials.

Clicking on the Sources button or the mini progress meter will reveal the controls to manage source materials, with a larger progress meter that will change as you select or de-select source material.

The size of each document and the total amount of source material are both measured in words.

AI assistant progress meter

The same progress meter is visible at the bottom of the chat window when using the AI assistant in a grant application. Once again, it is next to the count of actively referenced source materials.

Like in the in-line AI tool, clicking on the Sources button or the mini progress meter will reveal the controls to manage source materials, with a larger progress meter that will change as you select or de-select source material.
