@google
An AI model’s context window is the range of tokens — small building blocks of information, such as parts of words, images, or videos — that the model can take in at once. The longer the context window, the more information the model can process, and the more useful and relevant its responses become. Google DeepMind engineers increased the context window for Gemini 1.5 Pro from 32,000 to 1 million tokens — giving it the longest context window yet of any model of its size. Tap the link in our bio to learn more. #GeminiAI
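
As a rough illustration of what a token count means in practice, here is a minimal sketch (not part of the original post) that measures how many tokens a prompt consumes using the google-generativeai Python SDK; the model name, API key handling, and prompt text are assumptions for the example.

```python
# Minimal sketch: checking a prompt's token count against a model's context window.
# Assumes the google-generativeai SDK is installed and GOOGLE_API_KEY is set in the
# environment; the model name and prompt below are illustrative placeholders.
import os

import google.generativeai as genai

genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

model = genai.GenerativeModel("gemini-1.5-pro")

prompt = "Summarize the key arguments of the report pasted below ..."

# count_tokens reports how many tokens the prompt would consume, so you can
# verify it fits within the context window (up to 1 million tokens for
# Gemini 1.5 Pro, per the post above).
token_info = model.count_tokens(prompt)
print(f"Prompt uses {token_info.total_tokens} tokens")
```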
