A new feature in Grok-1.5 is its ability to process long contexts of up to 128K tokens. This is 16 times the previous context length, enabling the model to recall and use information from substantially longer documents.
ORIGINAL LINK: https://x.ai/blog/grok-1.5