Kimi K3 is reportedly set for a Q3 release with more than 2.5 trillion parameters, according to a post by "Daily Anxiety Emperor" on X. Internal tests have reportedly explored context lengths beyond 1 million tokens, though it is unclear whether this capability will ship to users; the main obstacle to offering 1M context is said to be compute cost rather than any technical limitation. The move follows DeepSeek V4 Flash/Pro's introduction of 1M context as a headline feature, potentially positioning Kimi K3 to match its competitor in both model scale and long-context capability.