
Question

Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027?

Total Forecasters: 16
Community Prediction: 90% (80% - 96.5%)

At the time of writing, transformer models such as Google's PaLM and Megatron-Turing NLG 530B (NVIDIA and Microsoft) lead most NLP benchmark leaderboards, illustrating the current dominance of the transformer architecture.
Despite these advancements, some non-transformer models, such as the RWKV-based Eagle 7B, have matched or surpassed comparably sized transformers on several evaluation benchmarks, suggesting potential competition to transformer dominance.
In March 2023, a new state-of-the-art model was being released almost every other day, reflecting the rapid pace of advancement and competition among NLP models.
The attention mechanism in transformer models computes a score for every pair of tokens, so its cost grows quadratically with the sequence length; this quadratic scaling is the main computational bottleneck for transformers on long inputs.
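As a minimal illustration of where the quadratic cost comes from (the function name, shapes, and toy data below are illustrative assumptions, not taken from any particular model or library), here is single-head scaled dot-product attention in NumPy; the (n, n) score matrix is the term that grows quadratically with sequence length.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Single-head scaled dot-product attention (illustrative sketch).

    Q, K, V: arrays of shape (n_tokens, d_model). The score matrix
    below has shape (n_tokens, n_tokens), so both time and memory
    grow quadratically with the sequence length.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n, n) pairwise token scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ V                                # (n, d_model) weighted values

# Toy usage: doubling n_tokens quadruples the size of the score matrix.
n, d = 8, 4
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (8, 4)
```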
Opened: May 14, 2024
Closes: Dec 31, 2026
Scheduled resolution: Jan 2, 2027
