Question
Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027?
Total Forecasters16
Community Prediction90% (80% - 96.5%)
Attention mechanisms have seen seven years of gradual innovation since the transformer was introduced, showing that the architecture is still evolving and improving.
Currently, PaLM and Megatron-Turing NLG 530B (NVIDIA and Microsoft) lead the performance charts, demonstrating the dominance of transformer models on NLP benchmarks.
Authors:
Opened:May 14, 2024
Closes:Dec 31, 2026
Scheduled resolution:Jan 2, 2027