Question

Will transformer models be the state-of-the-art on most natural language processing benchmarks on January 1, 2027?

[Chart: community prediction history, Mar 15 – May 10, ranging roughly 80%–100%]
Total Forecasters: 16
Community Prediction: 90% (85% – 98%)

Currently, transformer models such as Google's PaLM and the Microsoft–Nvidia Megatron-Turing NLG 530B lead performance leaderboards, demonstrating the dominance of the transformer architecture on NLP benchmarks.
Attention mechanisms have seen seven years of gradual innovation since the original 2017 transformer paper, showing that the architecture is still evolving and improving.
Despite these advancements, some non-transformer models, such as the RWKV-based Eagle 7B, have surpassed comparable traditional transformers on evaluation benchmarks, suggesting potential competition to transformer dominance.
In March 2023, a new state-of-the-art model was released almost every other day, indicating rapid advancement and intense competition among NLP models.
Opened: May 14, 2024
Closes: Dec 31, 2026
Scheduled resolution: Jan 2, 2027
Spot Scoring Time: May 17, 2024

