Will any open-source Transformers LLM that functions as a dense mixture of experts be released by end of 2024?
3 · Ṁ18 · Jan 1 · 50% chance

Will any open-source or open-weights Transformers-based LLM emerge that is functionally a dense version of a mixture-of-experts model, where the empirical mathematical sparsity resembles that of dense models like Llama 3.1 405B or Mistral Large Enough?

A tool that makes it possible to create this type of model would also resolve YES, even if no model is released along with it, as long as such a model can actually be built with it (for example, Mergekit, which supports various forms of model manipulation). A paper would resolve YES only if accompanied by a released model, functional code, or a third-party implementation.
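To make the "dense version of a mixture of experts" idea concrete, here is a minimal sketch of one way such a model could be produced: collapsing an MoE layer's parallel expert feed-forward weights into a single dense layer by averaging them, optionally weighted by how often the router selects each expert. This is illustrative only; the function name `densify_moe_layer`, the shapes, and the weighting scheme are assumptions, not Mergekit's actual API or any specific model's architecture.

```python
# Hypothetical sketch: merge N parallel expert Linear layers into one
# dense Linear layer of the same shape (a "densified" MoE layer).
import torch
import torch.nn as nn


def densify_moe_layer(expert_mlps: list[nn.Linear],
                      router_usage: torch.Tensor | None = None) -> nn.Linear:
    """Average N identically shaped expert layers into one dense layer.

    router_usage: optional length-N tensor of average routing weights;
    defaults to a uniform average over experts.
    """
    n = len(expert_mlps)
    if router_usage is None:
        router_usage = torch.full((n,), 1.0 / n)
    else:
        router_usage = router_usage / router_usage.sum()  # normalize to 1

    merged = nn.Linear(expert_mlps[0].in_features,
                       expert_mlps[0].out_features,
                       bias=expert_mlps[0].bias is not None)
    with torch.no_grad():
        # Weighted average of expert weight matrices (and biases).
        merged.weight.copy_(sum(w * e.weight
                                for w, e in zip(router_usage, expert_mlps)))
        if merged.bias is not None:
            merged.bias.copy_(sum(w * e.bias
                                  for w, e in zip(router_usage, expert_mlps)))
    return merged


# Example: collapse 8 experts into a single dense layer of the same shape.
experts = [nn.Linear(4096, 14336) for _ in range(8)]
dense = densify_moe_layer(experts)
print(dense.weight.shape)  # torch.Size([14336, 4096])
```

The resulting layer activates all parameters on every token, which is the sparsity profile this question asks about; whether such a merge preserves quality is exactly what a qualifying release or tool would have to demonstrate.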
