MANIFOLD
Will I believe in 1 year that DeepSeek R1 was substantially trained via distillation of a US model?
Ṁ148 · Jan 1 · 64% chance

#DeepSeek · #Artificial Intelligence · #Machine Learning
1 Comment

You should probably clarify what you mean by distillation since so many people are using different definitions.

Related questions

will deepseek-v4 destroy all other models? (15% chance)
will DeepSeek become a closed AI lab by EOY? (12% chance)
Did DeepSeek violate OpenAI's terms of service by using OpenAI model outputs for distillation in 2024 or January 2025? (9% chance)
V4 (DeepSeek) release date
Will DeepSeek's next reasoning model be called R3? (1% chance)
Will OpenAI’s claims that DeepSeek is a distillation of their models become the consensus view? (17% chance)
Will DeepSeek R2 be open source? (93% chance)
Will there be an open replication of DeepSeek v3 for <$10m? (41% chance)
Did DeepSeek lie about the GPU compute budget they used in the training of v3? (5% chance)
R2 (DeepSeek) release date
