IFakeLab IQuest-Coder-V1 (Analysis)

Culprit: https://iquestlab.github.io/ / https://huggingface.co/IQuestLab

BLUF: I assess with near certainty that the IQuest-Coder models are a hybrid of LLaMA-3.1-70B's attention config with Qwen2.5-32B's dimensions and tokenizer. The "trained from scratch" claim is misleading and false. And while the models themselves were not pretrained by IFakeLab, the loop mechanism appears to be a Frankenstein combination of four papers. This is checkable directly from the published config files, as shown in the sketch below.
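A minimal verification sketch, not the author's method: diff the config.json fields and tokenizer vocab across the three repos. The IQuestLab repo id below is an assumption guessed from the HF org linked above, and meta-llama/Llama-3.1-70B is gated (requires an accepted license and an HF token).

```python
from transformers import AutoConfig, AutoTokenizer

REPOS = {
    "iquest": "IQuestLab/IQuest-Coder-V1",   # assumed repo id, adjust to the actual one
    "llama":  "meta-llama/Llama-3.1-70B",    # gated checkpoint
    "qwen":   "Qwen/Qwen2.5-32B",
}

# Architecture fields that fingerprint a base model fairly well.
FIELDS = [
    "hidden_size", "intermediate_size", "num_hidden_layers",
    "num_attention_heads", "num_key_value_heads",
    "rope_theta", "vocab_size", "tie_word_embeddings",
]

configs = {name: AutoConfig.from_pretrained(repo) for name, repo in REPOS.items()}

for field in FIELDS:
    row = {name: getattr(cfg, field, None) for name, cfg in configs.items()}
    # Flag which parent (if either) the suspect model matches exactly.
    matches = [p for p in ("llama", "qwen") if row["iquest"] == row[p]]
    print(f"{field:22} {row}  -> matches {matches or 'neither'}")

# Tokenizer check: an identical vocab mapping is strong evidence of reuse,
# since independently trained BPE tokenizers essentially never collide.
tok_iquest = AutoTokenizer.from_pretrained(REPOS["iquest"])
tok_qwen = AutoTokenizer.from_pretrained(REPOS["qwen"])
print("vocab identical to Qwen2.5:", tok_iquest.get_vocab() == tok_qwen.get_vocab())
```

If the suspect model's attention fields line up with LLaMA-3.1-70B while its dimensions and vocab line up with Qwen2.5-32B, that is exactly the hybrid pattern asserted above.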

Longer:

Here are the points on which this lab/model release is lying.

  1. Claims* of training "from sc...
