Round 1 - Codellama 70B vs Mixtral MoE vs Mistral 7B for coding

william falcon · Intermediate · 🧠 Large Language Models · 2y ago
Codellama 70B dropped a few hours ago. I used it to help me write code to fine-tune a ResNet-50, then ran the same task through Mixtral MoE and Mistral 7B. TL;DR: Codellama is too safe... (prompt suggestions welcome). Mistral 7B was okay. Mixtral MoE crushed it.
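For reference, here's a minimal PyTorch sketch of the kind of fine-tuning code the task asks for. This is an illustrative assumption of the setup, not the exact code from the video; the class count, freezing strategy, and hyperparameters are all placeholders.

```python
# Illustrative sketch of the task given to the models: fine-tune a
# torchvision ResNet-50 on a new dataset. Not the video's actual code.
import torch
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained ResNet-50 and swap in a new classifier head
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
num_classes = 10  # hypothetical target dataset
model.fc = nn.Linear(model.fc.in_features, num_classes)

# Freeze the backbone so only the new head is trained
for param in model.parameters():
    param.requires_grad = False
for param in model.fc.parameters():
    param.requires_grad = True

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One training step on a dummy batch, just to show the loop shape
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```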
Watch on YouTube ↗