@jankais3r You were right: the 65B model failed. The process was killed. Here’s also an asitop screenshot from 5 minutes into the process.
The swap file eventually grew beyond 70 GB.
I’ll try the 30B. https://t.co/dZEgs8Ssrk
To use yours, do I have to repeat the quantization process I did for llama.cpp? https://t.co/cIbw4aYhcm
This week’s free edition is titled “Issue #2 – Law firms’ morale at an all-time high now that they can use AI to generate evil plans to charge customers more money”
Read it here: https://t.co/v2Y8o5706M https://t.co/qX6SmyYfMN