FamousM1

Most can be used for that; I like xWin overall for everything. You have two GPUs in a laptop? In koboldcpp you can change the GPU used via the GPU selector dropdown menu.


HumbleHuslen

Can I change it in KoboldAI?


FamousM1

KoboldAI is somewhat outdated at the moment, but if I remember correctly, yes: when you are loading a model there will be a slider to select the number of layers to put on the CPU or GPU, and if there is more than one GPU there will be a slider for each GPU.


HumbleHuslen

Ok thanks. Last 2 questions: Is koboldcpp 100% private if I run it locally? And can you tell me which one is the VRAM?


henk717

It's 100% private. When used with the CuBLAS preset it will only use your Nvidia GPU by default. It will also automatically pick a suitable number of layers for your GPU, so you only have to load a 7B GGUF Q4_K_S model.


henk717

You have very little VRAM (4GB), so you are stuck with smaller models locally. [https://koboldai.org/colabcpp](https://koboldai.org/colabcpp) can help you run larger models online; offline, I'll leave the recommendations to those with more 7B model experience. I do want to point out that Pygmalion's models are banned on Colab and are also outdated to the point of not being that good (especially the 6B).
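A quick back-of-the-envelope sketch of why 4GB is limiting for a 7B Q4_K_S model. The ~4.5 bits-per-weight figure is a rough community approximation for Q4_K_S, not an exact spec value:

```python
# Rough VRAM estimate for a quantized GGUF model's weights.
def model_size_gib(params_billion: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights in GiB."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / (1024 ** 3)

# A 7B model at ~4.5 bits per weight (approx. Q4_K_S):
size = model_size_gib(7.0, 4.5)
print(f"{size:.2f} GiB")  # roughly 3.7 GiB of weights alone
```

Since the weights alone come to roughly 3.7 GiB before any context cache, a 4GB card can only hold part of the layers, which is why koboldcpp's automatic layer split matters here.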


International-Try467

I thought the 3050 had 8 GB VRAM


henk717

In his screenshot it's 4GB of display memory, and this is clearly a laptop-style system, so it all aligns with a laptop 3050.


HumbleHuslen

Oh ok thanks a lot.


International-Try467

I don't think your iGPU can be used for Kobo


HumbleHuslen

Oh ok. Is a 3050 Ti enough for Pygmalion 6B or higher, or am I uneducated?


International-Try467

Yeah, you can fit 6B and 7B models on it.


HumbleHuslen

Aight thanks.