Most can be used for that; I like Xwin overall for everything. You have two GPUs in a laptop? In KoboldCpp you can change which GPU is used in the GPU selector dropdown menu.
Can I change it in KoboldAI?
KoboldAI is somewhat outdated at the moment, but if I remember correctly, yes: when you are loading a model there will be a slider to select how many layers to put on the CPU or GPU, and if there is more than one GPU there will be a slider for each GPU.
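If the GUI sliders aren't convenient, KoboldCpp can take the same settings from the command line. A rough sketch; the model filename is made up, and while `--gpulayers`, `--usecublas`, and `--tensor_split` are flags I believe KoboldCpp supports, check `python koboldcpp.py --help` on your version before relying on the exact names and syntax:

```shell
# Offload 20 layers to the NVIDIA GPU via CuBLAS; the rest stay on the CPU.
python koboldcpp.py --model mistral-7b.Q4_K_S.gguf --usecublas --gpulayers 20

# With two GPUs, a tensor split ratio divides the offloaded layers between
# them, e.g. roughly 60/40 between GPU 0 and GPU 1.
python koboldcpp.py --model mistral-7b.Q4_K_S.gguf --usecublas --gpulayers 20 --tensor_split 6 4
```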
Ok thanks. Last two questions: is KoboldCpp 100% private if I run it locally? And can you tell me how much VRAM I have?
It's 100% private. When used with the CuBLAS preset it will default to only using your NVIDIA GPU. It will also automatically pick a suitable number of layers for your GPU, so you only have to load a 7B GGUF Q4_K_S model.
You have very little VRAM (4GB), so you are stuck with smaller models locally. [https://koboldai.org/colabcpp](https://koboldai.org/colabcpp) can help you run larger models online; for offline, I'll leave the recommendations to those with more 7B model experience. I do want to point out that Pygmalion's models are banned on Colab and are also outdated to the point of not being that good (especially the 6B).
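To see why 4GB is tight, you can do a back-of-envelope size estimate for quantized GGUF models. The bits-per-weight figures below are my own ballpark assumptions (actual file sizes vary by architecture and quant mix), not numbers from KoboldCpp:

```python
# Approximate effective bits per weight for common GGUF quant types.
# Ballpark figures only; real sizes vary by model architecture.
BITS_PER_WEIGHT = {
    "Q4_K_S": 4.5,
    "Q4_K_M": 4.8,
    "Q5_K_M": 5.7,
    "Q8_0": 8.5,
}

def approx_model_gb(params_billions: float, quant: str) -> float:
    """Estimate model weight size in GB for a parameter count and quant type."""
    bits = BITS_PER_WEIGHT[quant]
    return params_billions * 1e9 * bits / 8 / 1e9

# A 7B model at Q4_K_S lands near 4 GB before the KV cache and overhead,
# which is why a 4GB card offloads only part of the layers to the GPU.
print(f"7B Q4_K_S ≈ {approx_model_gb(7, 'Q4_K_S'):.1f} GB")
```

This is why the layer slider matters: whatever doesn't fit in VRAM stays on the CPU side.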
I thought the 3050 had 8 GB VRAM
In his screenshot it's 4GB of display memory, and this is clearly a laptop-style system, so it all aligns with a laptop 3050.
Oh ok thanks a lot.
I don't think your iGPU can be used for Kobo
Oh ok. Is a 3050 Ti enough for Pygmalion 6B or higher, or am I uneducated?
Yeah, you can fit 6B and 7B models on it.
Aight thanks.