AI = Actually Indonesians
NBVf7m
No.709
It was pinoys but same thing tbh
The CEO is getting charged with defrauding his investors by using humans while claiming it was AI.
AFZxCh
No.710
reply to my thread about running LLMs locally.
NBVf7m
No.711
>>710
you are not gonna be able to run a 67-billion-parameter model on any laptop (i am not sure about macbooks).
Get an RTX 3060 or 4060 with the highest VRAM you can manage and then maybe you have a chance.
However, smaller 7-billion-parameter models can in theory be run.
I don't do it myself; another anon created a thread about it, so find that thread and ask him desu.
On that note, good idea. We should probably start a general dedicated to local models, what do you say sirs?
Start by grabbing bits of info from 4chan /g/ and LocalLLaMA
On /g/ there's /aicg/ and /lmg/
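A rough sanity check on the sizes being thrown around (a sketch; the 20% overhead fudge factor is my own assumption, real KV cache and runtime costs vary):

```python
def model_size_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate memory needed for a model's weights.

    Weights dominate memory use; KV cache and runtime overhead
    are folded into a rough 20% fudge factor (assumption).
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * 1.2 / 1e9  # GB, with overhead

# 67B at fp16: way beyond any laptop
print(round(model_size_gb(67, 16)))   # ~161 GB
# 7B quantized to 4-bit: fits in ~4-5 GB
print(round(model_size_gb(7, 4), 1))  # ~4.2 GB
```

So the 67B/laptop verdict above checks out, while a 4-bit 7B is within reach of a midrange card or even system RAM.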
7zK+rK
No.712
>>710
Just run them in Google Colab. It won't be possible on a laptop.
AFZxCh
No.713
AFZxCh
No.714
>>712
I want to actually use it for daily tasks.
I found this on twitter saying that it can run models. If any anon knows, kindly elaborate on what this thing is.
NBVf7m
No.715
>>714
>https://vicharak.in/axon
I have been interested in them for a while, maybe i will get one of their products in the future.
I think their stuff starts at 6k or something, not bad.
7zK+rK
No.716
7zK+rK
No.717
Seems more like a buffed-up Raspberry Pi than a device for running AI models.
NBVf7m
No.718
>>716
Not sure about the price, but there's one product which is like a Raspberry Pi, though it's not the only one.
NBVf7m
No.719
>>717
yeah
NBVf7m
No.720
if you have an RTX 3050, a decent chunk of RAM (i have like 64GB) and a decent processor, you can run many LLMs with around 7bn parameters or so.
maybe some image generation models like Stable Diffusion etc. too.
AFZxCh
No.721
>>720
well, i have 16GB DDR4, an i5 11th gen and an RTX 2050 with 4GB VRAM. what can i even do with this? also i think my gpu will work better for anything like this.
AFZxCh
No.722
>>720
>Simpler or less detailed responses: While a 10B model can handle many tasks very well, it may sometimes lack the richer, more layered answers that come from a model with hundreds of billions of parameters.
>Reduced performance on very complex or highly technical queries: The distilled model might occasionally struggle with the most challenging reasoning tasks compared to its larger counterpart.
what do you think about this, or what's your experience with it?
NBVf7m
No.723
>>721
try 7bn-parameter models - mistral, deepseek, meta all probably have those.
iirc deepseek has a 1bn-parameter model but it's mostly useless
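For the 4GB VRAM card above, a back-of-envelope layer-offload estimate (llama.cpp-style split between GPU and system RAM; every constant here is a rough assumption, not a measured value: ~32 transformer layers in a typical 7B, ~0.11 GB per layer at 4-bit, ~0.8 GB reserved for KV cache and CUDA context):

```python
def layers_on_gpu(vram_gb: float, n_layers: int = 32,
                  layer_gb: float = 0.11, reserved_gb: float = 0.8) -> int:
    """Estimate how many layers of a 4-bit 7B model fit in VRAM;
    the remainder would stay in system RAM. All constants are
    rough assumptions for illustration."""
    usable = max(vram_gb - reserved_gb, 0.0)
    return min(n_layers, int(usable / layer_gb))

print(layers_on_gpu(4.0))   # RTX 2050 4GB: ~29 of 32 layers on GPU
print(layers_on_gpu(12.0))  # RTX 3060 12GB: all 32 layers on GPU
```

Point being: even a 4GB card can hold most of a quantized 7B, with only a few layers spilling to RAM.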
NBVf7m
No.724
>>722
>what do you think about this or your experience on this?
Give me this week; i have a decent setup and i can run these models.
I will try some LLMs and some image gen ones, and if it works out i will write about it. Probably kickstart a general.
AFZxCh
No.725
>>723
alright, should i do it on ubuntu or windows? i have 40 GB left on my ubuntu partition so asking with that in mind.
NBVf7m
No.726
>>725
loonix, cuda is better optimized and more up to date there
NBVf7m
No.727
I have a separate, slightly older setup; if i can run 7bn models on it then ig it will be a breeze for you.
I will test there first. It has the same limitations as yours, like 16gb ram etc. etc., and a graphics card a bit worse than yours.
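Before benchmarking, a rule-of-thumb speed estimate: single-user generation is mostly memory-bandwidth bound, since each token requires one full read of the weights. The bandwidth figures below are assumed ballpark numbers, not specs for any particular card:

```python
def tokens_per_sec(bandwidth_gb_s: float, model_gb: float) -> float:
    """Rough upper bound on generation speed: each generated token
    needs roughly one full pass over the weights, so throughput is
    capped by memory bandwidth / model size (ignores compute,
    KV cache reads, and batching)."""
    return bandwidth_gb_s / model_gb

# 4-bit 7B (~4 GB of weights):
print(round(tokens_per_sec(50, 4.0)))   # CPU, dual-channel DDR4 (~50 GB/s assumed)
print(round(tokens_per_sec(360, 4.0)))  # midrange GPU VRAM (~360 GB/s assumed)
```

That gap is why the anons above keep saying offload as many layers to the GPU as possible.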