/g/ - Technology

Board dedicated to discussions related to everyday technology!


BharatChan Disclaimer

Notice

Before proceeding, please read and understand the following:

1. BharatChan is a user-generated content platform. The site owners do not claim responsibility for posts made by users.

2. By accessing this website, you acknowledge that content may not be suitable for all audiences.

3. You must follow BharatChan’s community guidelines and rules. Failure to do so may result in a ban.

4. By using BharatChan, users agree to the use of cookies, mostly for user session management.

A poster on BharatChan must abide by the following rules:

Sitewide Rules
You must be 18 or older to post.
Sharing personal details or engaging in doxing is strictly prohibited.
Political discussions should be confined to /pol/.
NSFW content is only allowed in /kama/.
Off-topic discussions, thread derailment, or spam may result in a ban and IP blacklist.
Pornographic content is strictly prohibited.
Any activity violating local laws is not allowed.

Recent Posts

just installed Arch Linux bros
Emacs
poast your fav cool websites
Give me one tech tip you know about
Urge to watch Mr. Robot Hindi dub
/cyb/+/psg/: Cyber-Punk/Security & Privacy
soyjak.st forum which hacked 4chin blocks all indi...
Have you discovered female luke smith yet?
Aarambh hai prachand..
OTG vs Microwave
which llm subscription is the best
Tell me all about indiachan/bharatchan lore
How do people who have a job get time to code on s...
Great time to be a /g/enius
Just found out linux foundation has their own free...
the best android browser has arrived!!
My ThinkPad arrived
(((Open)))ai playing new tricks
NEED ADVICE FROM true /g/entooman
Create something interesting with your skills now ...
Gonna make my own 34-key keyboard
which software on PC and mobile you use to handle ...
🦀🦀🦀🦀🦀
C++ Resources
Local Models General
GPT-5
What is shell programming
Libre Thinkpads
Computing
Tech Scams
Thinkpads are the best Laptops
M. tech. thesis suggestion,ideas.
Linux /gen/
Indian related AI discussion
privacy chuds gtfih
Best LLM model for coding and maths.
PuchAI

AI = Actually Indonesians

Anonymous

IN

NBVf7m

No.709

It was pinoys, but same thing tbh.

The CEO is getting charged with defrauding his investors by using humans while claiming it was AI.

Anonymous

IN

AFZxCh

No.710

reply to my thread about running llms locally.

Anonymous

IN

NBVf7m

No.711

>>710

you are not gonna be able to run a 67-billion-parameter model on any laptop (i am not sure about macbooks).

Get an rtx 3060 or 4060 with the highest vram you can manage, and then maybe you have a chance.

However, smaller 7-billion-parameter models can in theory be run.

I don't do it myself; another anon created a thread about it, so find that thread and ask him desu.

On that note, good idea, we should probably start a general dedicated to local models. what do you say sirs?

Start by grabbing bits of info from 4chan /g/ and LocalLLaMA.

On /g/ there's /aicg/ and /lmg/.
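The "67B no, 7B maybe" rule of thumb can be sanity-checked with back-of-the-envelope math. A minimal sketch; the byte-per-parameter figures are the usual ones for fp16 and 4-bit quantization, but the 20% overhead factor for KV cache and activations is an assumption, not a measured number:

```python
def vram_needed_gb(params_billion: float, bytes_per_param: float,
                   overhead: float = 1.2) -> float:
    """Rough VRAM (GB) to hold the weights plus ~20% for KV cache
    and activations (the 1.2 factor is a guess, not a spec)."""
    return params_billion * bytes_per_param * overhead

# fp16 = 2 bytes/param; 4-bit quantization ~= 0.5 bytes/param
print(round(vram_needed_gb(67, 2.0)))  # 67B at fp16: far beyond any laptop
print(round(vram_needed_gb(7, 0.5)))   # 7B at 4-bit: fits a 6-8 GB card
```

Which is roughly why a 3060-class card with a quantized 7B model is the realistic floor, and 67B is out of the question.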

Duck

IN

7zK+rK

No.712

>>710

Just run them in Google Colab. It won't work on a laptop.

Anonymous

IN

AFZxCh

No.713

>>711

>>712

yes man, i think some MLfag here should make a dedicated ML thread with resources and discussion on new models and research papers.

Anonymous

IN

AFZxCh

No.714

>>712

I want to actually use it for daily tasks.

https://vicharak.in/axon

I found this on twitter; it's claimed to run models. if any anon knows, kindly elaborate on what this thing is.

Anonymous

IN

NBVf7m

No.715

>>714

>https://vicharak.in/axon

I have been interested in them for a while; maybe i will get one of their products in the future.

I think starting at 6k or so is not bad.

Duck

IN

7zK+rK

No.716

>>714

>18k rupees

>8gb DDR4

It won't run anything your laptop can't already run.

Duck

IN

7zK+rK

No.717

Seems more like a buffed-up raspberry pi than a device for running AI models.

Anonymous

IN

NBVf7m

No.718

>>716

Not sure about the price, but they have one product which is like a raspberry pi, and it's not their only one.

Anonymous

IN

NBVf7m

No.719

>>717

yeah

Anonymous

IN

NBVf7m

No.720

if you have an rtx 3050, a decent chunk of ram (i have like 64gb) and a decent processor, you can run many llms with around 7bn parameters or so.

maybe some image generation models like stable diffusion etc. too.

Anonymous

IN

AFZxCh

No.721

>>720

well, i have 16GB DDR4, an i5 11th gen and an rtx 2050 with 4GB vram. what can i even do with this? also i think my gpu will work better for anything like this.
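With 4 GB of VRAM a full offload of a 7B model is tight, but llama.cpp-style partial offload (some layers on the GPU, the rest in system RAM) still helps. A hypothetical sketch of the arithmetic; the 32-layer count and the 1 GB reserve for KV cache and scratch space are assumptions about a typical 4-bit 7B GGUF, not specs of any particular model:

```python
def n_gpu_layers(vram_gb: float, model_file_gb: float,
                 total_layers: int = 32, reserve_gb: float = 1.0) -> int:
    """How many transformer layers fit in VRAM after reserving some
    for KV cache/scratch; the rest stay in system RAM on the CPU.
    The result is what you'd pass as llama.cpp's n_gpu_layers."""
    per_layer_gb = model_file_gb / total_layers
    fit = int((vram_gb - reserve_gb) / per_layer_gb)
    return max(0, min(total_layers, fit))

# a 4 GB card with a ~4.1 GB 4-bit 7B file: partial offload
print(n_gpu_layers(4.0, 4.1))
# a 12 GB card takes every layer
print(n_gpu_layers(12.0, 4.1))
```

Even two-thirds of the layers on the GPU usually beats pure CPU inference, so the 2050 is not useless here.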

Anonymous

IN

AFZxCh

No.722

>>720

>Simpler or less detailed responses: While a 10B model can handle many tasks very well, it may sometimes lack the richer, more layered answers that come from a model with hundreds of billions of parameters.

>Reduced performance on very complex or highly technical queries: The distilled model might occasionally struggle with the most challenging reasoning tasks compared to its larger counterpart.

what do you think about this, or what's your experience with it?

Anonymous

IN

NBVf7m

No.723

>>721

try 7bn-parameter models; mistral, deepseek, and meta probably all have those.

iirc deepseek has a 1bn-parameter model but it's mostly useless.

Anonymous

IN

NBVf7m

No.724

>>722

>what do you think about this or your experience on this?

Give me this week; i have a decent setup and can run these models.

I will try some llms and some image gen ones, and if it works out i will write about it. Probably kickstart a general.

Anonymous

IN

AFZxCh

No.725

>>723

alright, should i do it on ubuntu or windows? i only have 40 GB left on ubuntu, that's why i'm asking.
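Disk-wise, 40 GB is plenty for several quantized models. A rough sketch of GGUF file sizes; the bits-per-weight values are approximations for llama.cpp's common quant types, and real files add a few percent of metadata on top:

```python
# approximate bits per weight for common llama.cpp quant types
# (rough values, not exact for every architecture)
BITS_PER_WEIGHT = {"Q4_K_M": 4.85, "Q5_K_M": 5.69, "Q8_0": 8.5, "F16": 16.0}

def gguf_size_gb(params_billion: float, quant: str) -> float:
    """Approximate on-disk size (GB) of a quantized GGUF model file."""
    return params_billion * BITS_PER_WEIGHT[quant] / 8

for q in BITS_PER_WEIGHT:
    print(f"7B {q}: ~{gguf_size_gb(7, q):.1f} GB")
```

So a 4-bit 7B file is around 4 GB, and even fp16 is ~14 GB; 40 GB of free space is not the bottleneck.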

Anonymous

IN

NBVf7m

No.726

>>725

loonix; cuda is better optimized for it and more up to date.

Anonymous

IN

NBVf7m

No.727

I have a separate, slightly older setup; if i can run 7bn models on it then i guess it will be a breeze for you.

I will test there first. It has the same limitations as yours, like 16gb ram etc. etc. Graphics card is a bit worse than yours.