/g/ - Technology

Board dedicated to discussions related to everyday technology!

BharatChan Disclaimer

Notice

Before proceeding, please read and understand the following:

1. BharatChan is a user-generated content platform. The site owners do not claim responsibility for posts made by users.

2. By accessing this website, you acknowledge that content may not be suitable for all audiences.

3. You must follow BharatChan’s community guidelines and rules. Failure to do so may result in a ban.

4. By using BharatChan, users agree to the use of cookies, mostly for user session management.

A poster on BharatChan must abide by the following rules:

Sitewide Rules
You must be 18 or older to post.
Sharing personal details or engaging in doxing is strictly prohibited.
Political discussions should be confined to /pol/.
NSFW content is only allowed in /kama/.
Off-topic discussions, thread derailment, or spam may result in a ban and IP blacklist.
Pornographic content is strictly prohibited.
Any activity violating local laws is not allowed.

Recent Posts

Truth of Computer Programming
Browsing best practices
Picrel is a 56 yr old auntyji from America
Placement
More electronics manufacturing moving to India
Shitty Windows
WebDev general
Seeding paid courses
AI = Actually Indonesians
alternative frontends
Seedhe Maut 🔥
Script request
GOOGLE FIREBASE
Is he telling the truth?
sharing doxxing/hacking tutorial
How to install Arch Linux?
Machine learning with C++
making a glow-proof smartphone
/AI/premiums
Rust
Torrent links general
how to DDoS a site?
College doesn't want me to do shit.
Making a community of people who want to code and ...
Every software is open source if you know assembly
It's over
No Innova-ck
archive.today prolly getting attacked by glowies
why jeets don't seed?
Privacy Tools
Tech support, come over here
Bangladesh: Govt officials, student ‘activists’ ho...
phone under 20k
Privacy tools thread
2025 telephone suggestion
Chinese are absolutely dominating AI

AI = Actually Indonesians

Anonymous ROJR NBVf7m No.709

It was pinoys but same thing tbh.

The CEO is getting charged with defrauding his investors by using humans while claiming it was AI.

Anonymous IN AFZxCh No.710

Reply to my thread about running LLMs locally.

Anonymous ROJR NBVf7m No.711

>>710
You are not going to be able to run a 67-billion-parameter model on any laptop (I am not sure about MacBooks).

Get an RTX 3060 or 4060 with the most VRAM you can manage and then maybe you have a chance.

However, smaller 7-billion-parameter models can in theory be run.

I don't do it myself, but another anon did create a thread about it, so find that thread and ask him desu.

On that note, good idea, we should probably start a general dedicated to local models, what do you say sirs?

Start by grabbing bits of info from 4chan /g/ and LocalLLaMA.

On /g/ there's /aicg/ and /lmg/.
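For anyone who hasn't tried it, a minimal sketch of what "running a 7B locally" looks like in practice, using llama-cpp-python with a quantized GGUF file. The model filename and prompt below are placeholders, and it assumes `pip install llama-cpp-python` plus a GGUF downloaded beforehand; this is just one common route, not the only one.

```python
# Minimal local-inference sketch (assumptions: llama-cpp-python installed,
# a quantized 7B GGUF file already downloaded to the path below).
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder filename
    n_ctx=2048,        # context window
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
)

out = llm("Q: What does the parameter count of an LLM mean?\nA:", max_tokens=128)
print(out["choices"][0]["text"])
```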

Duck IN 7zK+rK No.712

>>710
Just run them in Google Colab. It won't work on a laptop.
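If you go the Colab route, a quick sanity check (assuming the free tier hands you a T4 or similar) to see what GPU and how much VRAM the runtime actually got before you download any weights:

```python
# Check which GPU (if any) the Colab runtime was assigned.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(torch.cuda.get_device_name(0), f"{props.total_memory / 1024**3:.1f} GB VRAM")
else:
    print("No GPU - switch the runtime type to GPU in Colab settings")
```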

Anonymous IN AFZxCh No.713

>>711
>>712
Yes, yaar, I think some MLfag here should make a dedicated ML thread with resources and discussion on new models and research papers.

Anonymous IN AFZxCh No.714

>>712
I want to actually use it for daily tasks.

https://vicharak.in/axon

I found this on twitter saying that it can run models. If any anon knows, kindly elaborate on what this thing is.

Anonymous ROJR NBVf7m No.715

>>714
>https://vicharak.in/axon
I have been interested in them for a while; maybe I will get one of their products in the future.

I think starting at 6k or something is not bad.

Duck IN 7zK+rK No.716

>>714
>18k rupees
>8GB DDR4
It won't run anything your laptop can't already run.
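Rough back-of-the-envelope numbers behind that claim: weight memory is roughly parameter count times bytes per parameter, before counting the KV cache and runtime overhead. A small sketch (my own arithmetic, not anything from the Axon page):

```python
# Weight memory estimate: params x (bits / 8) bytes. Real usage is higher
# because of the KV cache, activations, and runtime overhead.
def weight_gib(params_billion: float, bits: int) -> float:
    return params_billion * 1e9 * (bits / 8) / 1024**3

for p in (1, 7, 67):
    for bits in (16, 8, 4):
        print(f"{p:>2}B @ {bits:>2}-bit ~ {weight_gib(p, bits):5.1f} GiB")
```

So on 8GB DDR4 a 7B only fits heavily quantized, and the 67B mentioned earlier is out of reach either way.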

Duck IN 7zK+rK No.717

Seems more like a buffed-up Raspberry Pi than a device for running AI models.

Anonymous ROJR NBVf7m No.718

>>716
Not sure about the price, but they have one product that's like a Raspberry Pi, though it's not their only one.

Anonymous ROJR NBVf7m No.719

>>717
Yeah.

Anonymous ROJR NBVf7m No.720

If you have an RTX 3050, a decent chunk of RAM (I have like 64GB) and a decent processor, you can run many LLMs with around 7bn parameters or so.

Maybe also some image generation models like Stable Diffusion etc.
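For the image-gen side, a minimal diffusers sketch, assuming a CUDA GPU with roughly 4+ GB of VRAM and the diffusers/torch stack installed; the checkpoint id is just an illustrative SD 1.5 model, swap in whichever one you actually use:

```python
# Minimal Stable Diffusion sketch (assumes diffusers + torch installed, CUDA GPU).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # illustrative SD 1.5 checkpoint id
    torch_dtype=torch.float16,          # half precision to keep VRAM use low
)
pipe = pipe.to("cuda")

image = pipe("an auto rickshaw on the moon, detailed digital art").images[0]
image.save("rickshaw.png")
```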

Anonymous IN AFZxCh No.721

>>720
Well, I have 16GB DDR4, an i5 11th gen and an RTX 2050 with 4GB VRAM. What can I even do with this? Also I think my GPU will work better for anything like this.

Anonymous IN AFZxCh No.722

>>720
>Simpler or less detailed responses: while a 10B model can handle many tasks very well, it may sometimes lack the richer, more layered answers that come from a model with hundreds of billions of parameters.
>Reduced performance on very complex or highly technical queries: the distilled model might occasionally struggle with the most challenging reasoning tasks compared to its larger counterpart.

What do you think about this, or what's your experience with it?

Anonymous ROJR NBVf7m No.723

>>721
Try 7bn-parameter models - Mistral, DeepSeek, Meta all probably have them.

IIRC DeepSeek has a 1bn-parameter model but it's mostly useless.
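On a 4GB card like >>721's, a 7B really only fits quantized. A sketch using transformers with bitsandbytes 4-bit loading; the model id is an assumed example (any 7B instruct model works the same way), and device_map="auto" will spill layers that don't fit in VRAM into system RAM, slower but it runs:

```python
# 4-bit load of a 7B chat model (assumptions: transformers, accelerate and
# bitsandbytes installed; the model id below is just an example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-Instruct-v0.2"

bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb,
    device_map="auto",   # spills what doesn't fit in 4 GB VRAM into system RAM
)

inputs = tok("Explain CUDA in one sentence.", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=60)
print(tok.decode(out[0], skip_special_tokens=True))
```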

Anonymous ROJR NBVf7m No.724

>>722
>what do you think about this, or what's your experience with it?
Give me this week; I have a decent setup and can run these models.

I will try some LLMs and some image-gen ones, and if it works out I will write about it. Probably kickstart a general.

Anonymous IN AFZxCh No.725

>>723
Alright, should I do it on Ubuntu or Windows? I have 40 GB left on Ubuntu, so I'm asking about that.

Anonymous ROJR NBVf7m No.726

>>725
Loonix; CUDA is better optimized and more up to date for it.

Anonymous ROJR NBVf7m No.727

I have a separate, slightly older setup; if I can run 7bn models on it then I guess it will be a breeze for you.

I will test there first. It has the same limitations as yours, like 16GB RAM etc., and a graphics card a bit worse than yours.