Serious thread about AI/LLMs
V1Oo1c
No.469134
Please no idle chatter in this thread, serious answers only
What resources/compute power are realistically needed to train a foundational model?
How many processors, of which type, and at what cost?
How many TBs of RAM, of which type, and at what cost?
How many TBs of GPU memory, of which type, and at what cost?
Give the bare minimum of these resources, plus any other resources needed to train a basic foundational LLM.
Thank you
YAd+pm
No.469135
>>469134(OP)
rKhdSx
No.469136
>>469134(OP)
Why are you posting your sister's friends now, pajeet? Go and post on /g/ instead.
A3x3hv
No.469152
>>469134(OP)
You won't get anything here; go to the local-models threads on /g/ and ask there instead, though they'll just tell you to read the OP guides.
My high level understanding of it is
a) Install le model-running software
b) Download le model (usually tens to hundreds of gigabytes)
c) Attach le model to le runner software and you've got a completely offline ChatGPT
>but saar I want to train AI saars
Training is the more foundational stuff. I'd recommend first reading up on SVMs and implementing a basic neural network; the best resource is AI itself, and look into the TensorFlow tutorials.
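To put rough numbers on the "tens to hundreds of gigabytes" point, download size is roughly parameter count times bits per weight. This is my own back-of-envelope sketch; it ignores quantization block overhead and non-weight tensors, so real files will differ a bit:

```python
# Rough file-size estimate for a quantized model download.
# Assumption (mine): size ~= params * bits_per_weight / 8,
# ignoring quantization block overhead and non-weight tensors.

def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate download size in GB (1 GB = 1e9 bytes)."""
    return params_billion * bits_per_weight / 8

# A 7B model in 16-bit weights is ~14 GB; 4-bit quantization
# cuts that to ~3.5 GB. A 70B model at 16-bit is ~140 GB,
# which is where the "hundreds of gigabytes" figure comes from.
print(model_size_gb(7, 16))   # -> 14.0
print(model_size_gb(7, 4))    # -> 3.5
print(model_size_gb(70, 16))  # -> 140.0
```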
LIGE4M
No.469161
xOLz7k
No.469174
>>469134(OP)
Post in /infra/ if you want a serious thread. You will get nothing on /b/.


NIey5E
No.469178
>>469134(OP)
Some IITian trained a 5-billion-parameter LLM for 1 lakh rupees, on 8 H200 GPUs. You can't train them on your local system, but you can fine-tune them. I recently fine-tuned a model on my own dataset; I have 12 GB VRAM and 32 GB RAM.
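As a sanity check on that claim, the standard back-of-envelope for training compute is FLOPs ≈ 6 × N × D, for N parameters and D training tokens. The token count and per-GPU effective throughput below are my assumptions, not figures from that project:

```python
# Back-of-envelope training compute via the ~6*N*D FLOPs rule.
# Assumptions (mine): 5B params, 100B training tokens, and
# ~400 TFLOP/s effective throughput per H200.

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs (6 * N * D rule of thumb)."""
    return 6 * n_params * n_tokens

def gpu_hours(total_flops: float, flops_per_gpu_sec: float) -> float:
    """GPU-hours needed at a given sustained per-GPU throughput."""
    return total_flops / flops_per_gpu_sec / 3600

flops = training_flops(5e9, 100e9)  # 3e21 FLOPs total
hours = gpu_hours(flops, 400e12)    # ~2083 GPU-hours
print(flops, hours / 8)             # on 8 GPUs: ~260 wall-clock hours
```

Under these assumptions that's roughly 11 days on 8 GPUs, which is at least the right order of magnitude for a small from-scratch run.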


NIey5E
No.469180
>>469134(OP)
Have at least 2 lakh rupees. Get the code base and the dataset ready, have a good amount of SSD storage, then rent multiple GPUs on RunPod and run the training, possibly for days. It's a tedious task.
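The rental math above can be sketched directly. The hourly rate and the USD-to-INR rate here are placeholder assumptions, not quoted prices; check RunPod's current pricing:

```python
# Rough cloud-GPU rental cost. The $/GPU-hour rate and the
# USD->INR rate are placeholder assumptions, not quoted prices.

def rental_cost_inr(n_gpus: int, hours: float,
                    usd_per_gpu_hour: float, inr_per_usd: float) -> float:
    """Total rental cost in rupees for a multi-GPU run."""
    return n_gpus * hours * usd_per_gpu_hour * inr_per_usd

# e.g. 8 GPUs for 3 days (72 h) at an assumed $3/GPU-hour and Rs 83/USD:
cost = rental_cost_inr(8, 72, 3.0, 83.0)
print(round(cost))  # -> 143424, i.e. ~1.4 lakh under these assumptions
```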


NIey5E
No.469195
>>469134(OP)
For example, basic LoRA training at float8 precision takes 24 hours on my system. Training a Swin transformer from scratch takes 3 days.
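For context on why LoRA is cheap enough for a single-GPU box: it trains only small low-rank adapter matrices instead of the full weights. A sketch of the arithmetic; the layer dimensions here are illustrative, typical of a ~7B model's attention projections, not from any specific model:

```python
# Trainable-parameter count for a LoRA adapter on one weight
# matrix W (d_out x d_in): LoRA learns B (d_out x r) and
# A (r x d_in), so it adds r * (d_in + d_out) parameters.

def lora_trainable(d_in: int, d_out: int, r: int) -> int:
    """Parameters added by one rank-r LoRA adapter."""
    return r * (d_in + d_out)

full = 4096 * 4096                      # full 4096x4096 projection: ~16.8M
adapter = lora_trainable(4096, 4096, 8) # rank-8 adapter: 65536
print(full, adapter, adapter / full)    # -> ~0.4% of the layer is trained
```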
V1Oo1c
No.469198
>>469152
Thank you for the reply. I wanted to know how much compute power and what resources are needed to train a foundational model, not that I want to train one myself.
All this AI jazz got me curious about the question, since
>0 foundational models
keeps getting posted so often
Don't the WITCH companies or the gormint have the money for those resources?
>>469161
Share the links, saar
>>469174
I will post on /b/, nigger, because that's where the anons are, and the relevant replies will be there amongst the shitposts; all the other boards are dead. Also this is more /g/ than /infra/, tbh.
>>469178
>>469180
Interesting, what did he train though?
>>469135
>>469136
oh, you rubes


NIey5E
No.469261
>>469198
https://www.reddit.com/r/LocalLLaMA/s/x1kPd9nUpg
Read more about it here; also check his GitHub repo.
xOLz7k
No.469262
>>469198
Go ahead with that, fag. Watch this thread either get derailed or die.
V1Oo1c
No.469264
>>469261
thanks
why are you banned though? you're one of the anons who make good contributions and are knowledgeable
A3x3hv
No.469276
>>469198
>compute power and resources
I tried running it on my Rs 50,000 laptop. It took half an hour to reply.
v521WR
No.469282
>>469134(OP)