
 Full DeepSeek R1 At Home 🥳🥳🥳

ipohps3
post Jan 28 2025, 01:21 PM, updated 11 months ago

Regular
******
Senior Member
1,974 posts

Joined: Dec 2011





This post has been edited by ipohps3: Jan 28 2025, 01:22 PM
ipohps3
post Jan 28 2025, 01:22 PM

Regular
******
Senior Member
1,974 posts

Joined: Dec 2011



moooore memory...
zerorating
post Jan 28 2025, 01:22 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


Slow la. Better to just use a proper graphics card instead.
ipohps3
post Jan 28 2025, 01:25 PM

Regular
******
Senior Member
1,974 posts

Joined: Dec 2011



QUOTE(zerorating @ Jan 28 2025, 01:22 PM)
Slow la. Better to just use a proper graphics card instead.
*
Wonder if it's using the NPU, GPU, or CPU in the M4 Mac Mini. If it's the NPU, then no dedicated GPU is needed. The M4 Mac Mini has a GPU inside too; the main thing is memory.
Juan86
post Jan 28 2025, 01:26 PM

On my way
****
Junior Member
651 posts

Joined: Mar 2009
I have 3 GPUs on the floor.

Can I set up R1?
zerorating
post Jan 28 2025, 01:29 PM

Miskin Adab
*****
Senior Member
975 posts

Joined: Aug 2007
From: Lokap Polis


QUOTE(ipohps3 @ Jan 28 2025, 01:25 PM)
Wonder if it's using the NPU, GPU, or CPU in the M4 Mac Mini. If it's the NPU, then no dedicated GPU is needed. The M4 Mac Mini has a GPU inside too; the main thing is memory.
*
Most likely the GPU. They would need extra work to get the NPU doing the heavy lifting. Also, the NPU is optimized for performance per watt, so it won't be faster than a GPU with AI accelerators.
smallcrab
post Jan 28 2025, 02:06 PM

Getting Started
**
Junior Member
140 posts

Joined: Jul 2007
From: Puchong


Who needs that kind of setup?
Graphic designers?
zoozul
post Jan 28 2025, 02:09 PM

Getting Started
**
Junior Member
98 posts

Joined: Jan 2023
Even a rubbish AI gets hype.
ihm11
post Jan 28 2025, 02:10 PM

Getting Started
**
Junior Member
62 posts

Joined: Apr 2018
This one is localised, no internet needed.

I already downloaded the Android version.

Not bad; the chain of thought and reasoning are all listed for you to see.
ipohps3
post Jan 28 2025, 02:14 PM

Regular
******
Senior Member
1,974 posts

Joined: Dec 2011



QUOTE(ihm11 @ Jan 28 2025, 02:10 PM)
This one is localised, no internet needed.

I already downloaded the Android version.

Not bad; the chain of thought and reasoning are all listed for you to see.
*
Can you register a new account?

Also, it's only possible to run without internet if it's open-source.
kingkingyyk
post Jan 28 2025, 02:19 PM

10k Club
Group Icon
Elite
15,694 posts

Joined: Mar 2008
QUOTE(ipohps3 @ Jan 28 2025, 02:14 PM)
Can you register a new account?

Also, it's only possible to run without internet if it's open-source.
*
Note that with the open-source release, what you can realistically run is the distilled version, not the full one they host in the cloud. You need a crazy amount of fast GPUs to make the full version workable. Have fun heating your room. biggrin.gif

This post has been edited by kingkingyyk: Jan 28 2025, 02:22 PM
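
A rough back-of-envelope sketch of why the full model is so demanding (weights only; the 671B and 8B parameter counts come from the R1 release, and KV cache / runtime overhead are not counted):

# Approximate memory needed just to hold the model weights, in GB.
def weights_gb(params_billion, bits_per_param):
    return params_billion * 1e9 * bits_per_param / 8 / 1e9

for name, size in [("full R1, 671B", 671), ("distilled 8B", 8)]:
    print(f"{name}: fp16 ~{weights_gb(size, 16):.0f} GB, "
          f"8-bit ~{weights_gb(size, 8):.0f} GB, "
          f"4-bit ~{weights_gb(size, 4):.0f} GB")

Even at 4-bit quantisation the full 671B model is roughly 336 GB of weights, which is why most home setups end up running the distills.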
kurtkob78
post Jan 28 2025, 02:21 PM

Do your best
*******
Senior Member
3,833 posts

Joined: Oct 2006
From: Shah Alam


Better to just pay for ChatGPT Plus. Only USD 20 per month. lol
whynotpink
post Jan 28 2025, 02:22 PM

New Member
*
Junior Member
0 posts

Joined: Apr 2022
I am running a RM50 million business with just my M1 MacBook Air.
emefbiemef
post Jan 28 2025, 02:23 PM

Getting Started
**
Junior Member
112 posts

Joined: Aug 2006
QUOTE(kingkingyyk @ Jan 28 2025, 02:19 PM)
Note that with the open-source release, what you can realistically run is the distilled version, not the full one they host in the cloud. You need a crazy amount of fast GPUs to make the full version workable. Have fun heating your room. biggrin.gif
*
what's the difference, preciousss?
azarimy
post Jan 28 2025, 02:23 PM

mister architect: the arrogant pr*ck
Group Icon
Elite
10,672 posts

Joined: Jul 2005
From: shah alam - skudai - shah alam


What does it do? What's the difference compared to running the one online?
kingkingyyk
post Jan 28 2025, 02:23 PM

10k Club
Group Icon
Elite
15,694 posts

Joined: Mar 2008
QUOTE(kurtkob78 @ Jan 28 2025, 02:21 PM)
Better to just pay for ChatGPT Plus. Only USD 20 per month. lol
*
If you already have some big-VRAM GPUs, an open-source LLM lets you use that computing power at no cost (well, you could argue there is still the TNB bill). smile.gif
hellothere131495
post Jan 28 2025, 02:26 PM

Casual
***
Junior Member
473 posts

Joined: Sep 2019
QUOTE(kingkingyyk @ Jan 28 2025, 02:19 PM)
Note that with the open-source release, what you can realistically run is the distilled version, not the full one they host in the cloud. You need a crazy amount of fast GPUs to make the full version workable. Have fun heating your room. biggrin.gif
*
Bruh. The full model is also open source:
https://github.com/deepseek-ai/DeepSeek-R1

Ollama too:
https://ollama.com/library/deepseek-r1

Edit:
lol you edited

This post has been edited by hellothere131495: Jan 28 2025, 02:27 PM
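
If you pull one of the distilled tags from that Ollama page, here is a minimal sketch of chatting with it through Ollama's local HTTP API (this assumes the default port 11434 and that "ollama pull deepseek-r1:8b" has already been run; the 8b tag is just an example):

import json
import urllib.request

# Ask a locally running Ollama server for a single, non-streamed reply.
payload = {
    "model": "deepseek-r1:8b",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())

# The distilled R1 models put their chain of thought in a <think> block
# at the start of the content, followed by the final answer.
print(reply["message"]["content"])

The same request works for any other tag on that page, e.g. deepseek-r1:1.5b or deepseek-r1:32b, as long as it has been pulled first.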
GagalLand
post Jan 28 2025, 02:29 PM

Getting Started
**
Junior Member
239 posts

Joined: May 2022

Obviously you don't know AI at all.

If DeepSeek were rubbish,

all the AI-related tech giants wouldn't be panicking one after another.

QUOTE(zoozul @ Jan 28 2025, 02:09 PM)
Even a rubbish AI gets hype.
*
jmas
post Jan 28 2025, 02:30 PM

I can edit title???
*****
Junior Member
830 posts

Joined: Mar 2010
Running the 1.5B q4_k_m locally on my NAS.
Slow af, but it works, so I don't have to deal with o1's free-tier limit.
hellothere131495
post Jan 28 2025, 02:30 PM

Casual
***
Junior Member
473 posts

Joined: Sep 2019
QUOTE(azarimy @ Jan 28 2025, 02:23 PM)
What does it do? What's the difference compared to running the one online?
*
Same. It's the same model. Just that you probably won't have the chance to run the online one: that's the full 671B-parameter model. What you can run are the distilled (and quantized) versions, like the Qwen and Llama 3.1 8B ones that have been distilled to respond like the original DeepSeek-R1.
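
For anyone who isn't sure which variant they actually ended up pulling, a small sketch that asks a locally running Ollama server what it has on disk (again assuming the default port 11434):

import json
import urllib.request

# List the models the local Ollama server has already pulled.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    data = json.loads(resp.read())

for model in data.get("models", []):
    print(f'{model["name"]}: ~{model["size"] / 1e9:.1f} GB on disk')

The tag name makes it obvious whether you grabbed the full 671b build or one of the much smaller distills.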
