 ChatGPT Pro, $200 per month for o1 Pro

DogeGamingPRO
post Jan 13 2025, 03:16 PM

QUOTE(khusyairi @ Jan 13 2025, 02:59 PM)
I wanted an AI that's smarter than me.
Unfortunately, I was wrong and had too high expectations for ChatGPT.

To be frank, Wikipedia will give much better info than ChatGPT.
Maybe my question is too complex for ChatGPT to answer, or ChatGPT doesn't have enough historical data to support it (maybe the system is too young).
*
The response will only be as good as the prompt: shit prompt = shit response.
Especially true for o1.

Most people don't use it effectively. It's true that it isn't foolproof yet; you can't just ask anything and expect a good response.
But the people who are already using it effectively now will get way ahead of the others.

If you want to use it like a search engine, better to use Perplexity. That is more built for the purpose.
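Rough sketch of what I mean by a better prompt, for anyone who wants a concrete example. This is just the official openai Python client with gpt-4o as a stand-in model name; it assumes OPENAI_API_KEY is set, and the prompt text is made up for illustration.

CODE
# Minimal sketch: a lazy prompt vs a structured prompt, sent through the
# official OpenAI Python client. Model name and prompt text are only
# illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Lazy prompt: vague, no context, no constraints.
lazy = "Tell me about the fall of Malacca."

# Structured prompt: role, scope, output format, and what to do when unsure.
detailed = (
    "You are a history tutor. Summarise the fall of the Malacca Sultanate in 1511: "
    "list the key causes, the main actors, and the immediate aftermath, in at most "
    "five bullet points. If you are unsure of a date or name, say so instead of guessing."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; swap in "o1" if your account has access
    messages=[{"role": "user", "content": detailed}],
)
print(response.choices[0].message.content)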

This post has been edited by DogeGamingPRO: Jan 13 2025, 03:18 PM
DogeGamingPRO
post Jan 13 2025, 03:32 PM

QUOTE(yvliew @ Jan 13 2025, 03:25 PM)
Is there any LLM model as good as ChatGPT that I can download and run locally? As long as the model is not too big.
*
A decent model to run locally is Phi-4, 14B params; it was only recently released.
Just run it using Ollama, very easy.
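If anyone wants to try, this is roughly all it takes once Ollama is installed. The sketch assumes you've already pulled the model (ollama pull phi4) and that the Ollama server is running on its default port 11434; the prompt is just an example.

CODE
# Minimal sketch: ask a locally running Ollama server for a completion
# over its REST API. Assumes "ollama pull phi4" was done beforehand and
# the server is listening on the default port 11434.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi4",   # Microsoft's 14B Phi-4 model
        "prompt": "Explain in two sentences why smaller models run well locally.",
        "stream": False,   # return one JSON object instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])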

This post has been edited by DogeGamingPRO: Jan 13 2025, 03:34 PM
DogeGamingPRO
post Jan 13 2025, 03:39 PM

QUOTE(yvliew @ Jan 13 2025, 03:36 PM)
What's the difference between this and a free basic user account? Have you tried it? Can you use GPT live chat? Last time I tried speaking Cantonese, but it only allowed a limited number of requests. Is this one unlimited or not?
*
A Plus account can use voice mode and now camera mode, which is probably the live conversation feature you're talking about, plus access to o1.
That's the $20 tier; people do share accounts, but it has limits.

Unlimited use needs the $200 tier; not sure if anyone is sharing that yet.

DogeGamingPRO
post Jan 13 2025, 04:18 PM

QUOTE(ipohps3 @ Jan 13 2025, 04:08 PM)
Which GPU with enough RAM is the host using?
*
Running 7B or 14B models locally isn't too demanding; the hardware just determines whether you get slower or faster tokens per second.
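For anyone sizing a GPU, here's a back-of-envelope sketch. The bytes-per-parameter figures are rough assumptions for common quantisation levels, and the 20% overhead for the KV cache and activations is a guess, not a measurement.

CODE
# Rough VRAM estimate: parameters x bytes per parameter, plus ~20% headroom
# for the KV cache and activations. Ballpark assumptions, not exact numbers.

BYTES_PER_PARAM = {
    "fp16": 2.0,  # half-precision weights
    "q8": 1.0,    # 8-bit quantisation
    "q4": 0.5,    # 4-bit quantisation (a common local-run default)
}

def vram_gb(params_billion: float, quant: str, overhead: float = 1.2) -> float:
    """Approximate GPU memory needed, in GB."""
    return params_billion * BYTES_PER_PARAM[quant] * overhead

for size in (7, 14):
    for quant in ("q4", "q8", "fp16"):
        print(f"{size}B @ {quant}: ~{vram_gb(size, quant):.1f} GB")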


 
