> Ollama - Offline Generative AI, Similar to ChatGPT
TSxxboxx
post Apr 26 2024, 10:10 PM, updated 2 months ago

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


I searched but there doesn't seem to be any thread about Ollama, and nobody seems to be discussing it either. I think this is the correct sub-forum for it.

Anyway, I've been dabbling with Ollama for almost a month, ever since they released the Windows installer. At first I used it in CMD / Terminal; mistral and llama2 seemed like the best models at the time. Then I learned about pairing Ollama with PrivateGPT so it could ingest my own documents.

Now, a month later, Meta has released llama3 and Microsoft has released phi3, and the models have become noticeably smarter. I'm now using Open WebUI instead of PrivateGPT, as it looks a lot more stable and makes it easy to switch models.

I tried giving this puzzle to the models; if you've seen Alice in Borderland you'll know the answer. Only one gave an impressively correct answer.
QUOTE
There are three on/off switches on the ground floor of a building. Only one operates a single lightbulb on the third floor. The other two switches are not connected to anything. Put the switches in any on/off order you like. Then go to the third floor to check the bulb. Without leaving the third floor, can you figure out which switch is genuine? You get only one try.


mistral:
QUOTE
To determine which switch controls the lightbulb on the third floor with just one try, we can apply a method called "process of elimination." Here's a strategy to follow:

1. Test the switches in a specific order and record the results. A common approach is to test Switch 1 first, then Switch 2, and finally Switch 3.
2. After testing the switches, observe the current state of the lightbulb on the third floor. Based on this information, eliminate the switches that did not correspond to the bulb's state when they were turned on.
3. Now, you are left with the switch(es) that could control the lightbulb. Since there is only one genuine switch, the remaining switch(es) must be the one you are looking for.

Here's an example of how this process might work:

- Test Switch 1: If the bulb turns on when Switch 1 is flipped, then it is the genuine switch. If not, proceed to the next step.
- Test Switch 2: If the bulb turns on when Switch 2 is flipped, then Switch 1 is a decoy and Switch 2 is the genuine switch. If not, proceed to the next step.
- Test Switch 3: If the bulb turns on when Switch 3 is flipped, then Switch 3 is the genuine switch.

So in summary, by testing the switches in a specific order and observing the lightbulb's state, you can figure out which switch is genuine with just one try.


llama3
QUOTE
A classic puzzle!
Let's think about it step by step.

First, let's assume that all three switches are on. We'll then go to the third floor and turn off the lightbulb. Now, we're back on the ground floor. Since we don't know which switch controls the lightbulb, we can't just flip a switch and hope for the best. Instead, let's think about what would happen if we turned each switch to different positions.

If we turn two switches off and one on, the bulb will stay off (because it only responds to one of our three switches). If we turn all three switches off, the bulb will still be off. This tells us that at least one of the other two switches must control the lightbulb.

Now, let's consider what happens if we turn two switches on and one off. The bulb will stay off (because it only responds to one of our three switches). But here's the crucial point: when we flip just one switch back and forth between "on" and "off," the bulb will change state only once. This tells us that this one switch must control the lightbulb.

So, with a single try, we can figure out which switch is genuine by flipping two switches off and one on, then flipping just one of those three switches back and forth between "on" and "off,".


phi3
QUOTE
To determine which switch controls the lightbulb without returning to the ground floor, perform these steps:

1. Turn on switch 1 and leave it on for a few minutes.
2. After waiting, turn off switch 1 and turn on switch 2 (you can even leave it turned on or just turn it after turning off switch 1).
3. Proceed to the third floor with lightbulb in sight.
4. Observe the state of the bulb:
  - If the bulb is lit, then switch 2 controls the lightbulb (because you turned it on and left it on when going upstairs).
  - If the bulb is off but warm to touch, then switch 1 was the last one that was on before being switched off. This indicates the first switch because if the second or third switch were in use, the bulb would still be lit after the power from those switches was cut (since they weren't left on for a while).
  - If the bulb is off and cool to touch, then either none of the switches are connected to the lightbulb or all three switches control it. However, since we know that only one does, this scenario would indicate an error in our assumptions (or there's no connection at all).

This method allows you to identify which switch controls the bulb with a single attempt and without returning to check other switches.


Anyone else playing around with Ollama?
iZuDeeN
post May 11 2024, 01:02 AM

Look at all my stars!!
*******
Senior Member
3,381 posts

Joined: Jan 2003
From: PJ, Malaysia
Guys, can someone give me a newbie guide to installing Ollama with the Llama3 model on a Windows machine?

I tried using Docker on my Synology; the setup went perfectly, but it took forever to generate a response.

On PC I would be running it on a 5600X with an RTX 3080 and 32GB RAM.
TSxxboxx
post May 13 2024, 04:55 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(iZuDeeN @ May 11 2024, 01:02 AM)
Guys, can someone give me a newbie guide to installing Ollama with the Llama3 model on a Windows machine?

I tried using Docker on my Synology; the setup went perfectly, but it took forever to generate a response.

On PC I would be running it on a 5600X with an RTX 3080 and 32GB RAM.
*
Even on my 8-core laptop it is slow to generate responses compared to my computer with a GPU. A NAS CPU is just not powerful enough to run AI.

Ollama can be installed easily on Windows with its setup file. Go to ollama.com and download the Windows installer. After installing, open Command Prompt or PowerShell (you might need to run it as admin) and type "ollama run <insert model name>".
It will then download the model. Better to get llama3:instruct, as it has been tweaked to give better answers. Then you can type anything to ask the AI. phi3:instruct is another good model to try.
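
For reference, a typical first session in PowerShell looks something like this (the model names are just examples; use whichever ones you want):
CODE
# first run downloads the model, then drops you into an interactive chat
ollama run llama3:instruct

# see which models you already have on disk
ollama list

Type /bye to leave the chat and get back to the prompt.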

If you want a UI (user interface), install this:
https://github.com/open-webui/open-webui

I recommend using the Docker method. You still need to install Ollama; this just creates the web interface and also handles RAG. After installing Docker, just run this in the command prompt:
CODE
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main


You can attach a text document or insert a link and ask questions about it. With a model such as llava you can also attach a picture and ask about it, but this is still in its infancy.
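
Side note: vision models can also be tested straight from the terminal without the web UI by putting an image path in the prompt, something like this (the file name is just a placeholder):
CODE
ollama run llava "Describe this image: ./photo.jpg"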

Once done, you can access it at http://localhost:3000
With some configuration of your computer's firewall, you can also access the web UI from any device connected to the same local network; your computer becomes the AI server for your household.
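
If you want to try that, the rule on Windows is something along these lines, run from an admin prompt (the rule name is arbitrary, and 3000 is the host port from the docker command above):
CODE
netsh advfirewall firewall add rule name="Open WebUI" dir=in action=allow protocol=TCP localport=3000

Then other devices on the LAN can browse to http://<your-pc-ip>:3000.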
iZuDeeN
post May 13 2024, 09:47 PM

Look at all my stars!!
*******
Senior Member
3,381 posts

Joined: Jan 2003
From: PJ, Malaysia
QUOTE(xxboxx @ May 13 2024, 04:55 PM)
Even on my 8-core laptop it is slow to generate responses compared to my computer with a GPU. A NAS CPU is just not powerful enough to run AI.

Ollama can be installed easily on Windows with its setup file. Go to ollama.com and download the Windows installer. After installing, open Command Prompt or PowerShell (you might need to run it as admin) and type "ollama run <insert model name>".
It will then download the model. Better to get llama3:instruct, as it has been tweaked to give better answers. Then you can type anything to ask the AI. phi3:instruct is another good model to try.

If you want a UI (user interface), install this:
https://github.com/open-webui/open-webui

I recommend using the Docker method. You still need to install Ollama; this just creates the web interface and also handles RAG. After installing Docker, just run this in the command prompt:
CODE
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main


You can attach a text document or insert a link and ask questions about it. With a model such as llava you can also attach a picture and ask about it, but this is still in its infancy.

Once done, you can access it at http://localhost:3000
With some configuration of your computer's firewall, you can also access the web UI from any device connected to the same local network; your computer becomes the AI server for your household.
*
I managed to install it, but the problem is I can't download anything from the Hugging Face repo; it seems like invalid credentials or something. In Docker Desktop I also can't see any images, just a blank result when I search.

I want to use an uncensored llama3 model.
TSxxboxx
post May 13 2024, 10:28 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(iZuDeeN @ May 13 2024, 09:47 PM)
I managed to install it, but the problem is I can't download anything from the Hugging Face repo; it seems like invalid credentials or something. In Docker Desktop I also can't see any images, just a blank result when I search.

I want to use an uncensored llama3 model.
*
You need to have a Hugging Face account to download. Some models require you to request permission, and only after it's granted can you see the downloads.

Which one do you want to download? Llama-3-8B-Lexi-Uncensored?
Go to Files and versions, then click the download icon next to the gguf file name.
user posted image

Uncensored versions basically just allow the AI to answer unethical or criminal questions. I've never tried using a gguf file with Ollama; I just find models on Ollama's own website and use the pull command. Usually, when a Hugging Face model becomes famous enough, someone will upload it to Ollama. Now I'm downloading llava-llama3 to see if it can recognize images better than llava.
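
Pulling from the Ollama library is just a couple of commands in PowerShell, for example (llava-llama3 is simply the model mentioned above; any name from ollama.com works the same way):
CODE
# download the model without starting a chat
ollama pull llava-llama3

# then try it out
ollama run llava-llama3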
TSxxboxx
post May 13 2024, 11:26 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


llava-llama3 is impressive. I asked it to describe this picture:

user posted image

It answered:
QUOTE
In the image, a lion-like creature is seated at a wooden desk in what appears to be a forest setting. The creature's fur is a light brown color, and it has long hair that cascades down its back. Its eyes are a striking blue, and it wears a serious expression on its face as it gazes directly into the camera.


The desk on which the creature sits is made of wood and is positioned in front of a backdrop of trees and foliage. The forest setting suggests an outdoor environment, possibly a clearing or a secluded spot within the woods.


On the desk, there's a laptop computer open to the screen. The laptop is silver in color, contrasting with the natural tones of the surrounding environment. The creature seems to be interacting with the laptop, perhaps using it for work or communication.


There are no discernible texts or other objects in the image. The relative position of the objects suggests that the lion-like creature is the main subject of this image, with the forest setting and the laptop serving as supporting elements to frame the scene. The image does not provide any information about the actions of the objects beyond what has been described.

ericcheng2021
post May 16 2024, 04:19 PM

Getting Started
**
Junior Member
174 posts

Joined: Mar 2021


Ollama is now very easy to run on Windows, and downloading models is easy too.
Trying llama2 uncensored to generate some, ahem, stories; it's really good at it.
TSxxboxx
post May 17 2024, 12:43 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(ericcheng2021 @ May 16 2024, 04:19 PM)
Ollama is now very easy to run on Windows, and downloading models is easy too.
Trying llama2 uncensored to generate some, ahem, stories; it's really good at it.
*
I tried llama3-chatqa; it was very fast to give answers.
ericcheng2021
post May 17 2024, 07:11 PM

Getting Started
**
Junior Member
174 posts

Joined: Mar 2021


QUOTE(xxboxx @ May 17 2024, 12:43 PM)
I tried llama3-chatqa; it was very fast to give answers.
*
What GPU are you using?
TSxxboxx
post May 17 2024, 09:43 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(ericcheng2021 @ May 17 2024, 07:11 PM)
What GPU are you using?
*
I'm using a 3080.
chow1942
post Jul 9 2024, 08:01 AM

Getting Started
**
Junior Member
279 posts

Joined: Dec 2006
I've tried running it on an old laptop; only with the tinyllama model is the waiting time acceptable lmao
SUSMilfuntastic
post Jul 25 2024, 05:24 AM

Real man help each other not SUS one another
****
Junior Member
559 posts

Joined: Dec 2022
From: Chyna builds kingdom instead of BS about freedom

Interesting share tq
TSxxboxx
post Aug 10 2024, 10:21 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


Trying out Flux. The quality is 👍
user posted image

But it needs lots of RAM, and VRAM too, unless you run the schnell model, which limits VRAM use to 8GB.
user posted image
chow1942
post Aug 13 2024, 08:36 PM

Getting Started
**
Junior Member
279 posts

Joined: Dec 2006
Eh, Flux? In Ollama?
Probably not with my old hardware without a graphics card anyway tongue.gif
Phi3 just got updated; acceptable speed for some easy questions
TSxxboxx
post Aug 15 2024, 08:46 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(chow1942 @ Aug 13 2024, 08:36 PM)
Eh, Flux? In Ollama?
Probably not with my old hardware without a graphics card anyway tongue.gif
Phi3 just got updated; acceptable speed for some easy questions
*
Of course not. I was just too lazy to open a new thread.

Phi3 recently got an update? The last update I saw was 2 weeks ago.

Phi3 doesn't feel on par with the other LLMs, especially the latest Mistral and Llama3.1.
Phi3:14b, on the other hand, is comparable.

One of the usual questions to ask an LLM: Sally is a girl. She has three brothers. Each of her brothers has the same two sisters. How many sisters does Sally have?

Phi3 goes into rambling mode, but still gets it wrong:
QUOTE
If we are to assume that each brother indeed shares the same two sisters, and since it's established in the problem statement that there'deeply exists at least one girl (Sally), then logically speaking these two must be her only siblings. Therefore, aside from herself being counted among them as well, Sally does not have any other sisters besides possibly twins she shares with each of her brothers. However, this scenario seems a bit unlikely given the context and wording that suggests individuality amongst family members (as it typically would be in such problems). Thus another plausible interpretation is there's no additional girl aside from Sally—a common structure for sibling scenarios when not specified otherwise. In summary:

- She has 1 sister, herself.

Sally only counts as a sister to each of her brothers; therefore, she has zero other sisters besides possibly the twin or half-twin scenario which is commonly understood but was unnecessary here given that Sally's existence makes it clear there are no more siblings in this family setup than just herself and three male children.


While the others get it right.

phi3:14b assumes it's a twin:
QUOTE
Sally only has one sister, herself. Since each brother has two sisters and we know that she is one of them (as it's stated 'She', referring to Sally), then there must be just one other girl in the family - her own twin. So, they are three brothers and two girls in total.


mistral-nemo is straight to the point:
QUOTE
Let's think through this logically:

1. We know that Sally has three brothers.
2. Each of her brothers has two sisters.
3. Since they all share these two sisters, those must be Sally herself and another sister.

So, Sally has one sister.

Final answer: Sally has one sister.


llama3.1
QUOTE
The question states that each of Sally's brothers "has the same two sisters". This means Sally must be one of those two sisters for each brother.

So, Sally has 3 brothers, and each brother has the same two sisters, which includes Sally herself. That means there is another girl who is also a sister to all three brothers.

Therefore, Sally has only ONE other sister in addition to herself. The answer is 2.

chow1942
post Aug 16 2024, 08:14 PM

Getting Started
**
Junior Member
279 posts

Joined: Dec 2006
qwen2:latest
QUOTE
Given the riddle, "Sally is a girl. She has three brothers. Each of her brothers has the same two sisters. How many sisters does Sally have?"


The answer to this riddle is:


Sally has two sisters.


Here's why: Each of Sally's brothers has the same two sisters. Since there are three brothers, you might initially think that each sister is counted multiple times, but in reality, it means Sally and one other sister together are counted as the "same two sisters" for all her brothers. So, Sally counts once for herself and then there is another sister who is also counted alongside her. That's a total of two sisters for Sally to have.
It makes me start to think all the Chinese AI models have fallen behind or something.

deepseek-v2:latest
QUOTE
The riddle you're presenting seems like it might be intended as an optical illusion or play on words, rather than a straightforward problem with a literal interpretation. It appears to hint at something more complex about sibling relationships and how we categorize them (siblings versus half-siblings).


Given the standard understanding of siblings in most families, Sally would have one sister and three half-sisters if her parents each had another child apart from her. However, without additional context or clarification on whether Sally is also part of this sibling group or should be considered separately, it's difficult to provide a definitive answer based strictly on the information given.


If you're looking for a more straightforward interpretation and considering only siblings in the traditional sense (those with the same mother and father), then Sally would have one sister because she is not part of the group described as having two sisters each by virtue of sharing just one parent—that description seems to suggest that there are four children total, not three.


So based on this interpretation, Sally has:
1 sister (herself) + 3 half-sisters = Total of 4 siblings.


and this one is hallucinating lol

TSxxboxx
post Aug 16 2024, 09:26 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


QUOTE(chow1942 @ Aug 16 2024, 08:14 PM)
qwen2:latest
It makes me start to think all the Chinese AI models have fallen behind or something.

deepseek-v2:latest
and this one is hallucinating lol
*
qwen2:latest still got it right, but it wrongly assumed a human would think the 3 brothers means each sister is counted multiple times.

deepseek-v2:latest lol, it seems like it's a few generations behind.
c2tony
post Aug 21 2024, 07:57 PM

Getting Started
**
Junior Member
50 posts

Joined: Jan 2003
From: BM, Butterworth, Penang island.


Phi3 has been updated to Phi3.5, and it gets it right this time:

QUOTE
Since all three of Sally's brothers share the same two sisters, it implies that these are also her siblings because in a family unit with multiple children like this one (including both male and female), there is only one set of sisters for each brother. Therefore, despite having three brothers, Sally has just one sister—the fact they all have "two" sisters at common refers to the same individual who counts once per sibling relationship in a family with multiple children sharing identical pairs among themselves. So, Sally indeed only has one biological sister.
chow1942
post Aug 30 2024, 08:17 PM

Getting Started
**
Junior Member
279 posts

Joined: Dec 2006
Well, I cannot get it to make the right flag, the right tower, or even the right cat.

user posted image

TSxxboxx
post Sep 1 2024, 09:35 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


Probably Flux doesn't reproduce them exactly, or it doesn't have enough training data to be accurate. Maybe if you train some LoRAs you can then use them to generate an accurate flag and tower.

For fun it's good enough. This one I made with an anthroids LoRA:
user posted image
TSxxboxx
post Sep 13 2024, 12:35 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


For vision AI, Qwen2-VL is very impressive!! This is the output for the Merdeka picture above:
QUOTE
This picture appears to be celebrating Hari Merdeka, which is Malaysia's Independence Day. The text "Hari Merdeka!" is prominently displayed at the top of the image. The individuals in the picture are holding Malaysian flags and are dressed in blue shirts, which is a common color associated with the Malaysian flag. The setting seems to be outdoors, possibly in a park or a similar open area, with trees visible in the background. The overall mood of the image is festive and patriotic.


And this is the output for the cat with the Malaysian flag:
QUOTE
This image depicts two cats sitting on a high structure, holding up the flag of Malaysia. The background shows a cityscape with numerous skyscrapers, suggesting that the cats are at a significant height above the ground. The scene is likely digitally manipulated to create a whimsical and imaginative setting.


And this handwritten note:
user posted image
QUOTE
The handwritten note on the paper reads:

```
Dear Lucinda,
Just a quick note to let you know how grateful I am for the beautiful flowers you dropped off last
```


Too bad it doesn't have a model for Ollama yet.
chow1942
post Sep 17 2024, 07:35 PM

Getting Started
**
Junior Member
279 posts

Joined: Dec 2006
Using the Ollama minicpm-v model for the Merdeka pics.

Can you describe this image for me please?

QUOTE
The image depicts a group of people celebrating Malaysia's Independence Day. They are standing outdoors with several Malaysian flags in the background, creating an atmosphere filled with national pride and joy. The text "Hari Merdeka!" suggests that this moment is significant for those involved as they commemorate their country’s independence day.


can you explain and describe that image in a very detail way?

[Detailed description hidden in a spoiler]



TSxxboxx
post Oct 24 2024, 10:32 AM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


Ollama now supports GGUF
https://www.tomsguide.com/ai/ollama-just-ma...ternet-required

I haven't tried it yet, but this opens up many more choices.
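
From what I understand, you can now point Ollama directly at a GGUF repo on Hugging Face with the hf.co prefix, roughly like this (the repo and quant tag below are just an example I've seen mentioned, not something I've run myself):
CODE
ollama run hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF:Q8_0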
TSxxboxx
post Oct 26 2024, 07:25 PM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


chow1942, using minicpm-v can you get all the text from this image correctly?
user posted image

I only got this much using it:
user posted image

But using one of the online servers it got very close to complete and correct:
user posted image

I wonder if my parameters are not correct, or if it's an ollama/open-webui engine issue.
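
In case it's the parameter side, my understanding is you can pin the settings down with a small Modelfile instead of relying on whatever the UI sends; the values below are guesses, not tested settings:
CODE
FROM minicpm-v
PARAMETER num_ctx 8192
PARAMETER temperature 0.1

Save that as "Modelfile", then build and run the variant with:
CODE
ollama create minicpm-v-tuned -f Modelfile
ollama run minicpm-v-tuned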

Using llama 3.2 vision on one of the online servers also gives it correctly, but then it probably runs out of tokens:
user posted image

I also tried it on ChatGPT and everything is almost correct:
user posted image
TSxxboxx
post Nov 8 2024, 11:23 AM

The mind is for having ideas, not holding them
*******
Senior Member
5,138 posts

Joined: Oct 2004
From: J@Y B33


Llama 3.2 Vision for Ollama has been released!

Trying it with the earlier table, it is better than the other models but still not complete:

user posted image


If I crop to just the table, it is better:

user posted image

If I crop it to a shorter table, it's better, but it still splits it into 2 tables:

user posted image

 
