
Commit 5a2fb1f

Post-final QA
1 parent 7ad330d commit 5a2fb1f

2 files changed: +18, -1 lines changed


ollama-python-sdk/generate_code.py

Lines changed: 17 additions & 0 deletions
@@ -0,0 +1,17 @@
+from ollama import generate
+
+prompt = """
+Write a Python function fizzbuzz(n: int) -> List[str] that:
+
+- Returns a list of strings for the numbers 1..n
+- Uses "Fizz" for multiples of 3
+- Uses "Buzz" for multiples of 5
+- Uses "FizzBuzz" for multiples of both 3 and 5
+- Uses the number itself (as a string) otherwise
+- Raises ValueError if n < 1
+
+Include type hints compatible with Python 3.8.
+"""
+
+response = generate(model="codellama:latest", prompt=prompt)
+print(response.response)
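
For reference, one implementation that satisfies the prompt above might look like the sketch below. This is not part of the commit; it is only an example of the kind of output codellama:latest is being asked to produce, and the actual generated code will vary from run to run.

from typing import List


def fizzbuzz(n: int) -> List[str]:
    """Return FizzBuzz strings for the numbers 1..n (inclusive)."""
    if n < 1:
        raise ValueError("n must be >= 1")
    result: List[str] = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # multiple of both 3 and 5
            result.append("FizzBuzz")
        elif i % 3 == 0:
            result.append("Fizz")
        elif i % 5 == 0:
            result.append("Buzz")
        else:
            result.append(str(i))  # the number itself, as a string
    return result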

ollama-python-sdk/tool_calling.py

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ def square_root(number: float) -> float:
 ]
 
 response = chat(
-    model="llama3.2:latest",
+    model="llama3.2:latest", # You may want to try this model: llama3.1:8b
     messages=messages,
     tools=[square_root], # Pass the tools along with the prompt
 )
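
For context, a minimal sketch of how the surrounding tool_calling.py likely fits together is shown below. Only the chat(...) call comes from the diff above; the square_root body, the messages list, and the tool-call handling loop are assumptions added here for illustration.

import math

from ollama import chat


def square_root(number: float) -> float:
    """Return the square root of a number (exposed to the model as a tool)."""
    return math.sqrt(number)


messages = [
    {"role": "user", "content": "What is the square root of 1764?"},
]

response = chat(
    model="llama3.2:latest",  # You may want to try this model: llama3.1:8b
    messages=messages,
    tools=[square_root],  # Pass the tools along with the prompt
)

# If the model decided to call a tool, execute it locally with the
# arguments it supplied.
for call in response.message.tool_calls or []:
    if call.function.name == "square_root":
        result = square_root(**call.function.arguments)
        print("square_root returned:", result)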
