Commit a921772

Fix the prompt typo in llm.py

1 parent 5be771e · commit a921772
1 file changed: 1 addition & 1 deletion

litecli/packages/special/llm.py
@@ -222,7 +222,7 @@ def handle_llm(text, cur) -> Tuple[str, Optional[str]]:
     if "-c" in parts:
         capture_output = True
         use_context = False
-    # If the parts has `pormpt` command without `-c` then use context to the prompt.
+    # If the parts has `prompt` command without `-c` then use context to the prompt.
     # \llm -m ollama prompt "Most visited urls?"
     elif "prompt" in parts:  # User might invoke prompt with an option flag in the first argument.
         capture_output = True
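The hunk above decides whether a `\llm` invocation should capture output and whether database context is attached to the prompt. A minimal standalone sketch of that branch logic follows; `parse_llm_flags` and the default values of `capture_output`/`use_context` are assumptions for illustration, not litecli's actual API:

```python
def parse_llm_flags(parts):
    """Sketch of the flag handling diffed in litecli/packages/special/llm.py.

    `parts` is assumed to be the tokenized \\llm argument list,
    e.g. ["-m", "ollama", "prompt", "Most visited urls?"].
    Defaults below are illustrative guesses, not the real source.
    """
    capture_output = False
    use_context = True
    if "-c" in parts:
        # Explicit -c: capture the output but do not add context.
        capture_output = True
        use_context = False
    elif "prompt" in parts:
        # A `prompt` command without `-c`: capture output and keep
        # context attached, e.g. \llm -m ollama prompt "Most visited urls?"
        capture_output = True
    return capture_output, use_context
```

Note that `elif` ordering matters here: when both `-c` and `prompt` appear, the `-c` branch wins and context is suppressed, which is exactly the behavior the corrected comment describes.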

0 commit comments
