containers/ramalama
Mirror of https://github.com/containers/ramalama.git, synced 2026-02-05 06:46:39 +01:00
Branch: main
Path: ramalama/test
Latest commit: 5c3d3781ac "Merge pull request #2359 from olliewalsh/flash_attn" by Mike Bonnet, 2026-01-29 09:37:30 -08:00
..
e2e          Merge pull request #2359 from olliewalsh/flash_attn          2026-01-29 09:37:30 -08:00
system       Use default (auto) value for llama.cpp flash-attn            2026-01-29 12:21:11 +00:00
tmt          remove whisper.cpp from all images                           2026-01-27 16:34:24 -08:00
unit         Use default (auto) value for llama.cpp flash-attn            2026-01-29 12:21:11 +00:00
__init__.py  …
ci.sh        …
conftest.py  Download safetensors models from huggingface.co with https.  2026-01-28 23:35:04 -05:00
report.md    …