
Cleanup READMEs and man pages.

Signed-off-by: Daniel J Walsh <dwalsh@redhat.com>
Daniel J Walsh
2025-02-10 12:19:49 -05:00
parent 0d841ec2cd
commit 8fff12a2c0
4 changed files with 11 additions and 8 deletions


@@ -98,13 +98,16 @@ curl -fsSL https://raw.githubusercontent.com/containers/ramalama/s/install.sh |
| Command | Description |
| ------------------------------------------------------ | ---------------------------------------------------------- |
| [ramalama(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama.1.md) | primary RamaLama man page |
+| [ramalama-bench(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-bench.1.md)| benchmark specified AI Model |
| [ramalama-containers(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-containers.1.md)| list all RamaLama containers |
+| [ramalama-convert(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-convert.1.md) | convert AI Model from local storage to OCI Image |
| [ramalama-info(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-info.1.md) | display RamaLama configuration information |
+| [ramalama-inspect(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-inspect.1.md) | inspect the specified AI Model |
| [ramalama-list(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-list.1.md) | list all downloaded AI Models |
| [ramalama-login(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-login.1.md) | login to remote registry |
| [ramalama-logout(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-logout.1.md) | logout from remote registry |
+| [ramalama-perplexity(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-perplexity.1.md)| calculate perplexity for specified AI Model |
| [ramalama-pull(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-pull.1.md) | pull AI Model from Model registry to local storage |
-| [ramalama-convert(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-convert.1.md) | convert AI Model from local storage to OCI Image |
| [ramalama-push(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-push.1.md) | push AI Model from local storage to remote registry |
| [ramalama-rm(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-rm.1.md) | remove AI Model from local storage |
| [ramalama-run(1)](https://github.com/containers/ramalama/blob/main/docs/ramalama-run.1.md) | run specified AI Model as a chatbot |


@@ -1,7 +1,7 @@
% ramalama-info 1
## NAME
-ramalama\-info - Display RamaLama configuration information
+ramalama\-info - display RamaLama configuration information
## SYNOPSIS


@@ -137,19 +137,19 @@ show RamaLama version
| Command | Description |
| ------------------------------------------------- | ---------------------------------------------------------- |
-| [ramalama-containers(1)](ramalama-containers.1.md)| list all RamaLama containers |
| [ramalama-bench(1)](ramalama-bench.1.md) | benchmark specified AI Model |
+| [ramalama-containers(1)](ramalama-containers.1.md)| list all RamaLama containers |
| [ramalama-convert(1)](ramalama-convert.1.md) | convert AI Models from local storage to OCI Image |
-| [ramalama-info(1)](ramalama-info.1.md) | Display RamaLama configuration information |
+| [ramalama-info(1)](ramalama-info.1.md) | display RamaLama configuration information |
| [ramalama-inspect(1)](ramalama-inspect.1.md) | inspect the specified AI Model |
| [ramalama-list(1)](ramalama-list.1.md) | list all downloaded AI Models |
| [ramalama-login(1)](ramalama-login.1.md) | login to remote registry |
| [ramalama-logout(1)](ramalama-logout.1.md) | logout from remote registry |
+| [ramalama-perplexity(1)](ramalama-perplexity.1.md)| calculate the perplexity value of an AI Model |
| [ramalama-pull(1)](ramalama-pull.1.md) | pull AI Models from Model registries to local storage |
| [ramalama-push(1)](ramalama-push.1.md) | push AI Models from local storage to remote registries |
| [ramalama-rm(1)](ramalama-rm.1.md) | remove AI Models from local storage |
| [ramalama-run(1)](ramalama-run.1.md) | run specified AI Model as a chatbot |
-| [ramalama-perplexity(1)](ramalama-perplexity.1.md)| calculate the perplexity value of an AI Model |
| [ramalama-serve(1)](ramalama-serve.1.md) | serve REST API on specified AI Model |
| [ramalama-stop(1)](ramalama-stop.1.md) | stop named container that is running AI Model |
| [ramalama-version(1)](ramalama-version.1.md) | display version of RamaLama |


@@ -238,10 +238,10 @@ def configure_subcommands(parser):
"""Add subcommand parsers to the main argument parser."""
subparsers = parser.add_subparsers(dest="subcommand")
subparsers.required = False
help_parser(subparsers)
bench_parser(subparsers)
containers_parser(subparsers)
convert_parser(subparsers)
help_parser(subparsers)
info_parser(subparsers)
inspect_parser(subparsers)
list_parser(subparsers)
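
A minimal, self-contained sketch of the argparse pattern this hunk reorders (the `bench`/`convert` handlers and the `__main__` dispatch below are illustrative stand-ins, not RamaLama's actual implementations): each `*_parser` helper registers one subcommand on the shared `subparsers` object, so moving `help_parser(subparsers)` only changes where `help` appears in the `--help` listing, not how commands dispatch.

```python
import argparse


def bench_parser(subparsers):
    # Stand-in for the real bench_parser: registers the "bench" subcommand.
    parser = subparsers.add_parser("bench", help="benchmark specified AI Model")
    parser.add_argument("model")
    parser.set_defaults(func=lambda args: print(f"benchmarking {args.model}"))


def convert_parser(subparsers):
    # Stand-in for the real convert_parser: registers the "convert" subcommand.
    parser = subparsers.add_parser("convert", help="convert AI Model from local storage to OCI Image")
    parser.add_argument("model")
    parser.set_defaults(func=lambda args: print(f"converting {args.model}"))


def configure_subcommands(parser):
    """Add subcommand parsers to the main argument parser."""
    subparsers = parser.add_subparsers(dest="subcommand")
    subparsers.required = False
    # Registration helpers are called in alphabetical order; the order only
    # affects how subcommands are listed in --help output.
    bench_parser(subparsers)
    convert_parser(subparsers)


if __name__ == "__main__":
    top = argparse.ArgumentParser(prog="ramalama")
    configure_subcommands(top)
    args = top.parse_args()
    if getattr(args, "func", None):
        args.func(args)
```
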
@@ -501,7 +501,7 @@ def list_containers(args):
def info_parser(subparsers):
-    parser = subparsers.add_parser("info", help="Display information pertaining to setup of RamaLama.")
+    parser = subparsers.add_parser("info", help="display information pertaining to setup of RamaLama.")
    parser.add_argument("--container", default=config.get('container', use_container()), help=argparse.SUPPRESS)
    parser.set_defaults(func=info_cli)
@@ -625,7 +625,7 @@ def list_cli(args):
def help_parser(subparsers):
-    parser = subparsers.add_parser("help", help="help about any command")
+    parser = subparsers.add_parser("help")
    # Do not run in a container
    parser.add_argument("--container", default=False, action="store_false", help=argparse.SUPPRESS)
    parser.set_defaults(func=help_cli)
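
Both hunks above follow the same builder pattern: `add_parser()` creates the subcommand, a `--container` option is hidden from help output with `help=argparse.SUPPRESS`, and `set_defaults(func=...)` attaches the handler the top-level parser later invokes. A hedged sketch of that pattern, using a stand-in `info_cli` handler and a hard-coded default instead of RamaLama's `config.get('container', use_container())` lookup:

```python
import argparse


def info_cli(args):
    # Stand-in handler; the real info_cli prints RamaLama's configuration.
    print(f"container={args.container}")


def info_parser(subparsers):
    parser = subparsers.add_parser("info", help="display information pertaining to setup of RamaLama.")
    # argparse.SUPPRESS keeps --container out of `info --help`; the real code
    # defaults it from config.get('container', use_container()).
    parser.add_argument("--container", default=True, help=argparse.SUPPRESS)
    parser.set_defaults(func=info_cli)


parser = argparse.ArgumentParser(prog="ramalama")
subparsers = parser.add_subparsers(dest="subcommand")
info_parser(subparsers)

args = parser.parse_args(["info"])
args.func(args)  # prints "container=True"
```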