---
title: Introduction
description: RamaLama strives to make working with AI simple, straightforward, and familiar by using OCI containers.
sidebar_position: 0
---
<div style={{textAlign: "center"}}>
<img src="https://github.com/user-attachments/assets/1a338ecf-dc84-4495-8c70-16882955da47" style={{width: "50%"}} />
</div>
[RamaLama](https://ramalama.ai) strives to make working with AI simple, straightforward, and familiar by using OCI containers.
## Description
RamaLama is an open-source tool that simplifies running and serving AI models locally, from any source, through the familiar approach of containers. It lets engineers apply container-centric development patterns, and their benefits, to AI use cases.
RamaLama eliminates the need to configure the host system: instead, it pulls a container image matched to the GPUs discovered on the host, letting you work with a variety of models and platforms.
- Eliminates the complexity for users to configure the host system for AI.
- Detects and pulls an accelerated container image specific to the GPUs on the host system, handling dependencies and hardware optimization.
- Supports multiple AI model registries, including OCI container registries.
- Treats models the way Podman and Docker treat container images.
- Use common container commands to work with AI models.
- Run AI models securely in rootless containers, isolating the model from the underlying host.
- Keep data secure by defaulting to no network access and removing all temporary data on application exit.
- Interact with models via REST API or as a chatbot.
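The container-style workflow described above can be sketched with a few RamaLama CLI commands. This is a minimal illustration, not a full reference; the model name is an example, and `ollama://` is one of several supported registry transports:

```shell
# Pull a model, much like pulling a container image.
ramalama pull ollama://tinyllama

# Chat with the model interactively; by default it runs
# inside a rootless container, isolated from the host.
ramalama run ollama://tinyllama

# Serve the model over a REST API instead of a chat prompt.
ramalama serve ollama://tinyllama

# List locally stored models, similar to "podman images".
ramalama list
```

Because models are addressed with registry-style references, switching model sources is a matter of changing the transport prefix rather than reconfiguring the host.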
## Contributors
Open to contributors
<a href="https://github.com/containers/ramalama/graphs/contributors">
<img src="https://contrib.rocks/image?repo=containers/ramalama" />
</a>