Quarkus MCP Server that indexes Red Hat product documentation and the "IA Development From Zero To Hero" workshop for OpenShift Lightspeed.
Quarkus MCP Server for OpenShift Lightspeed
Intelligent documentation at your chat's fingertips
English | Español
Showroom Docs MCP Server is a Model Context Protocol (MCP) server built with Quarkus that indexes and exposes the documentation of the "IA Development From Zero To Hero" (Neuralbank) workshop, together with a partial extract of the official documentation for 9 Red Hat products, so that OpenShift Lightspeed can answer questions grounded in that context.
Note: The bundled Red Hat product documentation is a partial extract and may not reflect the latest updates; Red Hat publishes updates to its official documentation frequently. Always consult docs.redhat.com for the most current and complete information.
| Source | Content |
|---|---|
| Neuralbank workshop | Business case, MCP agents, Golden Path, DevSpaces, Keycloak, Connectivity Link, MCP Inspector, Deploy, OpenTelemetry, RAG |
| OpenShift Service Mesh 3.3 | About, Installing, Observability, Gateways, Updating |
| Connectivity Link 1.3 | Discover, Install, Configure, Observe |
| Developer Hub 1.9 | Install, Configure, Auth, GitHub, MCP Tools |
| OpenShift Lightspeed 1.0 | Install, Configure, Operate, Troubleshoot |
| OpenShift Observability 1 | Overview hub |
| OpenTelemetry 3.9 | Install, Collector, Instrumentation |
| OpenShift Pipelines 1.21 | Install, Pipelines as Code, CI/CD |
| API Management 1 | Getting started, Administering |
| OpenShift AI Cloud Service 1 | Data Science, Model Serving, Llama Stack |
```shell
helm repo add showroom-docs-mcp \
  https://maximilianopizarro.github.io/showroom-docs-mcp/
helm repo update

helm install showroom-docs-mcp showroom-docs-mcp/showroom-docs-mcp \
  --namespace openshift-lightspeed \
  --create-namespace \
  --set image.pullPolicy=Always
```
```shell
oc get pods -n openshift-lightspeed -l app=showroom-docs-mcp
# Expected: 1/1 Running
```
```shell
oc create secret generic ols-llm-credentials \
  -n openshift-lightspeed \
  --from-literal=apitoken=<your-api-token>
```
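If you prefer declarative management, the same secret can be expressed as a manifest (a sketch; `<your-api-token>` is a placeholder to replace, and `stringData` accepts the token without base64 encoding):

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: ols-llm-credentials
  namespace: openshift-lightspeed
type: Opaque
stringData:
  apitoken: <your-api-token>  # placeholder; replace with your real token
```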
```yaml
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
metadata:
  name: cluster
spec:
  featureGates:
    - MCPServer
  llm:
    providers:
      - credentialsSecretRef:
          name: ols-llm-credentials
        models:
          - name: llama-32-3b-instruct
            parameters:
              maxTokensForResponse: 8192
        name: red_hat_openshift_ai
        type: rhoai_vllm
        url: 'http://llama-32-3b-instruct-openai.my-first-model.svc.cluster.local/v1'
  mcpServers:
    - name: showroom-docs-mcp
      timeout: 10
      url: 'http://showroom-docs-mcp.openshift-lightspeed.svc.cluster.local:8080/mcp'
  ols:
    conversationCache:
      postgres:
        maxConnections: 2000
        sharedBuffers: 256MB
      type: postgres
    defaultModel: llama-32-3b-instruct
    defaultProvider: red_hat_openshift_ai
    deployment:
      api:
        replicas: 1
      console:
        replicas: 1
      dataCollector: {}
      database:
        replicas: 1
      llamaStack: {}
      mcpServer: {}
    logLevel: INFO
    userDataCollection: {}
  olsDataCollector:
    logLevel: INFO
```
Important: use `/mcp` (Streamable HTTP), not `/mcp/sse`.
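To sanity-check the endpoint before wiring it into OLSConfig, you can send the MCP `initialize` handshake over Streamable HTTP with curl. This is a sketch: it assumes in-cluster access to the Service (e.g. from a debug pod), and the protocol version shown is one example MCP revision.

```shell
# Hypothetical probe: POST a JSON-RPC initialize request to the /mcp endpoint.
# Streamable HTTP clients must accept both JSON and SSE responses.
payload='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"probe","version":"0.0.1"}}}'
curl -s -X POST \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d "$payload" \
  http://showroom-docs-mcp.openshift-lightspeed.svc.cluster.local:8080/mcp
```

A JSON-RPC result (rather than a 404 or an SSE-only error) indicates the Streamable HTTP endpoint is live.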
```shell
oc apply -f cluster-ols.yml
```
```shell
oc logs -n openshift-lightspeed deploy/lightspeed-app-server \
  -c lightspeed-service-api | grep "tools from MCP"
# Expected:
# Loaded 4 tools from MCP server 'showroom-docs-mcp'
```
Open the OpenShift web console, click the Lightspeed chat icon, and ask about the indexed documentation.
Once deployed, open the OpenShift Lightspeed chat and try questions such as:
Complete installation guide – Architecture – OLSConfig configuration