Commit ca65d63

[#67] added authors shortcodes and first article ported from opensource.net (#74)
* [#67] created authors shortcodes and added under blog/news the first article from opensource.net
* [#67] fix typo
* [#67] ported the second article from opensource.net
* [#67] moved new articles in a new section
1 parent 4e5bd09 commit ca65d63

10 files changed

Lines changed: 353 additions & 0 deletions

File tree

assets/scss/_styles_project.scss

Lines changed: 40 additions & 0 deletions
@@ -184,3 +184,43 @@ aside{

```scss
// Author card styles
.author-card {
  display: flex;
  align-items: flex-start;
  background: #f5f6f7;
  border: 1px solid #e0e0e0;
  border-radius: 6px;
  padding: 16px;
  margin-bottom: 16px;
  box-shadow: 0 1px 2px rgba(0,0,0,0.03);
  gap: 16px;
  max-width: 100%;
}
.author-card img {
  width: 64px;
  height: 64px;
  border-radius: 50%;
  object-fit: cover;
  flex-shrink: 0;
  border: 2px solid #ddd;
  background: #fff;
}
.author-card .author-info {
  flex: 1;
}
.author-card .author-info strong {
  font-size: 1.1em;
}
@media (max-width: 600px) {
  .author-card {
    flex-direction: column;
    align-items: center;
    text-align: center;
  }
  .author-card .author-info {
    margin-top: 8px;
  }
}
```

content/en/blog/articles/_index.md

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@

```yaml
---
title: "Articles"
weight: 20
---
```
Lines changed: 219 additions & 0 deletions
@@ -0,0 +1,219 @@

---
title: "Building MCP Servers the Easy Way with Apache OpenServerless"
date: 2025-07-08
description: >
  How to build an MCP-compliant server using Apache OpenServerless and a custom MCP plugin.
---

It’s 2025, and apparently, if your infrastructure isn’t running on MCP servers, are you even in tech? From stealth startups to sleepy enterprises pretending to innovate, everyone claims to be “built on MCP” — or at least wishes they were. It’s the new badge of modernity.

In this guide, I’ll show how to build an MCP-compliant server using Apache OpenServerless and our custom MCP plugin. By deploying OpenServerless and using the plugin, you can quickly expose tools via the Model Context Protocol (MCP). This setup enables fast and portable AI workflows across any cloud or on-prem environment.
## The hard part about running an MCP Server

Spinning up an MCP server sounds cool, and it looks easy. But the real pain doesn’t start until after the “hello world” works. Because running an MCP server isn’t the challenge — it’s **keeping it running** and updating it.

Want to make it available on the Internet? Prepare for a joyride through SSL, firewall configs, and reverse proxies. Thinking of scaling it? That’s when the fun begins: orchestration, autoscaling, persistence, model versioning, billing — suddenly you’re less “AI pioneer” and more “distributed systems janitor.”

This is where OpenServerless with MCP truly shines: it enables fast, portable, and secure AI tool deployment with zero DevOps, seamless orchestration, and full compliance with the Model Context Protocol.

## Introducing `olaris-mcp`, the OpenServerless plugin to build MCP servers

We developed an Apache OpenServerless plugin, or more precisely an **ops** plugin, for building MCP servers with Apache OpenServerless functions. A quick reminder: **ops** is the OpenServerless CLI, and it supports plugins as a way to extend the CLI with new commands.

This plugin allows you to create an MCP-compliant server in a fully serverless way—by simply writing functions and publishing them to OpenServerless.

The plugin can run locally for development or be deployed to any server for production use. We support both local and public (published on the Internet) MCP servers. We will cover the latter in a future article, as it enables interesting scenarios to explore, such as inter-server communication.

> **Note:** In OpenServerless, a single MCP server is a *package*: a collection of tools, prompts, and resources, each represented as a distinct OpenServerless function. That means one server is always split into a number of microservices.
## Installing the MCP Plugin for OpenServerless

As we said, it’s an `ops` plugin and can be installed directly using:

```shell
$ ops -plugin https://github.com/mastrogpt/olaris-mcp
```

To verify that the plugin has been installed correctly, run:

```shell
$ ops mcp
```

You should see the following usage synopsis (shortened):

```
Usage:
mcp new <package> [<description>] (--tool=<tool>|--resource=<resource>|--prompt=<prompt>|--clean=<clean>) [--redis] [--postgres] [--milvus] [--s3]
mcp run <package> [--sse]
mcp test <package> [--sample] [--norun]
mcp install [<package>] [--cursor] [--claude] [--5ire] [--uninstall]
mcp inspect <package> [--sse]
```

Let’s see in detail what the available commands do:

- `ops mcp new` – Create a new MCP package with a tool, prompt, or resource.
- `ops mcp run` – Run the specified package as an MCP server.
- `ops mcp test` – Test the generated MCP server via the CLI.
- `ops mcp inspect` – Launch the MCP web inspector for the specified package.
- `ops mcp install` – Install or uninstall the MCP server locally in Cursor, Claude, or 5ire environments.
## Creating a new MCP Server with a serverless function

Let’s walk through the steps to create a simple MCP server – for example, one that provides weather information for any location in the world.

We’ll start by creating a serverless function that acts as a proxy, using the following command:

```shell
$ ops mcp new demomcp --tool=weather
```

This command initializes a new MCP package named `demomcp` and defines a tool called `weather`.

Next, you’ll need to describe your MCP tool using metadata annotations. These annotations define the tool type, description, and input parameters:

```text
#-a mcp:type tool
#-a mcp:desc "Provides weather information for a given location"
#-a input:str "The location to retrieve weather data for"
```
## Implementing a Weather Function

Now it’s time to implement the logic for your weather function.

You can use generative AI to get the required code quickly. For instance, the following prompt can help you generate a simple function that retrieves weather information:

```text
AI Prompt:
A Python function get_weather(location) using requests and open-meteo.com that retrieves the given location, selects the first match, then fetches and returns the weather information for that location.
```

We do not include the generated implementation here; ChatGPT typically returns a valid and usable function.
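If you prefer not to generate it, here is one possible sketch of such a function. It uses Open-Meteo’s free geocoding and forecast endpoints with the `requests` library; error handling is deliberately minimal, and the exact output of an AI-generated version may differ:

```python
import requests

def get_weather(location):
    """Return the current weather for the best match of `location` via open-meteo.com."""
    # Resolve the location name to coordinates with the Open-Meteo geocoding API
    geo = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params={"name": location, "count": 1},
        timeout=10,
    ).json()
    results = geo.get("results")
    if not results:
        return f"Could not find location: {location}"
    place = results[0]
    # Fetch the current weather for the first match
    forecast = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": place["latitude"],
            "longitude": place["longitude"],
            "current_weather": "true",
        },
        timeout=10,
    ).json()
    weather = forecast.get("current_weather", {})
    weather["location"] = f"{place['name']}, {place.get('country', '')}"
    return weather
```

The returned dictionary mirrors Open-Meteo’s `current_weather` payload, which is where fields such as `temperature`, `weathercode` and `windspeed` come from in the invocation examples shown later.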
Assuming you’ve implemented a `get_weather(location)` function, you can now create a wrapper to handle MCP-style invocation:

```python
def weather(args):
    inp = args.get("input", "")
    if inp:
        out = get_weather(inp)
    else:
        out = "Please provide a location to get the weather information for."
    return {"output": out}
```
### Deploy and Test the Function

You can deploy and test your MCP function as follows:

```shell
$ ops ide deploy demomcp/weather
ok: updated action demomcp/weather

$ ops invoke demomcp/weather
{
    "output": "Please provide a location to get the weather information for."
}

$ ops invoke demomcp/weather input=Rome
{
    "output": {
        "location": "Rome, Italy",
        "temperature": 26.0,
        "time": "2025-06-22T06:45",
        "weathercode": 2,
        "winddirection": 360,
        "windspeed": 2.9
    }
}

$ ops invoke demomcp/weather input=NonExistingCity
{
    "output": "Could not find location: NonExistingCity"
}
```
## Testing the MCP Server

Your MCP server is now up and running, and you can test it using the graphical inspector with the following command:

```shell
$ ops mcp inspect demomcp
```

The Inspector connects to your MCP server, lists the available tools and resources, and allows you to test their behavior interactively.

<img src="/ops-mcp-testing.webp" alt="Apache OpenServerless Inspector" class="mb-2 img-fluid">
## Using the MCP Server

Your MCP server is now ready to be integrated into any chat interface that supports MCP servers.

In this example, we use [5ire](https://5ire.app/), a free AI assistant and MCP client that provides an excellent environment for running and testing MCP tools.

### Step 1: Install the `ops` CLI

First, install the `ops` CLI. You can find installation instructions on the [OpenServerless installation page](https://openserverless.apache.org/docs/installation/download/).

### Step 2: Add the MCP Plugin

Install the MCP plugin using:

```shell
$ ops -plugin https://github.com/mastrogpt/olaris-mcp
```

### Step 3: Log in to Your OpenServerless Account

Use the following command to authenticate:

```shell
$ ops ide login
```

### Step 4: Install the MCP Server into 5ire

Deploy your toolset to 5ire with:

```shell
$ ops mcp install demomcp --5ire
```

You’re all set! Now you can access your 5ire client and use the deployed MCP server in real conversations.

Let’s walk through how the tool works in practice:
<img src="/ops-mcp-using.webp" alt="Apache OpenServerless Testing it step-by-step" class="mb-2 img-fluid">
## Testing it step-by-step

1. **Ask a Chatbot**
   Ask a chatbot for the weather in Rome. It will likely reply that, as a language model, it doesn’t have up-to-date weather information.
2. **Open the Tool List**
   In the 5ire interface, open the list of available MCP tools.
3. **Enable the MCP Tool**
   Locate your toolset (`demomcp`) and enable it.
4. **Ask Again**
   Now that the tool is active, ask the chatbot again: “What’s the weather in Rome?”
5. **Observe What Happens**
   Behind the scenes, the LLM invokes the MCP server, which triggers the serverless function that retrieves live weather data.
6. **Success!**
   You’ve successfully extended your LLM to provide **real-time weather information** for any location in the world.
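For the curious, step 5 boils down to a JSON-RPC exchange under the hood. The tool call the client sends looks roughly like this (the shape follows the Model Context Protocol `tools/call` method; the concrete values are illustrative):

```python
import json

# Illustrative MCP "tools/call" request, as a client such as 5ire might send it.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "weather",               # the tool exposed by the demomcp package
        "arguments": {"input": "Rome"},  # maps to the function's `input` parameter
    },
}
print(json.dumps(request, indent=2))
```

The server routes this request to the corresponding OpenServerless function and wraps its return value in the JSON-RPC response.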
## Conclusion

With Apache OpenServerless, we showed how to build and deploy a serverless MCP server in minutes, bypassing all the complex system configuration.

This example covered only the local MCP server configuration. A more powerful setup uses public MCP servers, which enable inter-server communication via agent interaction protocols.

This is just the beginning. Public MCP servers open the door to multi-agent interactions, federation, and more.

Stay tuned for more updates from Apache OpenServerless!

## Authors

{{< authors/michele >}}
{{< authors/bruno >}}
Lines changed: 72 additions & 0 deletions
@@ -0,0 +1,72 @@

---
title: "Apache OpenServerless is the easiest way to build your cloud native AI application"
date: 2025-05-15
description: >
  Meet this portable, self-contained and complete cloud-native serverless platform built on Kubernetes.
---
If you have never heard of it, you may wonder: what is Apache OpenServerless?
The short answer is: a portable, self-contained and complete cloud-native serverless platform, built on top of Kubernetes and especially suitable for developing production-ready AI applications with minimal effort. Because of its portability and availability in every environment, including air-gapped ones, it shines when you have strong privacy and security constraints and need to build Private AI applications.

OpenServerless embraces the functional programming paradigm, enabling developers to build modular, stateless functions ideal for scalable AI workloads. This model aligns naturally with serverless architecture and simplifies the integration of both public and private LLMs: developers can invoke proprietary APIs like OpenAI, or deploy and run private models locally, ensuring full control over sensitive data. A key strength is its ability to run GPU-accelerated runtimes, allowing code to execute directly on GPUs for high-performance inference or training tasks.
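To make the paradigm concrete, a function in this model is just a stateless handler. In OpenWhisk, and thus in OpenServerless, a minimal Python action looks like the sketch below; a real AI action would call a public or private LLM where the stub is:

```python
# A minimal OpenWhisk/OpenServerless-style Python action: a stateless
# function named `main` that takes a dict of arguments and returns a dict.
def main(args):
    question = args.get("input", "")
    if not question:
        return {"output": "Please ask a question."}
    # In a real AI action you would call an LLM here; we stub the
    # answer to keep the sketch self-contained.
    answer = f"You asked: {question}"
    return {"output": answer}
```

Because the function holds no state, the platform can instantiate, scale, and tear down copies of it on demand.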
## The origins of Apache OpenServerless

The project [Apache OpenServerless](https://openserverless.apache.org/) is closely related to another serverless project: [Apache OpenWhisk](https://openwhisk.apache.org/). OpenWhisk is a portable serverless engine originally developed and open sourced by IBM, and later adopted and further developed by vendors of the calibre of Adobe, Naver, and DigitalOcean.

OpenWhisk is an excellent foundation for providing FaaS services, and indeed many cloud providers have adopted it as their serverless engine. It is also widely used in academia for research on serverless computing. It is highly scalable and extremely robust and reliable. However, OpenWhisk is not yet widely used beyond those settings because it is not, in itself, a full platform: it is only a FaaS service and, while cloud providers use it, their users have little interest in making it available for wider use.

A team of contributors to OpenWhisk, working initially with the startup Nimbella (acquired by DigitalOcean), and later with Nuvolaris, developed it further to make it widely accessible and more useful out of the box, adding all the required components with the goal of making it a complete serverless environment. Indeed, serverless is generally most useful when it is coupled with storage, cache, database and frontend. Given the popularity of LLM-based application development, it has also been extended to fully support the development of AI applications.

The project was then donated to the [Apache Software Foundation](https://apache.org/) and released as Apache OpenServerless. Note that in this text we sometimes omit “Apache” from the name, but always keep in mind that the full names are, respectively, Apache OpenWhisk and Apache OpenServerless, as they are both projects of the Apache Software Foundation.
## What is in Apache OpenServerless?

To clarify the difference between OpenWhisk and OpenServerless, think of it this way: if OpenWhisk were Linux, then OpenServerless would be Ubuntu. In short, it is a distribution of OpenWhisk providing a Kubernetes operator to install and manage it, a rich CLI with integrated installation and development tools, and a collection of starters to build AI applications.

You can see what is in OpenServerless in the picture below:

<img src="/openserverless-diagram.webp" alt="Apache OpenServerless Architecture" class="mb-2 img-fluid">

As you can see, at the core there is OpenWhisk, providing the scalable FaaS service: a set of controllers accepting requests and queuing them in [Kafka](https://kafka.apache.org/), and a set of invokers serving the requests on demand by instantiating runtimes. OpenServerless adds a [Kubernetes](https://kubernetes.io/) operator that manages the whole system. The main purpose of the operator is to deploy OpenWhisk, but it also deploys the integrated services. At the moment these are Redis (in the open [ValKey](https://valkey.io/) flavour), [PostgreSQL](https://www.postgresql.org/) (SQL database) with the MongoDB-compatible adapter [FerretDB](https://ferretdb.io/) (NoSQL), the vector database [Milvus](https://milvus.io/), and an S3-compatible object storage service; we currently support both [MinIO](https://min.io/) and [Ceph](https://ceph.io/) as backends.

We also have a special service, called the streamer, designed to support server-sent events (SSE), commonly used in AI applications to stream answers from LLMs.

The operator is actually pretty powerful: it is configurable and manages resources on behalf of users, creating databases, buckets and Redis prefixes in the environment it controls, along with the secrets to access them.

OpenWhisk has a large set of runtimes, but instead of supporting all of them we focused on and optimized the most used languages, typically Python, JavaScript and PHP, and provided a rich set of libraries for using the integrated services.

The operator is controlled by a rich CLI called ops. The name is a pun: it is short for OPenServerless, but also for Operations… and it is what you say (“OoooPS!”) when you make a mistake. The CLI completes the picture: it is extremely powerful and even expandable with plugins. It manages serverless resources as in OpenWhisk, but also includes the ability to install OpenServerless on multiple cloud providers and integrates powerful development tools. We will discuss it in more detail later.
## Installation and configuration

Let’s start with installation. You install OpenWhisk with a Helm chart on a set of well-known Kubernetes clusters, such as Amazon EKS, IBM IKS or OpenShift v4. You need a Kubernetes cluster, which must also be configured properly, and the installer only installs the engine, with no other services.

The OpenServerless CLI is more complete. It installs OpenWhisk by deploying the operator in a Kubernetes cluster and sending it a configuration. But it is also able to create a suitable cluster for you.

Indeed, the documentation explains how to prepare a Kubernetes cluster on Amazon AWS, Microsoft Azure and Google GCP using the ops CLI: there is an interactive configuration step, after which ops builds a suitable cluster with all the parameters in place to install OpenServerless on it.

When installing OpenServerless, you can also select which services you want to enable, along with many essential configuration parameters. All of this is done with the ops CLI, by setting the configuration parameters before performing the installation.

After the installation, the CLI is useful for administering the cluster, adding new users, and so on. Note that each user gets a complete set of services: you create not only an area (called a namespace) for serverless functions, but also a SQL database (with a NoSQL adapter), a vector database, a bucket for public web content and another for private data, and a Redis prefix (to isolate your keys in Redis).

Note that the system supports a public area for web content using a DNS configuration. You need a DNS domain for an OpenServerless installation, and you usually need to point the root of the domain (@) and a wildcard (*) to a load balancer accessing it. Each user has a different web area to upload their web content, and a mapping to their serverless functions ('/api/my') suitable for deploying SPA applications with a serverless backend.
## Development tools

So far so good, but the picture would not be complete without suitable development tools. You can deploy each function easily, but it is a bit painful to have to deploy every function separately. Furthermore, you have to provide each function with options to change the runtime type, memory constraints, timeouts, and so on. OpenWhisk supports a manifest format to do that, but does not offer other facilities for deployment.

It is still possible to use the manifest, but we also added a configuration system based on conventions: just put your code in directories and the system will automatically build and deploy it.
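As an illustration of such a convention-based layout, a project might look like the sketch below. The directory and file names here are hypothetical; check the OpenServerless documentation for the exact layout the tooling expects:

```text
packages/
  demomcp/              # package name
    weather/            # action name: deployed as demomcp/weather
      __main__.py       # function code
      requirements.txt  # dependencies, resolved at build time
web/                    # static frontend, uploaded to the public bucket
```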
In this case, too, the superpowers of the ops CLI come to the rescue! The ops development tools allow us to incrementally publish all the functions we have written, to manage their dependencies and annotations during publication, and to publish the web part of our application. Furthermore, it is possible to integrate the build scripts of an Angular, React, or Svelte application so that they are invoked during the publication process. Other useful tools let us handle and interact with the integrated services (PostgreSQL, MinIO, Redis).
## Conclusions and a new beginning

All of this looks interesting, but it is actually just the starting point for building AI applications, as this is our main focus. OpenServerless lays the groundwork by providing a flexible, event-driven foundation, but its real power emerges when applied to AI-centric workflows.

Our primary goal is to enable developers and data scientists to move beyond basic automation and toward complex AI systems that integrate reasoning, natural language understanding, and data processing. OpenServerless becomes a powerful platform for rapid experimentation, secure deployment, and scalable AI services. From RAG pipelines to autonomous agents, this environment is designed to evolve with the needs of modern AI, turning abstract ideas into production-ready solutions without the usual overhead of managing infrastructure or sacrificing control.

## Authors

{{< authors/michele >}}
{{< authors/bruno >}}

content/en/ops-mcp-testing.webp (89.6 KB)

content/en/ops-mcp-using.webp (79.8 KB)
