Sunday, November 30, 2025

With Genies, You Can Talk to Your Mainframe in Natural Language


I’m Gil. I’ve been in the mainframe space for 27 years now. I worked for IBM in storage development and at a few other customer shops. Many people know me as the founder of Model9. I started Model9 in 2016; we connected mainframes to the cloud. We sold Model9 to BMC three years ago, and after a short time at BMC, I left to start Genies.

Interviewer:
So what led you to start Genies?

Gil:
My entire career, I’ve focused on bringing innovative technologies to the mainframe world. When I left BMC, the hottest trend in the industry was AI. We were talking to customers, listening to their pain points, and realized there was an opportunity to do it again, this time with AI and the mainframe.

Interviewer:
Tell me more about the market gap you saw and what inspired you.

Gil:
When we looked at the AI–mainframe landscape, we saw vendors focusing on AI on z/OS, which is great, and others who aren’t even mainframe vendors trying to help with things like code and connectivity. But the big gap was this: we all use ChatGPT in our personal lives, but how do we bring that into our professional lives? More broadly, how do we connect LLMs to mainframe environments to derive business value for the companies we work for? That became something customers were excited about, and we went for it.

Interviewer:
Were there things you’d been working toward for a while that led you here, or was it pure curiosity?

Gil:
At Genies, we try to leverage the unique assets companies have on the mainframe. I started in storage, moved into data, and mainframe data is incredibly interesting and important. It’s the most up-to-date data in the company, coming from real-time transactions hitting the system, plus decades of business history stored on the mainframe. These are treasure troves for AI. We understand data and storage, and we saw an opportunity to build something valuable.

Interviewer:
For someone just hearing about Genies, what do you do?

Gil:
We call it connecting LLMs and AI agents to real-time mainframe data. Think of it as a simple, innovative way to connect mainframe data to the latest applications companies are building, which today means AI applications. They involve LLMs, agents, and bridging the gap to actual mainframe data to solve specific mainframe problems securely and at scale. Everything you’d expect from a mainframe product: that’s where we come in.

Interviewer:
So how are you connecting generative AI to the mainframe?

Gil:
It’s not one trick. Some customers initially assume it’s magic: plug it in and you’re done. I wish it were that simple. We leverage industry standards. From the AI side, we’re compatible and easy to plug into; one example is the MCP protocol we use. But that’s not the end of the story. There’s a lot of technology around securing the connection, doing it at scale, ensuring performance, and making it cost-effective. It’s a whole framework, not just a bridge.
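To make the MCP side of that concrete, here is a minimal sketch of the shape of an MCP-style `tools/call` exchange. This is purely illustrative and is not Genies’ actual implementation: the tool name `query_smf_jobs` and the in-memory records are made up for the example, and a real server would sit on z/OS in front of live data.

```python
import json

# Hypothetical in-memory stand-in for a mainframe data source; a real
# deployment would query live SMF, DB2, or OPERLOG data on z/OS.
FAKE_SMF_JOBS = [
    {"jobname": "PAYROLL1", "cpu_seconds": 41.2},
    {"jobname": "BILLING2", "cpu_seconds": 7.9},
]

def handle_tool_call(request_json: str) -> str:
    """Dispatch one MCP-style JSON-RPC 2.0 'tools/call' request."""
    req = json.loads(request_json)
    rid = req.get("id")
    if req.get("method") != "tools/call":
        return json.dumps({"jsonrpc": "2.0", "id": rid,
                           "error": {"code": -32601, "message": "method not found"}})
    params = req["params"]
    if params["name"] == "query_smf_jobs":  # hypothetical tool name
        threshold = params["arguments"].get("min_cpu_seconds", 0)
        rows = [j for j in FAKE_SMF_JOBS if j["cpu_seconds"] >= threshold]
        # MCP tool results carry a list of typed content blocks.
        result = {"content": [{"type": "text", "text": json.dumps(rows)}]}
        return json.dumps({"jsonrpc": "2.0", "id": rid, "result": result})
    return json.dumps({"jsonrpc": "2.0", "id": rid,
                       "error": {"code": -32602, "message": "unknown tool"}})

# An agent asks for jobs burning more than 10 CPU seconds.
reply = handle_tool_call(json.dumps({
    "jsonrpc": "2.0", "id": 1, "method": "tools/call",
    "params": {"name": "query_smf_jobs",
               "arguments": {"min_cpu_seconds": 10}},
}))
```

The point of the standard is exactly what Gil describes: any MCP-speaking agent can discover and call such a tool without knowing anything about the data source behind it.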

Interviewer:
How is your approach different from how others approach AI on the mainframe?

Gil:
Our team has deep mainframe skills. Unlike companies trying to modernize the mainframe from the outside, we leverage the strengths and know-how of the mainframe itself. We write code that runs on the mainframe. Our product runs on the mainframe. We tackle real mainframe challenges: accessing hard-to-reach data, securing it using native mainframe capabilities, and working within the platform’s architecture. Running on the mainframe gives us an advantage; there are things you simply can’t do from the outside.

Interviewer:
You mentioned this is about the data. What does that mean when you say you focus on the data first?

Gil:
The mainframe holds the most up-to-date data in the company, as well as all of its historical data. That could be DB2, IMS, VSAM files, operational logs like OPERLOG, or SMF data. Customers tell us that accessing and leveraging that data in an AI application is a major challenge. So we focus on delivering that data to AI applications.

Interviewer:
You’re here at GSUK talking about Genies and what it does. Tell me what’s exciting customers and what sets you apart from other AI-on-mainframe attempts.

Gil:
Customers tell us that for the first time, they can use generative AI for their mainframe use cases and mainframe data. They already use ChatGPT personally, so the idea of asking natural-language questions about mainframe data feels like magic. For example, if a tax law changes, instead of writing new mainframe code, they can ask an LLM what changed and what it means for their system. Operationally, they can ask what’s happening in their system right now, look at logs, and ask where the problem is. One customer even said we picked a good name; it feels like a genie.

Interviewer:
Does this make mainframe data more accessible within an organization?

Gil:
Yes. Customers adopt our solution because they want real-time access to all mainframe data. Historically, companies tried to move some data off the mainframe, but nobody moves all of it; there isn’t enough time in the day. And real-time replication solutions don’t scale to all mainframe data. Our approach queries data at the source, on the mainframe, in real time and returns results immediately.

Interviewer:
What are clients saying about the business value?

Gil:
Some use it to analyze system activity: when they hit a problem, they use an LLM to analyze audit logs or system logs and determine what happened. That reduces downtime, which has a clear dollar value. Others use it for productivity. If they don’t have to write new mainframe code to build an AI application, that’s faster time-to-market. It frees staff to do other work. And in a world where staffing budgets are constantly being cut, doing more with the same workforce is essential. AI is a race: leadership tells teams to show how they’ll use AI to become more efficient, serve customers better, and beat competitors. We plug into those initiatives.
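The log-analysis pattern Gil describes boils down to gathering system log records and framing them as context for an LLM. A rough sketch, with made-up log lines and names (this is not Genies’ API, and the actual call to the shop’s approved LLM is left out):

```python
# Hypothetical log lines standing in for OPERLOG/syslog output on z/OS.
LOG_LINES = [
    "IEF403I PAYROLL1 - STARTED",
    "IEC141I 013-18,IGG0191B,PAYROLL1,STEP1",  # example open failure
    "IEF404I PAYROLL1 - ENDED",
]

def build_analysis_prompt(lines, question):
    """Wrap raw system log lines in a prompt for an LLM to analyze."""
    log_block = "\n".join(lines)
    return (
        "You are assisting a z/OS systems programmer.\n"
        "System log excerpt:\n"
        f"{log_block}\n\n"
        f"Question: {question}\n"
    )

prompt = build_analysis_prompt(LOG_LINES, "Why did PAYROLL1 fail?")
```

The resulting prompt would then be sent to whatever LLM the organization has approved; the hard parts in practice are the ones Gil lists elsewhere: getting at the logs securely, at scale, and in real time.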

Interviewer:
The z16 and z17 releases added on-chip AI acceleration. Are you leveraging that?

Gil:
One key differentiator for Genies is that we run on the mainframe. So we leverage all of the hardware advancements: zIIP engines, Linux environments, and the built-in acceleration, security, performance, and scalability. We benefit from all of it.

Interviewer:
You mentioned you developed and deployed MCP on the mainframe. Why is that significant?

Gil:
MCP is a new and important technology in generative AI because it ties everything together: the LLMs, the data, the agents. Without it, you miss out on a lot of value. But it’s not just MCP. Many Python packages used in AI aren’t available on the mainframe. Many tools aren’t available yet. Instead of trying to port everything to z/OS, we connect those tools to the mainframe data. If a gen-AI leader releases something new tomorrow, it already works with the Genies framework.

Interviewer:
What are you hearing from people actually using Genies?

Gil:
Many use cases. One customer asked if they could check for security vulnerabilities in their system using Genies. Yes: that means we process data from SMP/E, security logs, or other sources, feed it to an LLM, and provide analysis. That saves them time. Another said that even when they get graphs from system data, they don’t understand the meaning. Now they can ask an LLM to explain what’s happening. It saves time and helps them do their jobs.

Interviewer:
Where do you see AI going next?

Gil:
AI and the mainframe will inevitably intertwine. The mainframe runs companies’ most critical workloads. At the same time, companies are pushing to leverage AI to serve customers better and operate more efficiently. These worlds have to connect. We aim to help customers prepare for the rapid advancements in generative AI so they can leverage them in their mainframe environments.

Interviewer:
What advice would you give companies that want to use AI with their mainframe but don’t know where to start?

Gil:
Keep an open mind. Educate yourself about generative AI and what it can do, even outside the mainframe. Just as you discovered ChatGPT in your personal life, take a course, come to events like GSUK, learn what AI can do. Once people understand the security controls and how solutions like ours connect safely to the mainframe, they’re more willing to trust it and try it.

Interviewer:
Where can people learn more about Genies or see the technology in action?

Gil:
We don’t just talk; we show. The best place is our YouTube channel, Genies AI, where you can see several demos of the product in action and the value it brings to mainframe professionals.

Interviewer:
What’s next for Genies AI?

Gil:
We’re very excited. We’re hearing great feedback from customers. We just received an investment from another VC. We’re bringing innovation to this market, we’re hiring, we’re growing, and soon you’ll see some very cool things coming from us.

Interviewer:
That sounds great. I’m looking forward to it.

Gil:
Thank you.