We have now reached the end of our series on deploying to OCI using the Hackathon Starter Kit.
For this final article, we will see how to deploy an application using Helidon (Java), the MySQL REST Service, and OCI GenAI with LangChain4j.
We use Helidon because it is a cool, open-source framework developed by Oracle. It is lightweight and fast. The latest version, 4.3.0, includes AI-related features such as LangChain4j, which allows us to integrate AI capabilities into our application via OCI AI Services.
Prerequisites
The first thing we need to do is verify that we have the required Java and Maven versions: Java 21 and the corresponding Maven version.
We also need our OCI config file to access the AI Services.
We need the Sakila sample database.
And finally, we need the code of our application, which is available on GitHub: helidon-mrs-ai
Java 21 and Maven
We check what is installed on our compute instance (we can refer to part 3 of the series):
[opc@webserver ~]$ java -version
openjdk version "17.0.17" 2025-10-21 LTS
OpenJDK Runtime Environment (Red_Hat-17.0.17.0.10-1.0.1) (build 17.0.17+10-LTS)
OpenJDK 64-Bit Server VM (Red_Hat-17.0.17.0.10-1.0.1) (build 17.0.17+10-LTS, mixed mode, sharing)
For Helidon 4.3.0, we need Java 21, so as we saw in part 3, we have to change the default Java to 21:
[opc@webserver ~]$ sudo update-alternatives --config java
There are 2 programs which provide 'java'.
  Selection    Command
-----------------------------------------------
*+ 1           java-17-openjdk.aarch64 (/usr/lib/jvm/java-17-openjdk-17.0.17.0.10-1.0.1.el9.aarch64/bin/java)
   2           java-21-openjdk.aarch64 (/usr/lib/jvm/java-21-openjdk-21.0.9.0.10-1.0.1.el9.aarch64/bin/java)
Enter to keep the current selection[+], or type selection number: 2
[opc@webserver ~]$ java -version
openjdk version "21.0.9" 2025-10-21 LTS
OpenJDK Runtime Environment (Red_Hat-21.0.9.0.10-1.0.1) (build 21.0.9+10-LTS)
OpenJDK 64-Bit Server VM (Red_Hat-21.0.9.0.10-1.0.1) (build 21.0.9+10-LTS, mixed mode, sharing)
Now, let's verify the Maven version:
[opc@webserver ~]$ mvn -version
Apache Maven 3.6.3 (Red Hat 3.6.3-22)
Maven home: /usr/share/maven
Java version: 17.0.17, vendor: Red Hat, Inc., runtime: /usr/lib/jvm/java-17-openjdk-17.0.17.0.10-1.0.1.el9.aarch64
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "6.12.0-105.51.5.el9uek.aarch64", arch: "aarch64", family: "unix"
This is not the one we need; we have to install the right one:
[opc@webserver ~]$ sudo dnf install maven-openjdk21
....
[opc@webserver ~]$ mvn -version
Apache Maven 3.6.3 (Red Hat 3.6.3-22)
Maven home: /usr/share/maven
Java version: 21.0.9, vendor: Red Hat, Inc., runtime: /usr/lib/jvm/java-21-openjdk-21.0.9.0.10-1.0.1.el9.aarch64
Default locale: en_US, platform encoding: UTF-8
OS name: "linux", version: "6.12.0-105.51.5.el9uek.aarch64", arch: "aarch64", family: "unix"
Perfect, this is what we need; we can proceed to the next step.
OCI Config
To access the GenAI Services on OCI, we need to provide some information to enable access. The easiest way is to generate an OCI config file.
You need to use the OCI Console and either use an existing API Key or create a new one:


Now, we need to copy the private key to our compute instance (using ssh, see part 2). On the compute instance, we create the folder ~/.oci and the file config, in which we paste the configuration:
[laptop]$ scp -i key.pem <api_private_key.pem> opc@<public_ip>:
[laptop]$ ssh -i key.pem opc@<public_ip>
[opc@webserver ~]$ mkdir ~/.oci
[opc@webserver ~]$ vi ~/.oci/config
[opc@webserver ~]$ chmod 600 ~/.oci/config
Don't forget to update the key_file entry in the ~/.oci/config file to match the path of the API private key, such as /home/opc/my_priv.pem.
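The configuration pasted from the console's Add API Key dialog follows the standard OCI config layout; it looks like this, with placeholder values you have to replace with your own:
[DEFAULT]
user=ocid1.user.oc1..aaaaaaaa...
fingerprint=xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx
tenancy=ocid1.tenancy.oc1..aaaaaaaa...
region=eu-frankfurt-1
key_file=/home/opc/my_priv.pem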
Sakila Database
We will use the Sakila database in our application. This is how to install it:
[opc@webserver ~]$ wget https://downloads.mysql.com/docs/sakila-db.tar.gz
[opc@webserver ~]$ tar zxvf sakila-db.tar.gz
[opc@webserver ~]$ mysqlsh admin@<DB_system_IP>
SQL > source 'sakila-db/sakila-schema.sql'
SQL > source 'sakila-db/sakila-data.sql'
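As an optional quick check, we can confirm that the schema and data were loaded before moving on:
SQL > SHOW TABLES FROM sakila;
SQL > SELECT COUNT(*) FROM sakila.film;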
Installing the code
Finally, we will deploy the code to our compute instance. We first need to install git:
[opc@webserver ~]$ sudo dnf install git -y
And we fetch the code:
[opc@webserver ~]$ git clone https://github.com/lefred/helidon-mrs-ai.git
The Database
We now need to add the Sakila schema to the REST Service (see part 8) and create a new table that will store our users' credentials:
SQL > use sakila;
SQL > CREATE TABLE `users` (
        `user_id` int unsigned NOT NULL AUTO_INCREMENT,
        `firstname` varchar(30) DEFAULT NULL,
        `lastname` varchar(30) DEFAULT NULL,
        `username` varchar(30) DEFAULT NULL,
        `email` varchar(50) DEFAULT NULL,
        `password_hash` varchar(60) DEFAULT NULL,
        `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP
                     ON UPDATE CURRENT_TIMESTAMP,
        PRIMARY KEY (`user_id`)
      ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_0900_ai_ci;
We create a new service, and we add the new users table and the actor table to it:





And for the actor table, we also allow DML (INSERT, UPDATE, DELETE), and we select a few extra links to other tables, such as the films, the language, and the category:


And then, everything is ready for the REST Service:
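If you prefer scripting over clicking, MySQL Shell also provides MRS DDL statements to create the REST service and expose the schema. Treat the lines below as a sketch only: the exact syntax, in particular for exposing the individual tables and enabling DML on them, depends on your MySQL Shell / MRS version, so check the MRS documentation or simply use the console steps shown above.
SQL > CREATE REST SERVICE /sakilaService;
SQL > CREATE REST SCHEMA /sakila ON SERVICE /sakilaService FROM `sakila`;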

Configuring the app
Now, we will configure the application before building it. In the helidon-mrs-ai/src/main/resources/ directory, there is a template (application.yaml.template) we will use:
[opc@webserver ~]$ cd helidon-mrs-ai/src/main/resources
[opc@webserver resources]$ cp application.yaml.template application.yaml
We will now modify the content of the application.yaml file with the correct information related to our environment.
In the mrs section, the URL must match the MySQL HeatWave DB System's IP and the service path we created. In my case it should be:
mrs:
  url: "https://10.0.1.74/sakilaService"
For the langchain4j entries, pay attention to your region and compartment-id. You can find this information in the OCI Console in the Analytics & AI section; see part 6, in the view code of the chat example.
Don't forget to use a model that is available in your region.
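As an illustration only, a filled-in application.yaml could look like the sketch below. The mrs URL is the one from my environment; the langchain4j key names shown here are placeholders (keep the exact keys provided in application.yaml.template), and the values are examples you must replace with your own region, compartment OCID, and a model available in that region:
mrs:
  url: "https://10.0.1.74/sakilaService"
langchain4j:
  # placeholder keys; use the ones from application.yaml.template
  region: "us-chicago-1"
  compartment-id: "ocid1.compartment.oc1..aaaaaaaa..."
  model-name: "xai.grok-3"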
You may have noticed in the configuration file that port 8080 will be used; you therefore need to allow connections to it from the Internet:




We have allowed access in OCI; now we also need to allow access through the compute instance's local firewall:
[opc@webserver ~]$ sudo firewall-cmd --zone=public --permanent --add-port=8080/tcp
success
[opc@webserver ~]$ sudo firewall-cmd --reload
success
Helidon
Now we can deploy our Helidon application:
[opc@webserver ~]$ cd ~/helidon-mrs-ai/
[opc@webserver helidon-mrs-ai]$ mvn clean install -DskipTests package
Once built, we can run it:
[opc@webserver helidon-mrs-ai]$ java -jar target/helidon-mrs-ai.jar
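The application runs in the foreground and listens on port 8080. If you want it to keep running after you close the SSH session, one simple option (among others, such as a systemd unit) is to start it in the background and redirect its output to a log file:
[opc@webserver helidon-mrs-ai]$ nohup java -jar target/helidon-mrs-ai.jar > helidon.log 2>&1 &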
We can access the public IP of our webserver (the compute instance) in a browser using port 8080:

As you may remember, we created a users table to store our users; we still need to create one user. To know its hashed password, we have to generate it. We have an endpoint that will generate such a hash: _debug/hash?pwd=
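For example, assuming the endpoint is exposed at the root of the application, we can call it locally (the password value below is just an example):
[opc@webserver ~]$ curl "http://localhost:8080/_debug/hash?pwd=MySecretPassword"
The response is a bcrypt-style hash (like the $2a$12$… value used in the next step).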

We use that hash when creating our user:
SQL > use sakila;
SQL > INSERT INTO users (firstname, username, password_hash)
      VALUES ('lefred', 'fred',
      '$2a$12$I2bVhBf8rbU9ODMijPpYn.Pe2ooHWxWAnQUWwcYe9cmMwVQVf8oL6');
We can now access the /ui endpoint, which will redirect us to the login page:

And when successfully logged in, we can access the application:

If we select the AI Summary for an actor, we get a summary generated by the model we configured. The following example was generated using xai.grok-3:



The following summary was generated using google.gemini-2.5-flash:



Conclusion
This final step wraps up the whole "Deploying on OCI with the Starter Kit" journey by putting all the building blocks together: a Helidon (Java) application running on a compute instance, a MySQL HeatWave database exposed through the MySQL REST Service, and OCI GenAI consumed via LangChain4j to add an AI-powered summary feature.
We saw how to deploy and start writing an application on OCI and benefit from the power of the available LLM models.
I hope you enjoyed the journey and that you are ready to start writing your own app… happy coding!
