The future of low code application development platforms

According to Gartner, by 2023, over 50% of medium and large enterprises will have adopted low code application development platforms (LCADP) as one of their application development strategies.

Non-tech companies are currently becoming digitized at a fascinating rate, hence the need to develop applications faster to fix inefficiencies and keep ahead of the competition. A lot of businesses have transitioned to the cloud, e.g. e-commerce, restaurants, online reservations, etc.

Out of curiosity… 🙂

How often do you use productivity apps such as spreadsheets to share data in your organization using emails?

How do you share the data in the spreadsheets with the rest of the team while ensuring its security? What if someone drops the ball and all the sensitive data is exposed?

How about collaboration, and the authorization measures set to ensure the right audience can access the right data?

Well, if these and many other scenarios around data resonate with you, then you need to think about a low code application platform.

A low code application development platform allows users to rapidly create customized, enterprise-grade applications using high-level programming abstractions. Basically, there is little to no coding at all. The platform enables both professional and citizen developers to rapidly develop enterprise applications using tools as simple as spreadsheets.

Some of the major use cases of low code application development include:

In Lines of Business

Low code application platforms allow you to rapidly create interdepartmental apps. Business users with no coding skills can take advantage of low code platforms to innovate in their departments. The HR department can create applications using employee data; Sales & Marketing can create event or sales applications, or integrate with other enterprise apps for efficiency. The finance department can create applications that consolidate reports and visualize them! There are so many application use cases and innovations around low code application development platforms.

Upgrading Legacy Applications


An LCADP can enable you to build applications faster and migrate from legacy software that may be limiting you in terms of functionality. With the agility that low code application platforms bring, you can innovate faster and cut application development costs.

Extending the Enterprise applications

Low code application development platforms are a great fit for extending enterprise applications. Functionality or customizations not met by your enterprise application can be delivered using an LCADP. These platforms integrate very well with existing enterprise investments.

Things to consider while choosing a low code application development platform:

Security – how secure are the applications you build? Consider the ability to support authentication and authorization, even in a multi cloud environment.

Ease of Use – can citizen developers get started with ease?

Integrations – can you integrate with standard REST/SOAP services or extend your enterprise applications?

Collaboration – how easy is it to work as a team? Can you track changes?

Oracle knows the value of low code application development and has invested in low code solutions for its customers.

Oracle Application Express (APEX) allows developers to build data-centric web applications with superior functionality, performance, and end-user experience, with minimal coding.

Oracle Visual Builder enables you to build web, mobile, and progressive web applications using a visual, browser-based development environment.

Other cloud providers have also invested in low code application development platforms, a clear indication that the future of low code application platforms is massive.

In the era of cloud innovation, where the market changes fast, companies need to be agile and innovate faster to stay ahead of their competition. Low code platforms are a game changer due to the speed at which one can create an application, the limitless functionality one can have, and the integration capabilities they provide.

The future of low code application development platforms is huge! Companies are choosing low code to cut application development costs, reduce the time it takes to create new applications, and simply to innovate faster with little or no programming skill.

Do you want to get started?

Check out Oracle's low code application platforms.


Extending APEX application with Oracle Digital Assistant

We can extend APEX with a conversational UI that has Natural Language Processing (NLP) capabilities. To achieve this, I have used Oracle Digital Assistant.


In one of my previous posts I showed you how to connect the Digital Assistant to an Oracle ATP database using Node libraries and the Instant Client.

In this post, we shall take advantage of Oracle REST Data Services (ORDS), exposing the data from the database using REST APIs.

We shall use APEX to build a low code application on an Oracle Autonomous Database.

We then create an Oracle Digital Assistant skill that picks the user intent and passes it as parameters to a Node.js function, within a custom component that queries the Oracle database.


This is a hypothetical use case, just to unlock new possibilities with APEX and Oracle Digital Assistant at ACME Recruitment.

ACME Recruitment uses an APEX application to manage all open opportunities and applications in different regions within the different LOBs.

Below is a minimum viable product (MVP) of the APEX application. You can access the live demo here:

Live demo

Username: Recruiter

Password: Innovate@2019!

You can also import the APEX app on your environment.
Download the APEX import file here.

ACME Dashboard
ACME Job Listings

We have extended this application using Oracle Digital Assistant (ODA). This enhances the experience of users searching for open opportunities. Users can search for roles in their normal natural language, and ODA handles the rest using Natural Language Processing (NLP).

ACME Recruitment can take advantage of ODA and extend their APEX application with any conversational channel of their choice, e.g. Web, Facebook, Google Voice, Alexa, etc.

I have used a web channel in this demo.

Download the pre-built skill here, feel free to enhance it – it has the custom component embedded in it!

Searching for a job using Oracle Digital Assistant

The high-level architecture:

APEX+ODA Architecture

Oracle Autonomous Database REST API (ORDS)

You will have to expose the data in your APEX application using ORDS so that it can be consumed by the Digital Assistant custom component via REST APIs.

I have written an article on how to create ATP RESTful services. Check it out!

Oracle Digital Assistant custom component

Here your knowledge of creating ODA Custom components will be highly useful!

On the skill, create an ODA "composite bag" entity that resolves the location and job type from the user's intent.

In the YAML dialog flow, create a System.ResolveEntities component to pick up the user's intention. You then pass the two variables, jobtype and location, to a custom component.

  jobsAvailable:     # Ideally should resolve the customer intent
    component: "System.ResolveEntities"
    properties:
      nlpResultVariable: "iResult"
      variable: "vacancy"
      maxPrompts: 2
      cancelPolicy: "immediate"
    transitions:
      actions:
        cancel: "maxError"
      next: "listVacancies"
  listVacancies:
    component: "vacancies"
    properties:
      jobType: "${vacancy.value.BagItemjobType}"
      jobLoc: "${vacancy.value.BagItemlocation}"
    transitions:
      actions:
        jobAvailablelist: "listVacancies2"
        NotAvailable: "resetVariables"

In the custom component, you pick the two variables, jobType and jobLoc, and run a Node request against the ATP database REST API you created earlier.

There would be a better way to write the code below, this is just an example 🙂

'use strict';
var request = require('request');

// Query jobs in a given category and country via the ORDS REST API
function query(jobt, jobl, callback) {
  request({
    "uri": "https://..../ords/labanish/recruitement/vacancies/" + jobt + "/" + jobl,
    "method": "GET"
  }, function (err, res1, body) {
    body = JSON.parse(body);
    callback(body, err);
  });
}

module.exports = {
  metadata: () => ({
    name: 'vacancies',
    properties: {
      jobType: { required: true, type: 'string' },
      jobLoc : { required: true, type: 'string' }
    },
    supportedActions: ['jobAvailablelist', 'NotAvailable']
  }),
  invoke: (conversation, done) => {
    const { jobType, jobLoc } =;
    // determine jobs
    var jobLocUpper = jobLoc.toUpperCase();
    console.log(jobType + " " + jobLocUpper);
    query(jobType, jobLocUpper, function (res1, err) {
      var lst = "";
      if (res1 && res1.items) {
        if (res1.items.length) {
          for (var i = 0; i < res1.items.length; i++) {
            if (i > 0) lst += ", ";
            lst += res1.items[i].vacancyid + ": " + res1.items[i].title;
          }
          conversation.variable("jobTypes", lst);
          conversation.transition("jobAvailablelist");
        } else {
          console.log("No items on the selected category");
          conversation.reply("Unfortunately we do not have any vacancies related to " + jobType.toLowerCase() + " in " + jobLoc);
          conversation.transition("NotAvailable");
        }
      } else {
        console.log("Encountered an error... please try again later or contact the demo administrator");
        conversation.reply("Encountered an error... please try again later or contact the demo administrator");
      }
      done();
    });
  }
};

Once done, you package your custom component for local deployment.

Quick Steps:

The ODA skill I shared with you is linked to my APEX application. You can import the skill and retrain it to test it.

Also visit my APEX application to validate the results. (You can manipulate data in the APEX app; check the authentication credentials above.)

In ODA, run utterances such as:


I am looking for a job in Marketing…

Any IT Jobs in Africa?

What roles do you have in Europe related to technology …

Any marketing roles in ASIA?

Watch the Demo Video:


APEX enables a business to create low code applications in less time. These applications are secure and can be extended to fit the business requirements.

Oracle Digital Assistant offers Natural Language Processing capabilities that allow users to query in their normal natural language as they express their intention. This capability comes in handy when extending enterprise applications with a conversational UI.

I advise you to get an Oracle Cloud Free Tier account to get started with Oracle Cloud.

I welcome feedback on the above post.

Happy coding!

Enable RESTful Services on your Oracle Autonomous Database

You can develop and deploy RESTful Services using the native Oracle REST Data Services (ORDS) support on an Autonomous Transaction Processing database.

This is a great use case in application development: accessing the Oracle Database without using native libraries or an Instant Client.

ORDS is a Java mid-tier application that maps HTTP(S) verbs to Oracle Database transactions using SQL commands, as follows: GET maps to SELECT, POST to INSERT, PUT to UPDATE, and DELETE to DELETE.


You can use SQL Developer or APEX to enable REST on your ATP database. I will use Oracle Application Express (APEX) in this post.


Navigate to your Oracle Autonomous Database in Oracle Cloud and select the service console.

On the dashboard, click on Development, then select Oracle APEX.

If it’s your first time using APEX, you will have to create a new workspace and username. Then log out of your Oracle ADMIN profile and log in to APEX using the newly created credentials.

The APEX workspace has an example RESTful Service module. You can install the sample service while registering the schema with ORDS to learn more about the functionality.

Once logged in, click on SQL Workshop and select RESTful Services. Note that you will have to REST-enable your schema. (You can also configure schema attributes such as whether authorization is required for metadata access; for the purposes of this demo, I selected No.)

In my ATP database I have a table called “vacancies” whose data I want to expose using a RESTful service. Find the table DDL script here.


To expose the data using a RESTful service:

Create a new module.

Click on Modules and select Create Module. My module is called “tutorial”.

Now that we have the module created, we need to create a template under the module.

On the newly created module page, locate the “Create Template” button and create a new template. Give it a name, e.g. “vacanciesInfo”.

Now that we have a module and a template, let’s create a handler to define the REST methods, i.e. GET, POST, etc.

Locate the “Create Handler” button on the template and create a new handler. Define the handler as shown below.

RESTful Data Services -> Module -> Template -> Handler

Save or apply changes.

Once you have applied the changes, copy the full URL generated and test your RESTful service in a browser. You should get a JSON response with the data from the table in the Autonomous Database!

That completes the tutorial.

5th May 2020 Update:

What happens if you want to insert into a table in the Oracle Database using REST? You use the POST method.

You will create a new handler for the POST method, then use a PL/SQL insert block for your table, e.g.:


begin
  insert into CASES (countyname, code, casesno, recoveryno, deathsno)
  values (:countyname, :code, :casesno, :recoveryno, :deathsno);
end;
The ORDS handler page in APEX

Test the request in Postman:

The record is inserted: status 201.

To explore more on ORDS check out the documentation.

Also check:

Happy coding!

A multi cloud strategy in cloud adoption

Cloud computing is a convenient computing model where users can access a shared pool of resources from anywhere and pay only for what they consume.

It has been largely adopted by the majority of startups, SMEs (Small and Medium-sized Enterprises), and large enterprises. Among the reasons is that it lets businesses focus on their core business while the cloud provider offers scalable, reliable, on-demand infrastructure, platform, or software services.

A multi cloud strategy is the newly adopted trend where businesses involve more than one cloud vendor to support their business. This should not be confused with hybrid computing.

Hybrid computing refers to a business running some of its workloads on premises and others on a public or private cloud, with a secured connection between the two. Multi cloud, on the other hand, refers to a business involving different cloud providers to run its workloads.

Some of the benefits of a Multi Cloud Strategy include:

  • Avoiding vendor lock-in – Some enterprises are afraid of being locked into a single vendor; with a multi cloud strategy they are not “dependent” on a single cloud provider.
  • Redundancy and performance – Every enterprise wants continual uptime and performance to beat the competition, which pushes them toward a multi cloud strategy.
  • Data sovereignty/compliance – Due to different regulations, some enterprises are required to adopt certain cloud vendors within their regions, or to comply with stipulated regulations regarding data.

However, a multi cloud strategy comes at a cost: you require a large talent pool to handle the various technologies from the various vendors, which can also lead to management complexity.

Some cloud vendors have already adopted the multi cloud model.

Oracle Cloud and Microsoft Azure have an interoperability partnership allowing customers to run mission-critical enterprise workloads across both clouds.

Google embraces the multi cloud strategy with Anthos, an open application modernization platform.

As an enterprise, you need to evaluate your current status on the digital transformation journey: where you are today and where you aim to be. Then choose wisely a cloud strategy that offers you a high ROI.


Today, the greatest concerns for many enterprise IT infrastructures are manageability, reliability, availability, performance, security, and scalability, just to list a few.

If a multi cloud strategy solves the above challenges for you, then it’s time to take the leap.


Sources: TechTarget, ZDNet, Oracle News, Google News

Bash commands you can never live without!

This is more of a generic post: some of the Bash/Linux/Windows commands I find useful every day!

Source Control: Git

When you want to push your local project to GitHub, create a repository on GitHub (don’t add a README file yet).

On your local workstation, fire up the Bash utility in the root folder of your project on your Windows file system.

On the Git Bash, run:

git init
git remote add origin https:// (link to your online repository)
git remote -v (Just to confirm your remote repo)
echo 'node_modules'>>.gitignore (Files you want to ignore to pushing to GitHub)
git add .
git commit -m "First commit"
git push origin master

At times you may have commit issues due to file conflicts:

Open the conflicting file, edit it, and merge the conflicting sections accordingly. Then:

git add filename
git commit -m "merged file A and B"
git push origin master

Other commands:

Ngrok (helps you expose your localhost to the public internet):

ngrok http <port>  (Windows, where port is your port number)

./ngrok http <port> (Linux, where port is your port number)

In Oracle Compute, you might want to use the “screen” command to run ngrok in detached mode:

screen -S <name of the new screen>  (opens a new screen)

Run ngrok in this screen. When done, press Ctrl+A then D to detach.

To confirm the screen got created:

screen -ls  (shows you all the screens and their ids named "Detached")

To re-access the screen:

screen -r <screen_ID>

To quit:

screen -X -S <screen_id> quit

What other commands do you use daily but keep forgetting?

Happy coding! Keep Innovating!




Connect Oracle Digital Assistant to Oracle ATP Database

In my previous post, I showed you how to connect to an Oracle ATP database using Node.js.

We shall take advantage of this capability and build a custom  component to connect a digital assistant skill to the ATP database.


Use Case

I created a simple event-booking skill which runs queries on an ATP database.

If you managed to connect to the ATP database using Node.js, then you are up to this task too!

My pseudo persona, Jamie Foxx, will use the Oracle Digital Assistant to book seats at events near him. This will update the Bookings column of the EVENTS table in ATP. It will also add a new record to the RESERVATIONS table.

On the bot:


On the ATP database, an update is made to the Bookings column of the EVENTS table (viewed using SQL Developer):


A new record is added on the RESERVATIONS table:



You cannot book more than the available seats.

E.g. Jamie Foxx has booked two seats at the Jazz Festival, so 198 seats remain now; let’s try to book 201 seats for the same event 🙂


How was this done?

Interested? Let’s walk through the simple architecture: 🙂


Create an Events skill bot in ODA

My skill has 3 intents: Bookings, Greetings, SearchEvents.

I have implemented a ResolveEntities system component to extract the event name and number-of-seats entities from the user input, using a composite bag.

metadata:
  platformVersion: "1.0"
main: true
name: LN_Events
context:
  variables:
    # Define the variables which will be used throughout the dialog flow here.
    iResult: "nlpresult"
    variableMakeReservations: "MakeReservations" # Composite bag
    name: "string"
    eventid: "string"
    seatstoReserve: "string"
    paid: "boolean"
    bookings: "string"
    eventname: "string"
    remainingSeats: "string"
    events: "Events"

states:

  ############## Intents #######################

  intent:
    component: "System.Intent"
    properties:
      variable: "iResult"
    transitions:
      actions:
        Greetings: "startGreeting"
        Bookings: "startBookings"
        SearchEvents: "searchEvents"
        unresolvedIntent: "unresolvedState"

  startGreeting:
    component: "System.Output"
    properties:
      text: "Hi Jamie Foxx, I am your online Events Booking Bot, i can help you reserve seats on the available events, also trust me, I can facilitate your payments!"
    transitions:
      return: "startGreeting"

  startBookings:
    component: "System.List"
    properties:
      prompt: "Hi Jamie Foxx, here are the events near your location. Which one would you wish to book?"
      options: "${events.type.enumValues}"
      variable: "events"
    transitions:
      next: "setEventName"

  setEventName:
    component: "System.SetVariable"
    properties:
      variable: "variableMakeReservations.BagEvent"
      value: "${events.value}"
    transitions:
      next: "setName"

  setName:
    component: "System.SetVariable"
    properties:
      variable: "name"
      value: "Jamie Foxx"
    transitions:
      next: "resolveEntities"

  ############# Resolve entities ############################

  resolveEntities:
    component: "System.ResolveEntities"
    properties:
      #transitionAfterMatch: true
      variable: "variableMakeReservations"
      nlpResultVariable: "iResult"
      maxPrompts: 2
      cancelPolicy: "immediate"
    transitions:
      actions:
        #match: ""
        cancel: "maxError"
      next: "submitBooking"

  submitBooking: # This is the custom component
    component: "makeReservation"
    properties:
      name: "${name}"
      eventname: "${variableMakeReservations.value.BagEvent}"
      seatstoReserve: "${variableMakeReservations.value.BagNumber.number}"
    transitions:
      next: "endsubmitBooking"

  endsubmitBooking:
    component: "System.ResetVariables"
    properties:
      variableList: "iResult,variableMakeReservations,name,eventid,seatstoReserve,eventname,remainingSeats,Events"
    transitions:
      return: "done"

  output: # For debugging purposes
    component: "System.Output"
    properties:
      text: "You have reserved ${variableMakeReservations.value.BagNumber.number} seats at ${variableMakeReservations.value.BagEvent}"
    transitions:
      #next: "savetoATP"
      next: "endsubmitBooking"

  maxError:
    component: "System.Output"
    properties:
      text: "Sorry the input is invalid"
    transitions:
      return: "maxError"

  unresolvedState:
    component: "System.Output"
    properties:
      text: "Sorry, i didn't quite get that, could you try again?"
      keepTurn: false
    transitions:
      return: "unresolvedState"

You can download the skill here.

Create a Custom Component

From the YAML file above, we can see that we have a custom component called “makeReservation”.

I scripted an ODA custom component, running on a Node.js server, to get the inputs from the skill and perform CRUD operations on the ATP database.

Here is my custom component, named makeReservation.js. I have tried to comment my code to make it easier to understand 🙂

The inputs to the custom component from the Events skill include name, eventname, and seatstoReserve.

After the logic runs, the response is sent back to the bot using sdk.reply(message);

require('custom-env').env('stagging'); //find the .env.stagging file and place the right location of your wallet

var async = require('async');
var oracledb = require('oracledb');
var dbConfig = require('./../dbconfig.js');

let name;
let eventid;
let seatstoReserve;
let paid = 'No';
let bookings;
let eventname;
let remainingSeats;
let message;

module.exports = {

  metadata: function metadata() {
    return {
      "name": "makeReservation",
      "channels": {
        "facebook": "1.0",
        "webhook": "1.0"
      },
      "properties": {
        "name": { "type": "string", "required": true },
        "eventname": { "type": "string", "required": true },
        "seatstoReserve": { "type": "string", "required": true }
      },
      "supportedActions": []
    };
  },

  invoke: function invoke(sdk, done) {
    //console.log('Check User Payload: ' + JSON.stringify(sdk.payload()));
    name =;
    eventname =;
    seatstoReserve = parseInt(;
    console.log("DATA: " + eventname + ": " + seatstoReserve);

    //// Start functions
    var doconnect = function (cb) {
        user: dbConfig.dbuser,
        password: dbConfig.dbpassword,
        connectString: dbConfig.connectString
      }, cb);

    var dorelease = function (conn) {
      conn.close(function (err) {
        if (err) console.error(err.message);

    var doCheckAvailability = function (conn, cb) {
      console.log(`check availability at ${eventname}`);
      // Note: the SELECT below is reconstructed from the row indexes used underneath;
      // adjust the column names to match your EVENTS table.
        "select eventid, bookings, seats - bookings from EVENTS where eventname = :eventname",
        function (err, result) {
          if (err) { console.error(err); return cb(err, conn); }
          eventid = JSON.stringify(result.rows[0][0]);
          remainingSeats = JSON.stringify(result.rows[0][2]);
          bookings = JSON.stringify(result.rows[0][1]);
          console.log('Event ID: ' + eventid);
          console.log('Event Name: ' + eventname);
          console.log('Bookings: ' + bookings);
          console.log('Remaining Seats: ' + remainingSeats);
          return cb(null, conn);

    var doinsert = function (conn, cb) {
      //before you add a reservation, check if there are seats available
      if (parseInt(seatstoReserve) > parseInt(remainingSeats)) {
        message = `Sorry, you can't book ${seatstoReserve} seats at ${eventname}. Unfortunately we do not have enough seats available... the available seats remaining are ${remainingSeats}`;
        // stop the waterfall so no reservation is recorded
        return cb(new Error('not enough seats'), conn);
      var data = [name, eventid, paid, seatstoReserve];
      // Note: the INSERT below is reconstructed from the bind array above;
      // adjust the column names to match your RESERVATIONS table.
        "insert into RESERVATIONS (name, eventid, paid, seats) values (:1, :2, :3, :4)",
        { autoCommit: true },
        function (err) {
          if (err) {
            return cb(err, conn);
          } else {
            //console.log("Data Inserted");
            message = "Reservation done successfully for: " + eventname + ": " + seatstoReserve + " seat(s)";
            return cb(null, conn);

    var doUpdateBooking = function (conn, cb) {
      bookings = parseInt(bookings) + parseInt(seatstoReserve);
      var Updatedata = [bookings, eventid];
      // Note: the UPDATE below is reconstructed from the bind array above.
        "update EVENTS set bookings = :1 where eventid = :2",
        { autoCommit: true },
        function (err) {
          if (err) {
            return cb(err, conn);
          } else {
            console.log("Data Updated!");
            return cb(null, conn);

    //// Run the logic here using waterfall
      doconnect,           // Does the connection - very key!
      doCheckAvailability, // Check if seats are available
      doinsert,            // Add reservation
      doUpdateBooking      // Update the bookings in the EVENTS table
    ], function (err, conn) {
      if (err) { console.error("In waterfall error cb: ==>", err, "<=="); }
      if (conn) dorelease(conn);
      //reply to the bot!
      sdk.reply(message);
      done();
    });
  }
};








From the code above, you need to define dbconfig.js, where you store your connection details (just like in my previous post), and then define environment variables in a .env file.

The dbconfig.js file:

module.exports = {
  dbuser: 'admin',
  dbpassword: 'Your Password',
  connectString: 'YourDatabaseName_TP'
};
My .env.stagging file indicates where you have stored your ATP connection wallet.


In my main file, event_api.js, you spin up the Node server using the express() Node library:

require('custom-env').env('stagging'); //either stagging or production

const apiURL = '/mobile/atp/components';

// Reference component shell
var shell = require('./shell')();
const express = require('express');
const bodyParser = require('body-parser');
const service = express();
const request = require('request');

service.set('port', (process.env.PORT || 5002));

// Process application/x-www-form-urlencoded
service.use(bodyParser.urlencoded({ extended: false }));

// Process application/json
service.use(bodyParser.json());

 * Mobile Cloud custom code service entry point.
 * @param {external:ExpressApplicationObject} service

 * Retrieves metadata for components implemented by this service.
service.get(apiURL, function (req, res) {
  res.set('Content-Type', 'application/json');
  // the shell exposes the metadata of all registered components
  res.send(shell.getAllComponentMetadata());

 * Invoke the named component.
 */ + '/:componentName', function (req, res) {
  const sdkMixin = { oracleMobile: req.oracleMobile };
  shell.invokeComponentByName(req.params.componentName, req.body, sdkMixin, function (err, data) {
    if (!err) {
      res.status(200).json(data);
    } else {
      // error handling reconstructed: map shell errors to HTTP statuses
      switch ( {
        case 'badRequest':
          res.status(400).send(err.message);
        case 'unknownComponent':
          res.status(404).send(err.message);
          res.status(500).send(err.message);

// Spin up the server
service.listen(service.get('port'), function () {
  console.log('running on port', service.get('port'));


The package.json:

  "name": "event_api",
  "version": "1.0.0",
  "description": "API for Event bot connecting to ATP",
  "main": "event_api.js",
  "oracleMobile": {
    "dependencies": {
      "apis": { },
      "connectors": { }
  "configuration": {
    "node": "6.10"


If you have packaged the custom component with all the other JavaScript files required, i.e. shell.js, sdk.js, registry.js, MessageModel.js, you can spin up the server with:

>node event_api.js

Clone the Component on GitHub

However, if you want to keep your server live, you can run event_api.js using open-source tools like pm2 on Oracle Compute cloud.

You then connect your skill to the custom component service running on the Node server, via the skill's Components tab.


All done now! Run the code!

In my ATP database table RESERVATIONS, I have 0 entries. Let’s reserve two tickets at the Jazz Festival.



On the events skill bot:


The data is updated in the EVENTS table for the Jazz Festival: 2 seats in the Bookings column.

A record is also added to the RESERVATIONS table.


I played around with validations too; you cannot overbook 🙂


Final Thoughts

node-oracledb can be used to create a middleware layer that presents REST APIs and WebSocket interfaces for the Oracle ATP database. The middleware logic can be hosted on any Node.js server, or even run in a container such as on OKE.



Thank you and Happy coding! If you liked it, share it!

This blog reflects  my own thoughts and doesn’t reflect the thoughts of my employer.


Connect to Oracle ATP database using Node.js

In this week's series, I will explore the capabilities of the Oracle Autonomous Transaction Processing (ATP) database beyond connecting with SQL Developer.

There are several ways to achieve this; however, this is the simplest way to show the capabilities and possibilities that can be attained.


  • Node.js installed in your computer
  • node-oracledb library
  • Oracle Instant Client
  • Oracle ATP database

Installing the Oracle Instant Client in Windows OS

We need the Oracle Instant Client to connect to and query remote Oracle databases from Node.js.

Download and install the Oracle Instant Client.

Unzip the package into a single directory, e.g. C:\oracle\instantclient_18_5

Set the PATH environment variable to include the directory you created.

Download the ATP Database connection Wallet.

Log in to your ATP database console and download your credential wallet. This contains the connection information for your Oracle ATP database.


Extract the wallet files into a folder. Mine are in:


We need to update the sqlnet.ora file in the wallet folder to reflect the location of the wallet.

In the sqlnet.ora file:

WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY="c:\wallets")))



Set the TNS_ADMIN variable:

Since we are using Node.js, we can define the environment variables in a .env configuration file, many thanks to the custom-env Node library.
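As a minimal sketch (TNS_ADMIN is the standard variable name; the path is just a placeholder for wherever you extracted your wallet), the .env.stagging file can be as simple as:

```
# .env.stagging -- point TNS_ADMIN at the folder where you extracted the wallet
TNS_ADMIN=c:\wallets
```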


Install these Node.js libraries to help you run the connection. (Each of the libraries below has its specific function, as we shall see in a later post.)

npm install oracledb
npm install async
npm install app
npm install express

You can create a simple testconnection.js file to confirm that the connection to the ATP database is working. This file requires a .env configuration file and the dbconfig.js file.

require('custom-env').env('stagging'); //find the .env.stagging file and place the right location of your wallet

var oracledb = require('oracledb');
var dbConfig = require('./dbconfig.js');

let error;
let user;

// open a connection using the details in dbconfig.js
    user: dbConfig.dbuser,
    password: dbConfig.dbpassword,
    connectString: dbConfig.connectString
  function (err, connection) {
    if (err) {
      error = err;
      console.error(err.message);
      return;
    connection.execute('select user from dual', [], function (err, result) {
      if (err) {
        error = err;
        console.error(err.message);
      } else {
        user = result.rows[0][0];
        console.log('Connection test succeeded. You connected to ATP as ' + user + '!');
        error = null;
      connection.close(function (err) {
        if (err) console.error(err.message);


The dbconfig.js file:

module.exports = {
  dbuser: 'admin',
  dbpassword: 'Your Password',
  connectString: 'YourDatabaseName_TP'
};

The .env configuration file helps you load the Node.js app's environment variable configurations in different environments, e.g. in my .env.stagging file I have:


Run the testconnection.js file:

node testconnection.js 


We succeeded in connecting to the ATP database using Node.js.

Installing the Oracle Instant Client in Oracle Linux

Download a “Basic” or “Basic Light” zip file matching your architecture.

Unzip the package into a folder that is accessible to your application:

mkdir -p /opt/oracle
cd /opt/oracle

Install the libaio package as root.

sudo yum install libaio

If there is no other Oracle software on the machine that will be impacted, permanently add the Instant Client to the runtime link path:

sudo sh -c "echo /opt/oracle/instantclient_18_3 > /etc/ld.so.conf.d/oracle-instantclient.conf"
sudo ldconfig

Otherwise, set the LD_LIBRARY_PATH environment variable to the directory of the Instant Client:

export LD_LIBRARY_PATH=/opt/oracle/instantclient_18_3:$LD_LIBRARY_PATH

Co-locate the ATP connection wallet with the Instant Client; create a network/admin subdirectory if it does not exist:

mkdir -p /opt/oracle/instantclient_18_3/network/admin

Edit the sqlnet.ora file to reflect the directory "?/network/admin".
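As a sketch of that edit, the sqlnet.ora in the wallet would then typically contain a line like the following (the "?" placeholder resolves to the Oracle home, i.e. the Instant Client directory):

```
WALLET_LOCATION = (SOURCE = (METHOD = file) (METHOD_DATA = (DIRECTORY = "?/network/admin")))
SSL_SERVER_DN_MATCH = yes
```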


Load your Node.js files as shown previously. Test to see if your connection worked!


In the next post we shall explore how we can use this ability to connect an Oracle Digital Assistant to the ATP database using a custom component.



Thank you and Happy coding! If you liked it, share it!

This blog reflects my own thoughts and doesn’t reflect the thoughts of my employer.




Running a Web Server on Oracle Compute Instance

Every website sits on a computer which basically acts as a web server. A web server processes requests via HTTP(S) and other related protocols.

There are several web servers out there, but I shall focus on installing the Apache HTTP server on the Oracle Linux and Ubuntu operating systems running on Oracle compute instances.


Prerequisites:

  • Oracle Cloud Infrastructure compute instance
  • SSH client, e.g. PuTTY

1. Installing Apache HTTP server on Oracle Linux

Install Apache HTTP server:
sudo yum install httpd -y

Start the Apache server and configure it to start at system reboot:

sudo apachectl start
sudo systemctl enable httpd

To check that the Apache configuration syntax is correct, run:

sudo apachectl configtest

Now create a firewall rule to allow traffic on the ports that HTTP listens on.

sudo firewall-cmd --permanent --zone=public --add-service=http

sudo firewall-cmd --reload

Find the web root directory for your web server and add your web files. (Found in the /var/www/html folder.)

One more thing!

Open port 80 in the security lists of your compute instance. 

Navigate to your compute instance and select your VCN.

Click on security lists on the left bar under resources and select the default security list for your VCN.

Add an ingress rule for port 80 with the following values:

Source Type: CIDR
Source CIDR:
IP Protocol: TCP
Source Port Range: All
Destination Port Range: 80

Click on Save Security List Rules at the bottom.


Almost done now…

Create an Egress rule to allow traffic for all ports.


All done!

To test, open your favorite web browser and navigate to the public IP address of the Linux VM.



I have configured DNS to resolve to the public IP address of my Oracle compute instance. I will show you how to do that in the next post 🙂

It works!

2. Installing Apache HTTP server on Ubuntu

The whole process of configuring the compute instance security lists is the same as above, only the commands for installing Apache on Ubuntu change a little bit.

Log in to your compute instance running on Ubuntu using SSH.

Search for the Apache package:

apt-cache search apache

Install the apache2 package with root privileges:

sudo apt-get install apache2

Where is the Apache HTTP server installed?

To find out, run the find command:

sudo find / -name apache2


Navigate to your web root folder and add your web files. (Found at /var/www/html)

Test your web server using your favorite browser.


Final Thoughts:

The Apache HTTP server is open-source, cross-platform web server software which can also act as an application server. There are several other web servers out there, e.g. Nginx, IIS, etc.


Thank you and Happy coding! If you liked it, share it!

This blog reflects my own thoughts and doesn’t reflect the thoughts of my employer.

Run Node.js Applications on Oracle Cloud Infrastructure using PM2


What is PM2?

PM2 is a production process manager for Node.js applications with a built-in load balancer. It allows you to keep your applications alive forever and reloads them with zero downtime. It’s simple to use and makes managing a production environment seamless.

Starting a Node app in PM2 is as easy as:

$ pm2 start app.js

Installing PM2 on Oracle compute:

First, you need to take care of your firewall and open the necessary ports.

Log in to your compute instance on OCI over SSH and open the ports you want to use with the firewalld command.

First, install firewalld (if not yet installed):

 sudo yum install firewalld

Next, expose the port you want to use, e.g. 5001, to the public to allow inbound web traffic over HTTP (adding it to the public zone):

sudo firewall-cmd --zone=public --add-port=5001/tcp --permanent

Reload firewalld; you can also check the features enabled on the public zone:

sudo firewall-cmd --reload
sudo firewall-cmd --info-zone=public


Now that we have exposed port 5001 to the public, let’s install PM2. It’s a piece of cake! Run:

npm install pm2 -g

In your Node application, specify the port you exposed.


Run your app using the pm2 start command (my Node app is saved as app.js):

pm2 start app.js


You can now test your Node.js app on your browser using the public IP address and the port you exposed;

<your compute instance public IP address>:<port>


It works! Note that you can run more than one Node.js application using PM2; just expose them on different ports.

Finally, a few other PM2 commands and resources to keep you going:

pm2 ls — Shows a list of all applications
pm2 stop <app> — Stops a specific application
pm2 start <app> — Starts a specific application
pm2 scale <app> N — Scales the specified application to N instances (can be used to scale up or down)
pm2 kill — Kills all running applications
pm2 restart <app> — Restarts a specific application
pm2 reload <app> — Reloads an application with zero downtime
pm2 monit — Returns a rich set of data around your applications’ health
pm2 logs — Outputs logs from all running applications
pm2 logs <app> — Outputs logs from only the specified application
pm2 flush — Flushes all log data, freeing up disk space

Final thoughts:

PM2 is an amazing open-source project. Many thanks to everyone putting all the resources out there!

If you need any support feel free to reach out, or if you have any additional tips and tricks, feel free to share with me!

Here are additional resources and references:


Thank you and Happy coding! If you liked it, share!

This blog reflects my own thoughts and doesn’t reflect the thoughts of my employer.