Wednesday 4 December 2019

OAC Remote Data Gateway Part 2: The RPD



OAC Data Gateway 105.4: Now with added RPD goodness

In one of my previous posts (here) I went through the setup of the Oracle Analytics Cloud Data Gateway utility. I showed how it can be used to connect your OAC instance to your on-premises data sources without needing to open firewall ports.

One missing piece of the puzzle was how to get the OAC RPD semantic layer to connect to on-premises sources using Data Gateway. This is a key piece of functionality that all current OBIEE customers will need if they want to connect their existing dashboards to their on-premises data sources. Thankfully, with the release of OAC 105.4 it is now possible to connect to your on-premises data sources via your RPD using Data Gateway.

Prerequisites:
  • Install the latest version of OAC Data Gateway from here
  • Install the OAC 105.4 Admin Tool on the SAME machine

Okay, that's it for pre-reqs :)

So, my dear friend, how do I get this to work?

Here are the steps to connect your OAC Cloud RPD to your on-premises data sources:
  • Make sure Data Gateway is running. Check here if you need to review the setup
  • Open up obiee_rdc_agent.properties in the Data Gateway folder and note down the port number. (Beware: this number changes automatically each time you save the Data Gateway configuration, so update it as necessary)

  • Launch the OAC Admin Tool
  • Before you open your RPD file go to File -> Load Java Datasources...


  • Enter the following details:
    • Hostname: localhost (Data Gateway is on the same machine as your Admin Tool)
    • Port: The port you found in the obiee_rdc_agent.properties earlier
    • User: Whatever. Literally enter anything you like for the username and password; there is no authentication, but these fields need to be populated
    • Password: whatever


  • You should see the following message:


  • Open your RPD 'In the Cloud'
  • Duplicate the physical source as a backup (just in case)
  • Open the connection pool and select JDBC as the call interface
  • Enter the JDBC string for your local database in the Connection String field (SQL Server in my example: jdbc:oracle:sqlserver://MySQLHostname:1433;DatabaseName=MyDatabase)

  • In the Miscellaneous tab enter the path to the JDBC driver class. In my SQL Server example this is com.oracle.bi.jdbc.sqlserver.SQLServerDriver, but you can find the full list of JDBC and JNDI templates and examples here

  • Save your RPD and publish to the cloud
  • To test the connection try to import the metadata by right-clicking on the Connection Pool:
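As an aside, the port you noted down earlier can be pulled out of the properties file with a quick one-liner (the path below is an assumed extract location under a Unix-style shell such as Git Bash or Cygwin — adjust to yours):

```shell
# Extract the current Data Gateway port from obiee_rdc_agent.properties.
# The rdcURL line looks like: rdcURL=http\://localhost\:8641/datagateway/javads
PROPS="/c/datagateway-Windows-105.4/obiee_rdc_agent.properties"
PORT=$(grep '^rdcURL=' "$PROPS" | sed 's|.*\\:\([0-9]*\)/.*|\1|')
echo "Data Gateway is listening on port $PORT"
```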

And that's it. All being well you should be able to view your existing dashboards in OAC, sourcing the data from your on-premises databases. When we tested, response times were surprisingly fast.

Tuesday 1 October 2019

OCI - Installing EPM Automate on Always Free Oracle Linux

Using Oracle's FREE Cloud VMs to Store your EPM Cloud Snapshots

Unless you have been living under a rock for the past month you will know that Oracle have released their "Oracle Cloud Free Tier". This consists of the following services FOR FREE!
https://www.oracle.com/uk/cloud/free/


I've enjoyed playing with ADW and APEX, an incredible combo which will enable us to build simple applications to enhance our existing EPM Cloud apps.

The two free Oracle Cloud Infrastructure Compute VMs also caught my attention. These could of course be used for a multitude of things, one of which could be a host for EPM Automate and a location to store EPM Cloud snapshots.

Installing EPM Automate on Oracle Linux OCI

There are several blogs out there on how to set up your instance. Here is a good one from APEX guru Dimitri:
https://dgielis.blogspot.com/2019/09/free-oracle-cloud-6-create-vm-instance.html
So, once you've got your VM running you can install EPM Automate.

  • Download the Linux EPM Automate installer. The link can be found in the PBCS user name drop-down:


  • Download the Java JDK for Linux from here for example (the JDK doesn't come bundled with the Linux EPM Automate version)
  • Copy both installers to your Oracle Linux VM using your favourite file transfer tool (WinSCP for instance)

Installing the JDK on Oracle Linux VM

JRE version 1.7 or higher is required.

This is nice and simple. The installer is just a compressed file so you can extract it to your preferred location:
tar -zxvf <filename>
e.g.
tar -zxvf jdk_ri-8u40-b25-linux-x64-10_feb_2015.tar.gz

I extracted mine to /GS_software/java-se-8u40-ri


Installing EPM Automate on Oracle Linux VM

This is again nice and simple. EPM Automate is a compressed file; extract it to your preferred location:

tar -xvf epmautomate.tar

I extracted mine to /GS_software/epmautomate



Setting your JAVA_HOME

You'll need to set the JAVA_HOME in your user profile or as a shell environment variable.
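For example, using the extract location from earlier, you could add the following two lines to your ~/.bash_profile:

```shell
# Point JAVA_HOME at the extracted JDK and put its bin directory on PATH
export JAVA_HOME=/GS_software/java-se-8u40-ri
export PATH=$JAVA_HOME/bin:$PATH
```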


Running EPM Automate

So, we've extracted the JDK and the EPMAUTOMATE archive, we should be good to go right?

WRONG!

I got the following error when trying to login:

EPMAT-7:Unable to connect to https://planning-blahblah.pbcs.em3.oraclecloud.com Root Cause: java.lang.RuntimeException: Unexpected error: java.security.InvalidAlgorithmParameterException: the trustAnchors parameter must be non-empty.
 1. Ensure that you are using the correct URL.
 2. If your organization requires the use of a proxy server to connect to the internet, verify that "proxyHost" and "proxyPort" environment variables are correctly set.
 Ex: setenv proxyHost <proxyhostname>,setenv proxyPort <proxyport>

The problem here is that we don't have the SSL certificate installed into our java keystore.

Importing the SSL Certificate into the Java Keystore

Open a browser and connect to your PBCS URL
https://planning-blahblah.pbcs.em3.oraclecloud.com/HyperionPlanning

Make sure you actually login to PBCS. If you download the certificate on the login page it will be the wrong certificate and you will get a bogus error like this: "EPMAT-7: Unable to connect. Unsupported Protocol: HTTPS".
  • Once you've logged in to EPM Cloud, click on the browser padlock -> Connection -> More Information:


  • Click on 'View Certificate':
  • Click on Export and save the .crt file to your desktop
  • Upload it using WinSCP to your VM e.g. /GS_software/epmautomate/pbcs_em3_oraclecloud_com.crt
We now need to import the certificate into our Java cacerts keystore:
  • Change directory to jre/bin 
cd /GS_software/java-se-8u40-ri/jre/bin
  • Run the keytool command to import the certificate:
./keytool -import -file /GS_software/epmautomate/pbcs_em3_oraclecloud_com.crt -storepass changeit -keystore /GS_software/java-se-8u40-ri/jre/lib/security/cacerts -alias pbcs_em3_oraclecloud_com


Now test EPM Automate again:


It works!

So you have a FREE Oracle VM on which you can run your EPM Automate commands and save your EPM Cloud PBCS snapshots for FREE! Zero installs required in your customer data centre.

Did I mention it was FREE?

If you're not comfortable with Linux or simply haven't got the time then our support team at Brovanture will gladly manage this service for you.

Thanks :)

Wednesday 11 September 2019

OAC - Starting and stopping instances using the REST API (Autonomous/Oracle Managed)

We waited a while to be able to start and stop OAC Oracle Managed (autonomous) instances through the OAC console, but there still isn't a way to schedule the starting and stopping of instances from the console. This is of course very important if you want to keep costs down by automatically shutting down your instances overnight, for example.


Thankfully there are now REST API calls available for Oracle Managed OAC instances which allow you to stop and start your instances. You can schedule these commands from a scripting tool of your choice to automate this process e.g. create a batch script and schedule the start/stop using the Windows scheduler.
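For example, on a Linux host a pair of cron entries could handle the overnight shutdown. The script names, paths and schedule below are hypothetical; the scripts are assumed to be thin wrappers around the curl calls shown in this post:

```
# Hypothetical crontab entries: stop the OAC instance at 20:00 and
# start it again at 07:00 on weekdays.
0 20 * * 1-5 /home/oracle/scripts/stop_oac.sh  >> /home/oracle/logs/oac.log 2>&1
0 7  * * 1-5 /home/oracle/scripts/start_oac.sh >> /home/oracle/logs/oac.log 2>&1
```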

OAC Start Instance REST API Call

The call to start the instance is as follows:

curl -v -i -X POST -u username:password \
  -H "Content-Type:application/vnd.com.oracle.oracloud.provisioning.Service+json" \
  -H "X-ID-TENANT-NAME:idcs-<IDCS-Account-Name>" \
  "https://psm-cacct-<PSM-Account-Name>.console.oraclecloud.com/paas/api/v1.1/instancemgmt/idcs-<IDCS-Account-Name>/services/AUTOANALYTICSINST/instances/<Instance-Name>/start"

<IDCS-Account-Name>: You can get this by logging into your Oracle MyServices homepage and clicking on the Oracle Identity Cloud Service icon:




Your IDCS account name can be found in the URL of the new tab: 
https://idcs-ef1234567fds1234567df123456.identity.oraclecloud.com/ui/v1/adminconsole


<PSM-Account-Name>: You can get this by logging into your Oracle MyServices homepage and clicking on the Oracle Analytics Cloud icon


Your PSM account name can be found in the URL of the new tab:

https://psm-cacct-ef123456gh123456hi456.console.oraclecloud.com/psmui/faces/paasRunner.jspx?serviceType=AUTOANALYTICSINST

<Instance-Name>: You can find your instance names by clicking on the Oracle Analytics Cloud icon:


Notice in the above screenshot we have both a BI instance and an Essbase instance. These commands can be used to start/stop any of your OAC instances.

So in this example the command would be:

curl -v -i -X POST -u username:password \
  -H "Content-Type:application/vnd.com.oracle.oracloud.provisioning.Service+json" \
  -H "X-ID-TENANT-NAME:idcs-ef1234567fds1234567df123456" \
  "https://psm-cacct-ef123456gh123456hi456.console.oraclecloud.com/paas/api/v1.1/instancemgmt/idcs-ef1234567fds1234567df123456/services/AUTOANALYTICSINST/instances/PrdOBI/start"

Once you have run the command you will receive this output if it is successful:


You should also see the message in the console saying that the startup is in progress:


Everything gets logged in the OAC activity log as you would expect:



OAC Stop Instance REST API Call

The call to stop the instance is as follows:

curl -v -i -X POST -u username:password \
  -H "Content-Type:application/vnd.com.oracle.oracloud.provisioning.Service+json" \
  -H "X-ID-TENANT-NAME:idcs-<IDCS-Account-Name>" \
  "https://psm-cacct-<PSM-Account-Name>.console.oraclecloud.com/paas/api/v1.1/instancemgmt/idcs-<IDCS-Account-Name>/services/AUTOANALYTICSINST/instances/<Instance-Name>/stop"
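Since the start and stop calls differ only in the final path segment, they can both be served by one small wrapper script. This is a sketch only: the account identifiers are the placeholder values from the example above, and the script name (oac_ctl.sh) is made up.

```shell
#!/bin/sh
# Usage: ./oac_ctl.sh <instance-name> <start|stop>
# Placeholder identifiers -- replace with your own IDCS/PSM account names.
IDCS="idcs-ef1234567fds1234567df123456"
PSM="ef123456gh123456hi456"

oac_url() {
  # Build the instance-management URL for a given instance and action
  echo "https://psm-cacct-${PSM}.console.oraclecloud.com/paas/api/v1.1/instancemgmt/${IDCS}/services/AUTOANALYTICSINST/instances/$1/$2"
}

if [ $# -eq 2 ]; then
  curl -v -i -X POST -u username:password \
    -H "Content-Type:application/vnd.com.oracle.oracloud.provisioning.Service+json" \
    -H "X-ID-TENANT-NAME:${IDCS}" \
    "$(oac_url "$1" "$2")"
fi
```

You could then call this from your scheduler, e.g. ./oac_ctl.sh PrdOBI stop.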


Spare Oracle Credits!

You can now schedule your instances to stop when they are not being used, saving valuable Oracle Universal Credits which you could use to spin up other Oracle Cloud services or add more OCPUs during heavy usage periods!

Friday 19 July 2019

OAC Remote Data Gateway: Connecting to MS SQL Server

The objective of this blog is to describe what the OAC Remote Data Gateway is, how it is configured and how to use it to connect an OAC instance to an on-premises MS SQL Server database (although the process is the same for any other on-premises data source).



The Oracle Analytics team have released a utility called the ‘Oracle Analytics Cloud Data Gateway’ which enables OAC to connect easily to your on-premises data sources with minimal fuss and WITHOUT HAVING TO OPEN UP FIREWALL PORTS!

No more awkward conversations with your network admin. No more grovelling and bribing them to open up ports in their firewall.

Current supported data sources are:

  • Oracle database
  • SQL Server
  • DB2
  • Teradata

The Oracle A-Team have written a blog here on what it does and how to configure and there is also a useful youtube video here.

What is the Oracle Analytics Remote Data Gateway?

The Data Gateway is a replacement for the Remote Data Connector (RDC), the OAC utility used to connect to your on-premises data sources. The RDC required you to deploy a web app to your own install of WebLogic or Tomcat, which can be a challenge for those not familiar with web app deployments. The RDC is still supported but it is deprecated and will no longer receive any updates. Installing Data Gateway is super-simple. It uses Jetty, a self-contained Java HTTP web and servlet container, so the Data Gateway app is already deployed for you. All Data Gateway requires you to do is extract the zip file on Windows or run the installer on Linux. I went for the Windows option.

Why don’t I need to open up my firewall? What is this Voodoo?

The honest answer is I don’t know! It uses only one-way traffic (client to server) and I’m guessing this is how it works: the Data Gateway client repeatedly polls your OAC instance over HTTPS on a random internal port, which means the connection is initiated from the Data Gateway client to the OAC server (no inbound request from OAC, so no firewall ports need opening). If the OAC server requests data then the polling client picks up the request and sends the data up to OAC. I did a network trace and there are little TLS [Client Hello], [Server Hello] handshakes occurring every second, which keeps the connection alive.

Installing Oracle Analytics Cloud Data Gateway on Windows

The install is nice and simple. Download the OAC Data Gateway 105.3 binaries for Windows from here and just extract the zip file to any location. There isn’t actually any install required. Here is what the folder structure looks like:


Oracle Analytics Cloud Data Gateway Configuration and Techy/Geeky Stuff

I always like to know what is going on in the background so I thought I'd have a sniff around the installation and configuration.

The configuration is simple, I won’t repeat what our friends in the A-Team have published so have a look here for details. Here are the steps:

  • Extract the Datagateway folder from the zip file you downloaded from Oracle tech network
  • Launch datagateway.exe
  • Enter your OAC URL
  • Generate the security key
  • Register a new Data Gateway agent in your cloud OAC instance. Navigate to Console -> Remote Data Connectivity screen
Here is my Data Gateway configuration screen:

URL: Here you enter the URL to your OAC instance.
Allowed Hosts: Here you can add a whitelist of server names/IPs that the Data Gateway is allowed to connect to. If you leave it empty, Data Gateway can connect to any address in your data centre.
Allowed SQL Statements: Enter the types of statement allowed to be executed on the target DB. Currently only select is supported.

This is what my Data Gateway agent looks like when it's registered in the OAC 'Remote Data Connectivity' screen:


Ok, now it's time to hack around the install.

When you click on save in the Data Gateway client window the configuration gets written to a properties file: \datagateway-Windows-105.3\obiee_rdc_agent.properties

Here is what's in the file:

#Wed Jul 17 12:17:47 GMT 2019
rdcURL=http\://localhost\:8641/datagateway/javads
hostWhitelist=
enableLongPoll=1
poolThreadInterval=500
oacURL=https\://dmcbi-dmcgrp.analytics.ocp.oraclecloud.com
proxyURL=proxy-url
numberOfJobPollers=2
agentID=4b81ce5f-2e09-46f1-8451-d250ee999g1c
proxyUserName=
numberOfJobsToFetch=100
proxyHost=
proxyPassword=
statementWhitelist=
numberOfJobExecutors=100
proxyPort=

Pick a port, any port...
Notice the random port number chosen by the configuration is 8641. If you change the config, the port number will change every time you click save. You could pin the port number by editing the .properties file. Also notice that you can add proxy server information in this file.
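For instance, you could pin the port back to 8641 after a save with a quick sed over the file (assuming GNU sed; note the escaped \: that the properties format uses, and adjust the path to your extract location):

```shell
# Rewrite the rdcURL line to force port 8641 (adjust path/port as needed)
PROPS="obiee_rdc_agent.properties"
if [ -f "$PROPS" ]; then
  sed -i 's|^rdcURL=http\\://localhost\\:[0-9]*|rdcURL=http\\://localhost\\:8641|' "$PROPS"
fi
```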

Out of curiosity I tried connecting to the URL using a browser and saw this information:


Clicking on the config.jsp link opens up the config screen but over the web (just like you do on the Linux version):

There is also an interesting rdcagent page which I’m assuming will be used once Oracle get the RPD java datasource configuration working (more on this at the end of this blog)
http://localhost:8641/datagateway/static/rdcagent.js



Unfortunately there isn’t a Windows service which means that Data Gateway will not start up again automatically after a reboot of the OS. To launch the Remote Data Gateway you need to execute the dataGateway.exe manually, this then starts up the Java process for the webapp. I’m hoping Oracle will provide instructions on how to enable Data Gateway as a Windows service. When you launch datagateway.exe you will see several instances of that process in Windows Task Mgr, you will also see the Jetty java.exe process running in the background.


I've no idea why so many run in parallel.

Creating a MS SQL Server Connection

The objective of this blog is to connect to an on-premises Microsoft SQL Server database. 

Once you have configured the Data Gateway agent you can then create the connection to your on-premises MS SQL Server database.

In OAC navigate to Create -> Connection:

Enter the on-premises database host you want to connect to, the user ID and the database name. This is a big plus for Data Gateway compared to the older RDC. With the RDC you had to add a JDBC connection to a config file on the RDC client. With Data Gateway all connections are managed from within your OAC instance.

Eagle-eyed Essbase users will also notice that this is connecting to a SQL version of the Essbase classic ‘The Beverage Company’ (TBC) database :)

Creating an OAC Data Set

You can create an OAC data set graphically by selecting tables/columns manually. This is very simple and any end user (no SQL knowledge required) can define the data they want to analyse. The downside is that you can only select one table per data set if done graphically. You would need to create an individual data set for each table and then join them together by either using an OAC Dataflow or by adding all the data sets to a project and then joining the data sets together in the data diagram.

People who understand SQL can bypass this and create a single data set using a SQL statement with joins that combines the data from all tables. This will obviously be more efficient but requires more technical knowledge. OAC really does give you the best of both worlds, from basic end user to SQL jockey. I’ll show both ways.

For this example my TBC SQL database consists of a SALESFACT table and dimensional tables MARKET, PRODUCT, SCENARIO, SUPPLIER. 


Creating a Data Set Graphically

Okay, this is the easy way to create your data set without writing any SQL code. The downside is that you need to create one data set per table. You can write a SQL statement to join all your tables into one data set which is a lot more efficient but might not be simple for non-technical users. OAC gives us the option to do both. Scroll further down to see the SQL statement example.


You can create a data set by selecting the schema from your MS SQL Connection:


Data Gateway will then connect directly to your on-premises database and prompt you to select which schema you want to connect to, dbo in our case. Notice the 'select columns' and 'Enter SQL' radio buttons. Choose 'select columns' for the graphical mode:


Once you have selected your schema you can select which table and columns you want to add to your data set.


Here I have created a data set called TBC_MARKET (it just says MARKET in the screenshot) consisting of the STATEID column and the STATE column. When you save your data set OAC will populate that data set with the data directly from your on-premises MS SQL source.

I then go through the same process for all my other tables, one data set per table and end up with 5 data sets:


My TBC_SALESFACT data set looks like this:


My dimension columns consist of foreign keys so any reports will be meaningless. I need to join these keys with the other dimension tables to give me some meaningful results. 

I can do this in two ways: using a data flow to create a final data set, or by adding ALL the data sets to a project and then joining the data sets together in the data diagram.

Joining the TBC data sets in the data diagram of a OAC Project:



Alternatively I can join the individual TBC_* data sets using an OAC data flow, saving the result to a new TBC_FINAL data set:


Both result in being able to visualize my data using meaningful member names:



Creating a Data Set using SQL

This is a lot more efficient because we can control the exact SQL we send to the on-premises SQL Server, and it joins all of the tables into one data set instead of having to join several data sets via a data flow or the data diagram.

This is the SQL statement containing my joins which defines the data I want to bring into OAC:

SELECT TBC.dbo.MARKET.STATE, TBC.dbo.PRODUCT.SKU_ALIAS, TBC.dbo.SCENARIO.SCENARIO,
       TBC.dbo.SUPPLIER.SUPPLIER_ALIAS, TBC.dbo.SALESFACT.TRANSDATE, TBC.dbo.SALESFACT.SALES

FROM TBC.dbo.SALESFACT

INNER JOIN TBC.dbo.MARKET ON TBC.dbo.SALESFACT.STATEID = TBC.dbo.MARKET.STATEID
INNER JOIN TBC.dbo.PRODUCT ON TBC.dbo.SALESFACT.PRODUCTID = TBC.dbo.PRODUCT.PRODUCTID
INNER JOIN TBC.dbo.SCENARIO ON TBC.dbo.SALESFACT.SCENARIOID = TBC.dbo.SCENARIO.SCENARIOID
INNER JOIN TBC.dbo.SUPPLIER ON TBC.dbo.SALESFACT.SUPPLIERID = TBC.dbo.SUPPLIER.SUPPLIERID


We create a new data set just like we did in the previous section but we select the 'Enter SQL' radio button to enter our SQL:



I can now create a project on that one data set and visualize my data. Nice and easy and FAST!




Cached or Live Data?

There is an option in each direct database connected data set to either cache the data or have a 'live' connection. I tested the Live connection, added a new row to my on-premises database and the new row appeared immediately in my OAC project. It does work! Caveats on caching: there is a limit to the size of data held in the cache and you can't modify that setting yet in the OAC console. Also, I don’t know how often the cache is refreshed. 



For large data sets where live data isn’t a requirement I would probably create a data flow and schedule to run overnight to a new target data set. The data wouldn’t be live but it would ensure fast data retrieval times.

Adding your Data Gateway Connection to the RPD

Edit 29/11/19: Ignore this section, with OAC update 105.4 you can now easily connect to Data Gateway using your RPD! Go here to read all about it.

Ok, I admit defeat here. I don't think there is a way to add the current Windows version of Data Gateway as a java datasource to the RPD. This works in the older Remote Data Connector. The Data Gateway config documentation seems to have been copied from the RDC docs.

In the Admin Client you are meant to go to File -> Load Java Datasources and connect to your Data Gateway. This internally connects to http://hostname:port/obiee/javads. There isn't an 'obiee' webapp present on the Windows version of Data Gateway so this must be a work in progress. Fingers crossed it will be added soon.

If I'm wrong or I find out more I'll update the blog.

Conclusions

Here is a list of positives and negatives of the current release of Data Gateway compared to the older Remote Data Connector, given the pace of new features I'm sure any negatives will soon be fixed:

Positives:
  • Very simple to install/configure
  • No need to open up any ports in your local firewall
  • All database connections are created in OAC, no need to touch the Data Gateway config

Negatives:
  • Can't add a connection in the RPD yet (Update: you can in 105.4! Look here)
  • No Windows service yet
Cheers, happy on-premises visualizing!



Tuesday 26 March 2019

OAC - Essbase Streaming Dataload from SQL Server Database


Here's a short blog on streaming to the OAC Essbase cloud from a MS SQL Server source. The team at essbaselabs have written a nice blog on streaming Essbase data loads and the Oracle docs are here.

  • Download your MS SQL driver from here
  • Save it to your client machine and add the path to the jar file to your EXTERNAL_CLASSPATH in the esscs.bat file:

  • Using essCLI create the local connection to your on-prem data source:
esscs createLocalConnection -N MSSQL_EPMDB -cs jdbc:sqlserver://epmdb:1433 -u epmsql -D com.microsoft.sqlserver.jdbc.SQLServerDriver


  • Run the dataload command. Here I'm passing the SQL query into the command but you can also store the query in your Essbase rules file:
esscs dataload -application Sample -db Basic -rule SQLData.rul -stream -connection MSSQL_EPMDB -query "Select * from Sales"
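For reference, the EXTERNAL_CLASSPATH edit in esscs.bat from the earlier step might look like the fragment below. The jar file name and path are assumptions — use whatever driver version you downloaded and wherever you saved it:

```
REM In esscs.bat -- add the SQL Server JDBC driver jar to EXTERNAL_CLASSPATH
set EXTERNAL_CLASSPATH=C:\drivers\mssql-jdbc-7.2.2.jre8.jar;%EXTERNAL_CLASSPATH%
```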



Error handling is good and logged in the essbase-cli.log file:



I then correct my SQL statement and the query is successful:


For rules related errors a familiar dataload.err file is created in your working directory:


Correct the rules file and the load is successful:


The other nice feature is that essCLI compresses the data as it flies over the interwebs to the Essbase cloud, which reduces the transfer time associated with loading data from an on-prem source to the cloud.

So there we have it, a very simple way of loading data from your on-prem SQL source to your Essbase cloud target and no messing about with networks and firewalls etc.