Tuesday, 24 January 2017

The Mysterious Case of the Essbase Hang


You would think this post was from the archives given that we're talking about an unpatched version of Essbase 11.1.2.2, but no, this happened recently. A customer had been running that unpatched 11.1.2.2 install quite happily for years when one day it stopped working.

The symptoms were strange: the applications and databases would start up correctly, and it was possible to retrieve data and export all data, but as soon as data was sent into the application (via spreadsheet lock and send or via a load rule) the application would hang and CPU would spike. There were no error messages in any of the application or server logs, and the only way to get the application back was to kill the ESSSVR process. So you could retrieve data, but you were back to square one as soon as you tried loading or calculating anything... not much use really.

After some head scratching and questioning of the IT department, it turned out that they had recently applied a package update (glibc) to the Linux operating system. With this information, MOS came to the rescue:

Essbase Dataloads and Calculations Hang or Spike CPU to 100% after Applying Linux Operating System Package (glibc-2.5-107) (Doc ID 1531236.1)

We rolled back all the glibc packages and Essbase started working again as if by magic.

Thankfully this client is now one of our Managed Services customers and we shall soon be upgrading them to Essbase 11.1.2.4, where this particular issue should be a thing of the past.

Case closed. Watson, go make me a cup of tea...

Thursday, 22 December 2016

PBCS Drill-Through not Working to On-Prem FDMEE

One of the nice things about FDMEE is that it can be used to load data into PBCS and EPBCS, and the next patch will enable loading to FCCS.

Just like with on-prem Planning, you can use drill-through to look at the detailed rows of data that FDMEE loaded from your source into your PBCS base member. We recently encountered an issue where drill-through worked fine from a web form but failed when launched from Smart View. There was no error message; instead of drilling into the load details in our on-prem FDMEE instance, the drill-through page just took us to the PBCS homepage.

Fortunately there was a simple fix. We were using Firefox and the Smart View add-on was disabled. You need to enable the Smart View plugin within the Firefox add-ons manager. Once this is enabled you'll be able to drill through from Smart View and PBCS to all the lovely detail in FDMEE.

Whilst this test was done against on-prem FDMEE, it is probably the same situation with PBCS Data Management; just make sure your Smart View add-in is enabled in Firefox:


Wednesday, 23 November 2016

Where did my Planning Job Console History Go?

Quick post on getting purged history from the Planning Job Console...

Planning has a nice feature called the Job Console. On the Job Console page (Tools -> Job Console) you can check the status (processing, completed or error) of these job types: Business Rules, Clear Cell Details, Copy Data, Push Data and Planning Refresh.

By default, the Job Console deletes all information on any completed jobs older than 4 days. This isn't good if you need to troubleshoot an issue or check that a job completed before then. However, the history gets moved to another table in the Planning repository called HSP_HISTORICAL_JOB_STATUS. This table stores users as object IDs, so we can join it to the HSP_OBJECT table to get the job history with friendly user IDs.

Here is the SQL to retrieve all the history:

select HSP_HISTORICAL_JOB_STATUS.JOB_ID,
HSP_HISTORICAL_JOB_STATUS.PARENT_JOB_ID,
HSP_HISTORICAL_JOB_STATUS.JOB_NAME,
HSP_HISTORICAL_JOB_STATUS.JOB_TYPE,
HSP_Object.OBJECT_NAME,
HSP_HISTORICAL_JOB_STATUS.START_TIME,
HSP_HISTORICAL_JOB_STATUS.END_TIME,
HSP_HISTORICAL_JOB_STATUS.RUN_STATUS,
HSP_HISTORICAL_JOB_STATUS.DETAILS,
HSP_HISTORICAL_JOB_STATUS.ATTRIBUTE_1,
HSP_HISTORICAL_JOB_STATUS.ATTRIBUTE_2,
HSP_HISTORICAL_JOB_STATUS.PARAMETERS,
HSP_HISTORICAL_JOB_STATUS.JOB_SCHEDULER_NAME

from HSP_HISTORICAL_JOB_STATUS
INNER JOIN HSP_Object
ON HSP_Object.OBJECT_ID=HSP_HISTORICAL_JOB_STATUS.USER_ID

ORDER BY HSP_HISTORICAL_JOB_STATUS.END_TIME

and this is what the output looks like:


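If you need to pull this history out regularly, or attach it to a support ticket, you can run the same query from the command line. A minimal sketch, assuming a SQL Server Planning repository and that the query above has been saved as job_history.sql (the server, database and file names are all illustrative):

REM Illustrative only: dump the historical job status to a text file with sqlcmd
sqlcmd -S SQLSERVER\PLANNING -d PLANSYS -E -i job_history.sql -o job_history.txt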
To save yourself from having to do this, you can increase how long history is kept in the HSP_JOB_STATUS table from the default of 4 days to whatever you choose by setting the JOB_STATUS_MAX_AGE property in the Planning application properties. All the details can be found here:

http://docs.oracle.com/cd/E57185_01/PLAAG/ch02s08s08.html

Tuesday, 27 September 2016

Oracle PBCS New Features Sept 2016


It's been a while since I've blogged and now I've resorted to plagiarism. Mike Falconer, a fantastic colleague of mine, made this lovely round-up of the latest updates in PBCS, so Mike, it's over to you...

PBCS is always being updated with new features and bug fixes. We’ve been through the Oracle documentation available here and extracted a simple summary of the best new features in PBCS and how best to utilise them.

Where have all my buttons gone!?

Oracle has a tendency to change their user interfaces to keep a consistent feel across all of their cloud applications. Unfortunately this means that things often move around between updates. Here I've tried to document some of the more confusing changes.
  • Console (1) – Renamed to Application. Sub-menus have been created, replacing the awful tabs on the left-hand side. These sub-menus replace the standard buttons at the top of the screen when you are on a page that contains them.
  • Settings (2) – Similarly renamed to Tools.
  • Downloads (3) – Moved to Your Name in the top right-hand corner.
  • Navigator (4) – Replaced with the 3-line menu button in the top left-hand corner. You can access almost everything in the application using the new navigator window (see over the page).
  • Forms (5) – Renamed to Data, but to edit form structures you still need to go via the Navigator.
  • Application Management – Split into two separate sections: backups and snapshots have been renamed Migration (6), and user management and security has been renamed Access Control (7).
  • Navigation Flow (8) – A great new feature that allows you to move all of the buttons around, to confuse your poor users some more.
  • Refresh Database – Application -> Overview -> Actions -> Refresh Database (x3) -> Refresh. Obviously.






What happened to my dimension editor?

Surely Smart View is safe from the monthly updates? Have no fear, it has not been forgotten. For those that use the Smart View Planning Extension (which should be all of you – it saves you so much time!) to edit dimension metadata, you will now need to use the Member Selection box to get some of the dimension properties that used to appear automatically. It functions in exactly the same way as the Member Selection for regular dimension members. 



Top Tip: The quickest way to refresh the database in PBCS is to right-click the Dimensions folder and click Refresh Database:


Fun Features


Activity Reports

The biggest feature included this month is the activity report, located under Application -> Overview -> Activity.

These reports are produced daily and have a number of metrics, including user login data, worst performing tasks, and other helpful tools like most/least active users. Perfect for keeping track of your application’s weaknesses and strengths.

External Links to Dashboards and Forms

You can now create yourself a URL which will take a user directly to a form or dashboard they have access to:

https://virtualhost/HyperionPlanning/faces/LogOn?Direct=True&ObjectType=DASHBOARD&ObjectName=DASHBOARDNAME

In the above link, replace virtualhost with your own PBCS host name, and change DASHBOARDNAME to the name of a dashboard in your application.
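For example, with a made-up host name and a dashboard called RevenueDashboard, the link would look something like this (both values are purely illustrative):

https://planning-mycompany.pbcs.us2.oraclecloud.com/HyperionPlanning/faces/LogOn?Direct=True&ObjectType=DASHBOARD&ObjectName=RevenueDashboard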

Clicking on the link will send you directly to that object:


Copy/Paste onto forms in the Simplified Interface!

This one is huge, people – you can now use Ctrl-C and Ctrl-V to copy and paste data into cells on web forms. You can even copy a range of cells from Excel, select a range of cells in the web form, and paste them in. What a time to be alive.

Navigation Flow

You can use this functionality to customise which buttons users can or can't see. You can also define and add new buttons, but they are limited to displaying certain forms or dashboards. We're hoping this feature will be expanded to allow other links, so watch this space! Below you can see I've added a Test Flow card. Navigation Flow is certainly in its infancy, but it should become a bigger feature in a future update when Oracle flesh the options out a bit.




Monday, 11 July 2016

Fun and Games with FDMEE 11.1.2.4 and Oracle EBS



It's been a while since my last blog post. I've been very busy with other projects and I managed to get to KScope 2016 in Chicago, which was great. I encourage anyone and everyone to go to the KScope conference if you can.

This is a lessons-learned type post: we recently upgraded an FDM application from 11.1.2.2 to FDMEE 11.1.2.4. The FDM app used the very old FDM EBS adapters (not the ERPi ones) and we hit some challenges along the way.

The version of FDMEE we were using was 11.1.2.4.200 and the targets were Planning and HFM.

FDMEE EBS Datasource Initialization

The first issue we encountered was when we tried to initialize our EBS source system in FDMEE. The ODI configuration was correct, but we kept hitting this error message when we ran the initialization:

Error: 
ODI-1228: Task SrcSet0 (Loading) fails on the target GENERIC_SQL connection FDMEE_DATA_SERVER_MSSQL.
Caused By: weblogic.jdbc.sqlserverbase.ddc: [FMWGEN][SQLServer JDBC Driver][SQLServer]Cannot insert duplicate key row in object 'dbo.AIF_DIM_MEMBERS_STG_T' with unique index 'AIF_DIM_MEMBERS_STG_T_U1'. The duplicate key value is (X, X, X, X, X, X).

After some head scratching and troubleshooting it became clear that the issue was down to duplicate codes within EBS. There were two entities in EBS with the same name, one in upper case and the other in lower case. This was causing the duplicate key error when FDMEE tried to insert both entities into the AIF staging tables.

We went down the easy route to resolve this: we renamed the source entity in EBS to make it unique. You can alter the index in the FDMEE tables to allow duplicates, but we preferred the simple option.

Data Reconciliation (Data was double what it should be)

We were seeing strange issues with data reconciliation: data values would sometimes double up. This was down to a bug with the FDMEE snapshot load. For whatever reason, the data loaded into FDMEE was incorrect when we used snapshot mode; when we used full refresh, the data loaded correctly.

FDMEE Scheduler

Oh my god, the scheduler in FDMEE is terrible. You can't seem to edit a schedule, and if you do manage to edit one it doesn't show the new value. The other issue was that there was no way to select the load type (snapshot or full refresh) for a scheduled load. The default load type is snapshot, which was no good because of the bug mentioned above that was giving us double values.

Our strategy was to use a Windows batch script to do the load. The batch script gives you more control over the load process, so we could select the load type. We then scheduled the batch script to run at the appropriate time using the Windows scheduler. The downside of this is that you can't schedule via the workspace, but our customer actually preferred the Windows scheduler as they were familiar with it and had more confidence in it.
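For the scheduling side, the Windows Task Scheduler command line is all you need. A minimal sketch (the task name, 2am schedule, rule and periods are illustrative; FDMEE_Batch.bat is the wrapper script shown further down):

REM Illustrative only: run the FDMEE wrapper script every night at 02:00
schtasks /create /tn "FDMEE Nightly Load" /sc DAILY /st 02:00 /tr "D:\Scripts\FDMEE_Batches\FDMEE_Batch.bat PLN_UK_DLR_Actual May-16 May-16"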

We created two scripts: one which sets most of the parameters required for the FDMEE batch loader, and another which sets the relevant periods for the load (a sketch of the second is shown after the main script below).

The FDMEE batch loader is located here:
D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality\loaddata.bat

The loader uses the following syntax:

loaddata.bat %USER% %PASSWORD% %RULE_NAME% %IMPORT_FROM_SOURCE% %EXPORT_TO_TARGET% %EXPORT_MODE% %EXEC_MODE% %LOAD_FX_RATE% %START_PERIOD_NAME% %END_PERIOD_NAME% %SYNC_MODE% 

Notice the EXEC_MODE parameter; this is what let us force the load type to full refresh.
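For reference, a direct call using the values from the wrapper script below would look something like this (the password is a placeholder, and the rule and periods are the example values from the script's header comments):

loaddata.bat admin XXXXXXXX PLN_UK_DLR_Actual Y Y STORE_DATA FULLREFRESH N May-16 May-16 SYNC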

So we created a simple script to set most of these parameters. This script uses a hard-coded password but you can encrypt the password to make it more secure.

@echo off

REM ################################################################################
REM #### FDMEE Batch Load                                                      #####
REM #### Uses FDMEE loaddata batch script to initiate load via command line.   #####
REM ####                                                                       #####
REM #### Syntax is FDMEE_Batch.bat RULE_NAME START_PERIOD_NAME END_PERIOD_NAME #####
REM #### e.g. FDMEE_Batch.bat PLN_UK_DLR_Actual May-16 May-16                  #####
REM ################################################################################
set BATCH_LOAD=D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality\loaddata.bat
set BATCH_SCRIPT_HOME=D:\Scripts\FDMEE_Batches
set USER=admin
set PASSWORD=XXXXXXXX
set RULE_NAME=%1
set IMPORT_FROM_SOURCE=Y
set EXPORT_TO_TARGET=Y
set EXPORT_MODE=STORE_DATA
set EXEC_MODE=FULLREFRESH
set LOAD_FX_RATE=N
set START_PERIOD_NAME=%2
set END_PERIOD_NAME=%3
set SYNC_MODE=SYNC
cd D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality
echo %date% %time% Starting load >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
call %BATCH_LOAD% %USER% %PASSWORD% %RULE_NAME% %IMPORT_FROM_SOURCE% %EXPORT_TO_TARGET% %EXPORT_MODE% %EXEC_MODE% %LOAD_FX_RATE% %START_PERIOD_NAME% %END_PERIOD_NAME% %SYNC_MODE% >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
echo %date% %time% >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
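The second script is then just a thin wrapper that fixes the rule and periods and calls the script above. A minimal sketch, assuming the script above is saved as D:\Scripts\FDMEE_Batches\FDMEE_Batch.bat (the rule and period values are the examples from the header comments, so adjust to suit):

@echo off
REM Illustrative sketch of the second script: fix the rule and periods,
REM then hand off to the parameter script shown above.
set RULE_NAME=PLN_UK_DLR_Actual
set START_PERIOD_NAME=May-16
set END_PERIOD_NAME=May-16
call D:\Scripts\FDMEE_Batches\FDMEE_Batch.bat %RULE_NAME% %START_PERIOD_NAME% %END_PERIOD_NAME%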

FDMEE Load Performance

The other issue we hit was performance. Loads in FDMEE were taking longer than the equivalent loads in FDM 11.1.2.2, which was puzzling as FDMEE is usually much more efficient: it uses ODI and the power of the relational database engine for the transformations. We also noted that HFM consolidation times were slower, again very strange as HFM 11.1.2.4 is generally a lot quicker than older versions.

Here our problem was down to the relational database. We were using a SQL Server 2012 cluster with AlwaysOn synchronous commit enabled. It turns out this is very bad for performance, because the primary SQL Server node has to wait for each transaction to be committed on the secondary node in real time. Good for resilience, very bad for performance. We decided to turn the synchronous commit off and went for a periodic asynchronous update strategy.
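If you want to stay on Availability Groups but just drop the synchronous commit, it is a one-line change per secondary replica. A hedged sketch (the availability group name EPM_AG and replica name SQLNODE2 are placeholders; run it against the primary):

REM Illustrative only: switch a secondary replica to asynchronous commit
sqlcmd -S SQLNODE1 -E -Q "ALTER AVAILABILITY GROUP [EPM_AG] MODIFY REPLICA ON 'SQLNODE2' WITH (AVAILABILITY_MODE = ASYNCHRONOUS_COMMIT)"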

FDMEE load times and HFM consolidation times were both around 100% faster with SQL Server AlwaysOn synchronous commit turned off.

Turning on synchronous commit makes SQL inserts slower; Microsoft even say this themselves:

“Synchronous-commit mode emphasizes high availability over performance, at the cost of increased transaction latency”. https://msdn.microsoft.com/en-us/library/ff877931.aspx?f=255&MSPPError=-2147217396

We weren't the only people encountering this: https://clusteringformeremortals.com/2012/11/09/how-to-overcome-the-performance-problems-with-sql-server-alwayson-availability-groups-sqlpass/

“The one thing that I learned is that although AlwaysOn certainly has its applications, most of the successful deployments are based on using AlwaysOn in an asynchronous fashion. The reason people avoid the synchronous replication option is that the overhead is too great. “

Conclusion

Whilst this post outlines the issues we had with FDMEE, I have to say that the product is now way better than the old FDM. We managed to streamline many of the processes, and it was great to be able to use multi-dimensional maps in FDMEE instead of having to script everything. In fact, in this example we went from having six custom scripts to zero – yes, ZERO scripts (apart from our custom batch script, of course!). The load times are now also faster than their FDM equivalents.

Thanks for reading and if you've any comments/suggestions then please add below.

Tuesday, 12 April 2016

OBIEE 12c and Planning

Following on from my last post about configuring OBIEE 12c with HFM, I thought I'd give EPM Hyperion Planning a bash.

Integration with EPM Planning was supported in OBIEE 11.1.1.9 and didn't require any configuration. You just had to make sure you replaced the colon between host and port with its URL-encoded form (%3A), and the Planning metadata imported successfully:

adm:thin:com.hyperion.ap.hsp.HspAdmDriver:ServerName%3A19000:AppName

You would expect that, since this worked out of the box in 11.1.1.9, it would also work in 12c, right?

Wrong!

You'll get the following error when trying to import the metadata:

The connection has failed. [nQSError: 77031] Error occurs while calling remote service ADMImportService.
Details: Details: com.hyperion.ap.APException: [1085]
Error: Error creating objectjava.lang.ClassNotFoundException




Many thanks to Ziga, who also encountered the same error and blogged the solution here:

http://zigavaupot.blogspot.co.uk/2016/02/obi-12c-series-connecting-obi-with.html

The solution is hidden in the Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition:

"For Hyperion Planning 11.1.2.4 or later, the installer does not deliver all of the required client driver .jar files. To ensure that you have all of the needed .jar files, go to your instance of Hyperion, locate and copy the adm.jar, ap.jar, and HspAdm.jar files and paste them into
MIDDLEWARE_HOME\oracle_common\modules."

Copy the following files from your Planning install:

  • E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\adm.jar
  • E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\ap.jar
  • E:\Oracle\Middleware\EPMSystem11R1\common\ADM\Planning\11.1.2.0\lib\HspAdm.jar

to this location on your OBIEE 12c server:

  • E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules
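If the EPM files are reachable from the OBIEE server (otherwise copy them across by whatever means you prefer), that boils down to three copy commands using the paths above:

REM Copy the ADM client jars from the EPM install into the OBIEE 12c modules folder
copy E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\adm.jar E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules\
copy E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\ap.jar E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules\
copy E:\Oracle\Middleware\EPMSystem11R1\common\ADM\Planning\11.1.2.0\lib\HspAdm.jar E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules\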

Restart the services and away you go:


Thursday, 7 April 2016

OBIEE 12c and HFM

I've worked with several of the clever people from the Oracle CEAL team and I highly recommend their blog: https://blogs.oracle.com/pa/ . It's full of valuable BI and EPM information.

Their latest post https://blogs.oracle.com/pa/entry/integrating_bi_12c_with_hfm shows us how to import HFM metadata into OBIEE 12c. I had some teething troubles configuring this on Windows (mainly due to the direction of the slashes in my path statements) so I thought I'd share my configuration for OBIEE and HFM on Windows.

I followed the instructions in the document Olivier provided but I got the following error when trying to import the HFM metadata:


"Could not connect to the data source. A more detailed error message has been written to the BI Administrator log file".

Don't believe the error message: there isn't any extra info in the Administrator log file, so no hint as to where I had gone wrong. I think my problem was twofold: I'm on Windows and had used back-slashes in some of my path statements, and I had used the BI JavaHost port number specified in the CEAL document (9610). The default JavaHost port is 9510, so I used that instead.

Here is my working config:

E:\Oracle\OBIEE\Oracle_Home\bi\modules\oracle.bi.cam.obijh\setOBIJHEnv.cmd:

Add the following variables. This is a Windows command script, so use back-slashes:

SET EPM_ORACLE_HOME=E:\Oracle\Middleware\EPMSystem11R1
SET EPM_ORACLE_INSTANCE=E:\Oracle\Middleware\user_projects\FDNHOST1
SET OBIJH_ARGS=%OBIJH_ARGS% -DEPM_ORACLE_HOME=%EPM_ORACLE_HOME%
SET OBIJH_ARGS=%OBIJH_ARGS% -DEPM_ORACLE_INSTANCE=%EPM_ORACLE_INSTANCE%
SET OBIJH_ARGS=%OBIJH_ARGS% -DHFM_ADM_TRACE=2



E:\Oracle\OBIEE\Oracle_Home\bi\modules\oracle.bi.cam.obijh\env\obijh.properties

Add the following variables. This time use forward-slashes:

#Added for HFM connectivity
EPM_ORACLE_HOME=E:/Oracle/Middleware/EPMSystem11R1
EPM_ORACLE_INSTANCE=E:/Oracle/Middleware/user_projects/FDNHOST1


Find OBIJH_ARGS as indicated in the CEAL document and append the arguments below to the end (all on one line, with no carriage returns):

-DEPM_ORACLE_HOME=E:/Oracle/Middleware/EPMSystem11R1 
-DEPM_ORACLE_INSTANCE=E:/Oracle/Middleware/user_projects/FDNHOST1 
-DHFM_ADM_TRACE=2


The text above has been wrapped for readability; just add it to the end of your existing OBIJH_ARGS setting.

E:\Oracle\OBIEE\Oracle_Home\bi\bifoundation\javahost\config\loaders.xml

Add the following lines to the <Name>IntegrationServiceCall</Name> Classpath:

E:/Oracle/Middleware/EPMSystem11R1/common/hfm/11.1.2.0/lib/fm-adm-driver.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/hfm/11.1.2.0/lib/fm-web-objectmodel.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/jlib/11.1.2.0/epm_j2se.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/jlib/11.1.2.0/epm_hfm_web.jar



For the Admin Tool client, edit the following file:

E:\Oracle\OBIEE\OBIEE_Client\domains\bi\config\fmwconfig\biconfig\OBIS\NQSConfig.INI

Set the JavaHost entry to point at your server name and the default JavaHost port (9510):

JAVAHOST_HOSTNAME_OR_IP_ADDRESSES = "SERVERNAME:9510";


Restart OBIEE and you can now import your HFM metadata into the repository:

adm:thin:com.oracle.htm.HsvADMDriver:HFMCLUSTER:APPNAME


You can see your HFM application; use the arrow to push it into the repository view:


You should now see it available in the physical layer:


Save the repository and start building your reports!