One of the nice things about FDMEE is that it can be used to load data into PBCS and EPBCS, and the next patch will enable loading to FCCS.
Just like on-prem Planning, you can use drill-through to look at the detailed rows of data that FDMEE loaded from your source into your PBCS base members. We recently encountered an issue where drill-through worked fine when drilling down to FDMEE data in a web form, but failed when launched from Smart View. There was no error message; instead of drilling into the load details of our on-prem FDMEE instance, the drill-through web page just took us to the PBCS homepage.
Fortunately there was a simple fix. We were using Firefox and the Smart View add-in was disabled in the browser. Enable the Smart View plugin within the Firefox add-ons and you'll be able to drill through from Smart View to all the lovely detail in FDMEE.
Whilst this test used on-prem FDMEE, it is probably the same situation with PBCS Data Management; just make sure your Smart View add-in is enabled in Firefox:
Guillaume Slee's EPM/BI blog. Does exactly what it says on the tin.
Thursday, 22 December 2016
Wednesday, 23 November 2016
Where did my Planning Job Console History Go?
Quick post on getting purged history from the Planning Job Console...
Planning has a nice feature called the Job Console. In the Job Console page (Tools -> Job Console) you can check the status (processing, completed, or error) of these job types: Business Rules, Clear Cell Details, Copy Data, Push Data and Planning Refresh.
By default, the Job Console deletes all information on any completed jobs older than 4 days. This isn't good if you need to troubleshoot an issue or check a transaction that completed before this period. However, the history gets moved to another table in the Planning repository called HSP_HISTORICAL_JOB_STATUS. That table stores the user accounts as object IDs, so we can join it with the HSP_OBJECT table to get the job history with friendly user IDs.
Here is the SQL to retrieve all the history:
select h.JOB_ID,
       h.PARENT_JOB_ID,
       h.JOB_NAME,
       h.JOB_TYPE,
       o.OBJECT_NAME,
       h.START_TIME,
       h.END_TIME,
       h.RUN_STATUS,
       h.DETAILS,
       h.ATTRIBUTE_1,
       h.ATTRIBUTE_2,
       h.PARAMETERS,
       h.JOB_SCHEDULER_NAME
from HSP_HISTORICAL_JOB_STATUS h
inner join HSP_OBJECT o
    on o.OBJECT_ID = h.USER_ID
order by h.END_TIME
and this is what the output looks like:
To save you from having to do this, you can increase the length of history stored in the HSP_JOB_STATUS table from the default of 4 days to whatever you choose by setting JOB_STATUS_MAX_AGE in the Planning application properties. All the details can be found here:
http://docs.oracle.com/cd/E57185_01/PLAAG/ch02s08s08.html
Tuesday, 27 September 2016
Oracle PBCS New Features Sept 2016
It's been a while since I've blogged and now I've resorted to plagiarism. Mike Falconer, a fantastic colleague of mine, made this lovely round-up of the latest updates in PBCS, so Mike, it's over to you...
PBCS is always being updated with new features and bug fixes. We’ve been through the Oracle documentation available here and extracted a simple summary of the best new features in PBCS and how best to utilise them.
Where have all my buttons gone!?
Oracle has a tendency to change their user interfaces for a consistent feel across all their cloud applications. Unfortunately this means that they often switch around their interfaces between updates. Here I've tried to document some of the more confusing changes.
Console (1) – Renamed to Application. Sub-menus have been created, replacing the awful tabs on the left-hand side. These sub-menus replace the standard buttons at the top of the screen when you are on a page that contains them. Settings (2) has similarly been renamed to Tools.
Downloads (3) – Moved to Your Name in the top right-hand corner.
Navigator (4) – Replaced with the 3-line Menu button in the top left-hand corner. You can access almost everything in the application using the new navigator window (see over the page).
Forms (5) – Renamed to Data, but to edit form structures you still need to go via the Navigator.
Application Management – Split into two separate sections: backups and snapshots have been renamed Migration (6), and user management and security has been renamed Access Control (7).
Navigation Flow (8) – A great new feature that allows you to change all of the buttons around, to confuse your poor users some more.
Refresh Database – Application -> Overview -> Actions -> Refresh Database (x3) -> Refresh. Obviously.
What happened to my dimension editor?
Surely Smart View is safe from the monthly updates? Have no fear, it has not been forgotten. For those that use the Smart View Planning Extension (which should be all of you – it saves you so much time!) to edit dimension metadata, you will now need to use the Member Selection box to get some of the dimension properties that used to appear automatically. It functions in exactly the same way as the Member Selection for regular dimension members.
Top Tip: The quickest way to refresh the database in PBCS is to right-click the Dimensions folder and click Refresh Database:
Fun Features
Activity Reports
The biggest feature included this month is an activity report, located under Application -> Overview -> Activity.
These reports are produced daily and have a number of metrics, including user login data, worst performing tasks, and other helpful tools like most/least active users. Perfect for keeping track of your application’s weaknesses and strengths.
External Links to Dashboards and Forms
You can now create yourself a URL which will take a user directly to a form or dashboard they have access to:
https://virtualhost/HyperionPlanning/faces/LogOn?Direct=True&ObjectType=DASHBOARD&ObjectName=DASHBOARDNAME
In the above link, replace virtualhost with your own PBCS link information, and change DASHBOARDNAME to the name of a dashboard in your application.
Clicking on the link will send you directly to that object:
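If your dashboard or form names contain spaces or other special characters, the ObjectName value needs URL-encoding. Here's a quick sketch of building such a link in Python; the host name and dashboard name below are made up for illustration:

```python
from urllib.parse import quote

def direct_link(host, object_type, object_name):
    """Build a PBCS direct URL to a form or dashboard."""
    return (f"https://{host}/HyperionPlanning/faces/LogOn"
            f"?Direct=True&ObjectType={object_type}"
            f"&ObjectName={quote(object_name)}")

# Hypothetical host and dashboard name, for illustration only:
print(direct_link("myinstance-test.pbcs.us2.oraclecloud.com",
                  "DASHBOARD", "Revenue Overview"))
```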
Copy/Paste onto forms in the Simplified Interface!
This one is huge people – you can now use Copy and Paste via Ctrl-C and Ctrl-V to copy and paste data into cells on web forms. You can even copy a range of cells from Excel, select a range of cells in the web form, and paste them. What a time to be alive.
Navigation Flow
You can use this functionality to customise the buttons users can or can't see. You can also define and add new buttons, but they are limited to displaying certain forms or dashboards. We were hoping this feature would be expanded to add links, so watch this space! Below you can see I've added a Test Flow card. Navigation Flow is certainly in its infancy; a bigger feature will come in a future update when Oracle flesh the options out a bit.
Monday, 11 July 2016
Fun and Games with FDMEE 11.1.2.4 and Oracle EBS
It's been a while since my last blog post. I've been very busy with other projects and I managed to get to KScope 2016 in Chicago which was great. I encourage anyone and everyone to go to the KScope conference if you can.
This is a lessons-learned type post: we recently upgraded an FDM application from 11.1.2.2 to 11.1.2.4. The FDM app used the very old FDM EBS adapters (not the ERPi ones) and we hit some challenges along the way.
The version of FDMEE we were using was 11.1.2.4.200 and the target was Planning and HFM.
FDMEE EBS Datasource Initialization
The first issue encountered was when we tried to initialize our EBS source in FDMEE. The ODI configuration was correct but we kept hitting this error message when we ran the initialization:
Error:
ODI-1228: Task SrcSet0 (Loading) fails on the target GENERIC_SQL connection FDMEE_DATA_SERVER_MSSQL.
Caused By: weblogic.jdbc.sqlserverbase.ddc: [FMWGEN][SQLServer JDBC Driver][SQLServer]Cannot insert duplicate key row in object 'dbo.AIF_DIM_MEMBERS_STG_T' with unique index 'AIF_DIM_MEMBERS_STG_T_U1'. The duplicate key value is (X, X, X, X, X, X).
We went down the easy route to resolve this: we renamed the source entity in EBS to make it unique. You can alter the index in the FDMEE tables to enable duplicates, but we preferred the simple option.
Data Reconciliation (Data was double what it should be)
We were seeing strange issues with data reconciliation; the data values were sometimes doubling up. This problem was down to a bug with the FDMEE snapshot load. For whatever reason the data being loaded into FDMEE was incorrect when we used snapshot mode. When we used Full Refresh the data loaded correctly.
FDMEE Scheduler
Oh my god, the scheduler in FDMEE is terrible. You can't seem to edit a schedule, and if you do manage to edit it then it doesn't show the new value. The other issue was that there was no way to select the load type (snapshot or full refresh) for the load. The default load type is snapshot, which was no good because of the bug mentioned above that was giving us double values.
Our strategy was to use a Windows batch script to do the load. You have more control over the load process with a batch script, so we could select the load type. We then scheduled the batch script to run at the appropriate time using the Windows scheduler. The downside of this is that you can't schedule via the workspace, but our customer actually preferred using the Windows scheduler as they were familiar with it and had more confidence in it.
We created two scripts: one which set most of the parameters required for the FDMEE batch loader, and another which set the relevant periods for the load.
The FDMEE batch loader is located here:
D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality\loaddata.bat
The loader uses the following syntax:
loaddata.bat %USER% %PASSWORD% %RULE_NAME% %IMPORT_FROM_SOURCE% %EXPORT_TO_TARGET% %EXPORT_MODE% %EXEC_MODE% %LOAD_FX_RATE% %START_PERIOD_NAME% %END_PERIOD_NAME% %SYNC_MODE%
Notice the EXEC_MODE parameter; this is what enabled us to define the load type (Full Refresh).
So we created a simple script to set most of these parameters. This script uses a hard-coded password, but you can encrypt the password to make it more secure.
@echo off
REM ################################################################################
REM #### FDMEE Batch Load #####
REM #### Uses FDMEE loaddata batch script to initiate load via command line. #####
REM #### #####
REM #### Syntax is FDMEE_Batch.bat RULE_NAME START_PERIOD_NAME END_PERIOD_NAME #####
REM #### e.g. FDMEE_Batch.bat PLN_UK_DLR_Actual May-16 May-16 #####
REM ################################################################################
set BATCH_LOAD=D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality\loaddata.bat
set BATCH_SCRIPT_HOME=D:\Scripts\FDMEE_Batches
set USER=admin
set PASSWORD=XXXXXXXX
set RULE_NAME=%1
set IMPORT_FROM_SOURCE=Y
set EXPORT_TO_TARGET=Y
set EXPORT_MODE=STORE_DATA
set EXEC_MODE=FULLREFRESH
set LOAD_FX_RATE=N
set START_PERIOD_NAME=%2
set END_PERIOD_NAME=%3
set SYNC_MODE=SYNC
cd D:\Oracle\Middleware\user_projects\FDNHOST1\FinancialDataQuality
echo %date% %time% Starting load >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
call %BATCH_LOAD% %USER% %PASSWORD% %RULE_NAME% %IMPORT_FROM_SOURCE% %EXPORT_TO_TARGET% %EXPORT_MODE% %EXEC_MODE% %LOAD_FX_RATE% %START_PERIOD_NAME% %END_PERIOD_NAME% %SYNC_MODE% >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
echo %date% %time% >> %BATCH_SCRIPT_HOME%\FDMEE_Batch.log
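The second script, which set the relevant periods, isn't shown above. As a sketch of the idea (in Python rather than batch; the Mmm-yy period format and the wrapper path are assumptions based on the example above), it only needs to work out the period names and call the wrapper:

```python
import subprocess
from datetime import date

def current_period():
    """Return today's month in the Mmm-yy format used by the load rule (e.g. May-16)."""
    return date.today().strftime("%b-%y")

def run_load(rule_name):
    """Call the FDMEE_Batch.bat wrapper shown above with start/end periods."""
    period = current_period()
    subprocess.run(
        [r"D:\Scripts\FDMEE_Batches\FDMEE_Batch.bat", rule_name, period, period],
        check=True)

# Example: run_load("PLN_UK_DLR_Actual")
```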
FDMEE Load Performance
The other issue we hit was performance. The loads in FDMEE were taking longer than the equivalent loads in 11.1.2.2 FDM. This seemed puzzling to me, as FDMEE is usually much more efficient because it uses ODI and the power of the relational database engine for the transformations. We also noted that the HFM consolidation times were slower, again very strange as HFM 11.1.2.4 performance is generally a lot quicker than older versions 99% of the time.
Here our problem was down to the relational database. We were using a SQL Server 2012 cluster with AlwaysOn synchronization enabled. It turns out this is very bad for performance because the primary SQL Server node updates the secondary node in real time. Good for resilience, very bad for performance. We decided to turn the synchronization off and went for a periodic asynchronous update strategy.
FDMEE load times were 100% faster and our HFM consolidation times were 100% faster with SQL Server AlwaysOn synchronization turned off.
Turning on database synchronization causes SQL inserts to be slower, Microsoft even say this themselves:
“Synchronous-commit mode emphasizes high availability over performance, at the cost of increased transaction latency”. https://msdn.microsoft.com/en-us/library/ff877931.aspx?f=255&MSPPError=-2147217396
We weren't the only people encountering this: https://clusteringformeremortals.com/2012/11/09/how-to-overcome-the-performance-problems-with-sql-server-alwayson-availability-groups-sqlpass/
“The one thing that I learned is that although AlwaysOn certainly has its applications, most of the successful deployments are based on using AlwaysOn in an asynchronous fashion. The reason people avoid the synchronous replication option is that the overhead is too great. “
Conclusion
Whilst this post outlines the issues we had with FDMEE, I have to say that the application is now way better than the old FDM. We managed to streamline many of the processes and it was great to be able to use multi-dimensional maps in FDMEE instead of having to script everything. In fact, in this example we went from having 6 custom scripts to zero, yes ZERO scripts (apart from our custom batch script of course!). The load times are also faster than their FDM equivalents.
Thanks for reading, and if you have any comments/suggestions then please add them below.
Tuesday, 12 April 2016
OBIEE 12c and Planning
Following on from my last post about configuring OBIEE 12c with HFM I thought I'd give EPM Hyperion Planning a bash.
Integration with EPM Planning was supported in OBIEE 11.1.1.9 and didn't require any configuration. You just had to make sure you used the html character code to replace the colon for the host:port and the Planning metadata imported successfully:
adm:thin:com.hyperion.ap.hsp.HspAdmDriver:ServerName%3A19000:AppName
You would expect that since this worked out of the box in 11.1.1.9 that it would also work in 12c right?
Wrong!
You'll get the following error when trying to import the metadata:
The connection has failed. [nQSError: 77031] Error occurs while calling remote service ADMImportService.
Details: Details: com.hyperion.ap.APException: [1085]
Error: Error creating objectjava.lang.ClassNotFoundException
Many thanks to Ziga who also encountered the same error and blogged the solution here:
http://zigavaupot.blogspot.co.uk/2016/02/obi-12c-series-connecting-obi-with.html.
The solution is hidden in the Metadata Repository Builder's Guide for Oracle Business Intelligence Enterprise Edition:
"For Hyperion Planning 11.1.2.4 or later, the installer does not deliver all of the required client driver .jar files. To ensure that you have all of the needed .jar files, go to your instance of Hyperion, locate and copy the adm.jar, ap.jar, and HspAdm.jar files and paste them into
MIDDLEWARE_HOME\oracle_common\modules."
Copy the following files from your Planning install:
- E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\adm.jar
- E:\Oracle\Middleware\EPMSystem11R1\common\ADM\11.1.2.0\lib\ap.jar
- E:\Oracle\Middleware\EPMSystem11R1\common\ADM\Planning\11.1.2.0\lib\HspAdm.jar
to this location on your OBIEE 12c server:
- E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules
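If you need to repeat this on several servers, the copy is easily scripted. A minimal sketch in Python, assuming the paths from this example (adjust them to your own install):

```python
import shutil
from pathlib import Path

# Paths from the example above -- adjust to your environment.
ADM_HOME = Path(r"E:\Oracle\Middleware\EPMSystem11R1\common\ADM")
OBIEE_MODULES = Path(r"E:\Oracle\OBIEE\Oracle_Home\oracle_common\modules")

JARS = [
    ADM_HOME / "11.1.2.0" / "lib" / "adm.jar",
    ADM_HOME / "11.1.2.0" / "lib" / "ap.jar",
    ADM_HOME / "Planning" / "11.1.2.0" / "lib" / "HspAdm.jar",
]

def copy_jars(jars=JARS, target=OBIEE_MODULES):
    """Copy the ADM driver jars into the OBIEE modules folder."""
    for jar in jars:
        shutil.copy2(jar, target)  # copy2 preserves file timestamps
        print(f"Copied {jar.name} to {target}")

# Example: copy_jars()
```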
Restart the services and away you go:
Thursday, 7 April 2016
OBIEE 12c and HFM
I've worked with several of the clever people from the Oracle CEAL team and I highly recommend their blog: https://blogs.oracle.com/pa/ . It's full of valuable BI and EPM information.
Their latest post https://blogs.oracle.com/pa/entry/integrating_bi_12c_with_hfm shows us how to import HFM metadata into OBIEE 12c. I had some teething troubles configuring this on Windows (mainly due to the direction of the slashes in my path statements) so I thought I'd share my configuration for OBIEE and HFM on Windows.
I followed the instructions in the document Olivier provided but I got the following error when trying to import the HFM metadata:
"Could not connect to the data source. A more detailed error message has been written to the BI Administrator log file".
Don't believe the error message; there isn't any extra info in the admin log file, so no hint as to where I had gone wrong. I think my problem was twofold: I'm on Windows and had used back-slashes in my path statements, and I had used the BI Java Host port number specified in the CEAL document (9610). The default port for the Java Host is 9510, so I used that.
Here is my working config:
E:\Oracle\OBIEE\Oracle_Home\bi\modules\oracle.bi.cam.obijh\setOBIJHEnv.cmd :
Add the following variables. This is a Windows command script, so you should be using back-slashes:
SET EPM_ORACLE_HOME=E:\Oracle\Middleware\EPMSystem11R1
SET EPM_ORACLE_INSTANCE=E:\Oracle\Middleware\user_projects\FDNHOST1
SET OBIJH_ARGS=%OBIJH_ARGS% -DEPM_ORACLE_HOME=%EPM_ORACLE_HOME%
SET OBIJH_ARGS=%OBIJH_ARGS% -DEPM_ORACLE_INSTANCE=%EPM_ORACLE_INSTANCE%
SET OBIJH_ARGS=%OBIJH_ARGS% -DHFM_ADM_TRACE=2
Add the following variables. This time we use the forward-slash:
#Added for HFM connectivity
EPM_ORACLE_HOME=E:/Oracle/Middleware/EPMSystem11R1
EPM_ORACLE_INSTANCE=E:/Oracle/Middleware/user_projects/FDNHOST1
Find OBIJH_ARGS as indicated in the CEAL document and append the below tags to the end (without carriage return):
-DEPM_ORACLE_HOME=E:/Oracle/Middleware/EPMSystem11R1
-DEPM_ORACLE_INSTANCE=E:/Oracle/Middleware/user_projects/FDNHOST1
-DHFM_ADM_TRACE=2
The text has been wrapped in the image above. Just add the text at the end of your OBIJH_ARGS setting.
E:\Oracle\OBIEE\Oracle_Home\bi\bifoundation\javahost\config\loaders.xml
Add the following lines to the <Name>IntegrationServiceCall</Name> Classpath:
E:/Oracle/Middleware/EPMSystem11R1/common/hfm/11.1.2.0/lib/fm-adm-driver.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/hfm/11.1.2.0/lib/fm-web-objectmodel.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/jlib/11.1.2.0/epm_j2se.jar;
E:/Oracle/Middleware/EPMSystem11R1/common/jlib/11.1.2.0/epm_hfm_web.jar
For the Admin Tool client, edit the following file:
E:\Oracle\OBIEE\OBIEE_Client\domains\bi\config\fmwconfig\biconfig\OBIS\NQSConfig.INI
JAVAHOST_HOSTNAME_OR_IP_ADDRESSES = "SERVERNAME:9510";
Restart your OBIEE and you can now import your HFM data into OBIEE:
adm:thin:com.oracle.htm.HsvADMDriver:HFMCLUSTER:APPNAME
You can see your HFM application, use the arrow to push it into the repository view:
You should now see it available in the physical layer:
Save the repository and start building your reports!
Tuesday, 22 March 2016
Top Gun Update
OBIEE 12c - Look Under the Bonnet and Take it for a Test Drive: this was the title of the OBIEE 12c presentation I gave at the Infratects Top Gun conference in Berlin a couple of weeks ago.
Infratects has been organising the EPM Top Gun conference for 5 years now and it has transformed from being a purely technical EPM conference (its roots lie in the original Hyperion Solutions Top Gun conference, which was purely infrastructure related) to now including functional and customer story streams.
I've presented or been on the expert panel for every conference so far and it's been a real pleasure sharing knowledge and learning from everyone involved.
My personal highlight at every Top Gun conference so far has been Kash Mohammadi's keynote speech. For those who don't know, Kash is a Vice President of EPM Product Development at Oracle. He was the trail blazer for EPM in the cloud and is in charge of PBCS. His presentations have always been very enjoyable, light-hearted and honest, and give great insight into how Oracle are managing their PBCS service and all the new features we can expect on the roadmap.
Back to my presentation...it's basically an overview of OBIEE 12c and the new features available in this release. It also explains how to upgrade and some of the changes in administration. I've uploaded the presentation to slideshare so feel free to download, share and comment if you wish.
http://www.slideshare.net/GuillaumeSlee/obiee-12c-look-under-the-bonnet-and-test-drive
Wednesday, 20 January 2016
EPM patch 11.1.2.2.500 Breaks EPMA: com.hyperion.awb.web.common.DimensionServiceException
EPMA can be frustrating at the best of times, but this was the first time I've had an issue with patching. The issue only occurs in some environments; in this example the patch worked fine on DEV but broke the PROD environment.
The error occurs when applying the 11.1.2.2.500 super patch to EPM 11.1.2.2.300. The first time you start the EPMA services everything will work as normal. However, when the EPMA dimension service starts it executes some SQL on the database which means that the second time you start the EPMA dimension service it won't start correctly.
You'll get this error message in the EPM Workspace when selecting the EPMA Application or Dimension components:
Requested Service not found
Code:
com.hyperion.awb.web.common.DimensionServiceException
Description: An error occurred processing the result from the server.
A look at the DimensionServer.log reveals the following:
Failed 11.1.2.2.00 FixInvalidDynamicPropertyReferences task at System.String.InternalSubStringWithChecks(Int32 startIndex, Int32 length, Boolean fAlwaysCopy)
...
An error occurred during initialization of the Dimension Server Engine: startIndex cannot be larger than length of string.
Parameter name: startIndex. at Hyperion.DimensionServer.LibraryManager.FixInvalidDynamicPropertyReferences()
at Hyperion.DimensionServer.Global.Initialize(ISessionManager sessionMgr, Guid systemSessionID, String sqlConnectionString)
The same error can also occur when applying 11.1.2.3.500. Oracle released 11.1.2.3.501 to fix the issue; unfortunately, they didn't release a patch for the 11.1.2.2 code line.
Thankfully there is a fix. Run the following SQL on your EPMA repository and your EPMA Dimension Server will start successfully again:
UPDATE DS_Property_Dimension
SET c_property_value = null
FROM DS_Property_Dimension pd
JOIN DS_Library lib
ON lib.i_library_id = pd.i_library_id
JOIN DS_Member prop
ON prop.i_library_id = pd.i_library_id
AND prop.i_dimension_id = pd.i_prop_def_dimension_id
AND prop.i_member_id = pd.i_prop_def_member_id
JOIN DS_Dimension d
ON d.i_library_id = pd.i_library_id
AND d.i_dimension_id = pd.i_dimension_id
WHERE
prop.c_member_name = 'DynamicProperties' AND
pd.c_property_value IS NOT NULL AND pd.c_property_value = '';
Sunday, 3 January 2016
Bursting Using OBIEE Agents Part 3: The Return of the Burst
Happy New Year!
This bursting series has got legs...maybe I could follow the Star Wars model and do prequels, sequels and bursting franchises?
In the final part of the agent bursting series we will piece all of this together using a batch script.
Remember in Bursting using OBIEE Agents Part 1 our Sales dashboard was filtered by a default region session variable (defined in an SQL table)?
Here is the secret sauce: we will use a batch script to modify the default region in the SQL table and then run the dashboard agent for each region. It's hardly elegant, but you can only work with the tools at your disposal, and this approach is easy to implement and extremely flexible with the output. You can add emailing to the loop, you could upload to SharePoint...the options are endless.
You can even use this javascript action script in an agent to call our batch script on-demand from the OBIEE browser:
/////////////////////////////////////////////////////////////
//
// launch_burst.js
//
// Launches a batch script on the server
//
/////////////////////////////////////////////////////////////
var WshShell = new ActiveXObject("WScript.Shell");
WshShell.Run("E:\\OBIEE_BURSTING\\Agent_Scripts\\Agent_Burst.bat");
So there we have it. A more robust version of this script (error trapping, file polling etc) is currently running in a Production environment bursting up to a thousand reports daily and the customer is very happy with the result.
Please feel free to post opinions, both good and bad. Maybe this will spark an idea for you to come up with something better? Share and let us know!
- Launch the bursting definition agent and write all our bursting regions to Region_Definition.txt.
- Read the Region_Definition.txt file and for each line:
- Create a temporary SQL script to update Default_Region SQL column for our "burster" user.
- Run the dashboard agent as "burster" (so the dashboard is filtered on the Default_Region set in the previous step).
- Rename Sales_Dashboard.pdf to Sales_Dashboard_<REGION>.pdf and copy to E:\OBIEE_BURSTING\Agent_Output
- Delete the temporary Sales_Dashboard.pdf
- Delete the temporary SQL script
- Loop round to 2nd line of Region_Definition.txt and repeat.
Here are the final ingredients:
- A batch scripting language of your choice.
- An Oracle or SQL Server client in order to run sqlplus/sqlcmd from the command line.
In my example I am using Windows Server 2012 and SQL Server <gasp!>, and my scripting language is Windows batch. Unlike me, I recommend that you step out of the 1990s and use a more up-to-date language!
The Batch Script
The first issue we have is that saschinvoke.exe doesn't provide a flag to define a password, so you need to echo the password and pipe the output to the saschinvoke command. There are lots of blogs on this, but Rittman Mead's was the earliest I could find, so I'll quote it here. The link is relatively old now but still useful, as you could obviously integrate this script with any 3rd-party ETL/scheduler tool like ODI.
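Incidentally, if you drive the process from a higher-level language rather than pure batch, you can feed the password to saschinvoke on stdin instead of echoing it on the command line. A minimal Python sketch (the -u/-i/-m flags mirror the batch script below; the `cmd` parameter is only there so the function can be exercised without an OBIEE install):

```python
import subprocess

def invoke_agent(agent_path, user, password, machine, cmd="saschinvoke"):
    """Launch an OBIEE agent from the command line, supplying the
    scheduler password on stdin rather than via an echo|pipe in a shell."""
    result = subprocess.run(
        [cmd, "-u", user, "-i", agent_path, "-m", machine],
        input=password + "\n",   # saschinvoke reads the password from stdin
        text=True,
        capture_output=True,
    )
    return result.returncode
```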
I'll break the script up for explanation purposes and then copy the full script at the end.
Run the Bursting Definition Agent to Create the Burst Definition File
Here is the first part of our script. At the top it sets a bunch of variables. It then runs our Region_Burst_Agent from the command line. Remember the agent writes our list of bursting regions to a text file.
@echo off
REM Set Bursting Variables
set SCRIPT_HOME=E:\OBIEE_BURSTING\Agent_Scripts
set BURST_DEFINITION=E:\OBIEE_BURSTING\Agent_Temp\Region_Definition.txt
set OBIEE_BATCH_HOME=E:\Oracle\OBIEE\Oracle_Home\bi\bifoundation\server\bin
set BURST_SOURCE=E:\OBIEE_BURSTING\Agent_Temp
set BURST_OUTPUT=E:\OBIEE_BURSTING\Agent_Output
echo ##################################################################
echo #### Create list of Regions to burst from ##
echo ##################################################################
echo Bursting started %date% %time% >> %SCRIPT_HOME%\bursting.log
REM Launch the Region_Burst_Definition agent to create the list of bursting regions
E:
cd %OBIEE_BATCH_HOME%
echo PASSWORD|saschinvoke -u bi_admin -i "/shared/Blog/Agent Burst/Region_Burst_Agent" -m epmdb:9511
timeout 5 /nobreak
Notice the timeout at the end of the script. I've added this as we need to wait for the bursting definition file to be written before we move to the next step. I advise you to add better logic here and poll for the file to be written before going to the next step.
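If you do want to replace the fixed timeout with polling, the idea is straightforward. Here is a rough Python sketch of the kind of logic I mean (the size-settling check is a crude heuristic, not a guaranteed way to know the writer has finished):

```python
import os
import time

def wait_for_file(path, timeout_secs=60.0, poll_secs=1.0, settle_secs=0.0):
    """Poll until `path` exists, instead of sleeping a fixed interval.
    If settle_secs > 0, also wait until the file size stops changing
    (a crude check that the writer has finished). Returns True when the
    file is ready, False if the timeout expires first."""
    deadline = time.monotonic() + timeout_secs
    while time.monotonic() < deadline:
        if os.path.exists(path):
            if settle_secs:
                size = os.path.getsize(path)
                time.sleep(settle_secs)
                if os.path.getsize(path) != size:
                    continue        # still growing; keep waiting
            return True
        time.sleep(poll_secs)
    return False
```

The same pattern can be done in batch with a goto loop around `if exist`, but it is far uglier.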
Update our Default Region in SQL Table
Once our region definition file is created our batch will read the file and for each line it will update the default region in the SQL table for the "burster" user:
echo ##################################################################
echo #### Begin bursting. Dashboards are written to %BURST_OUTPUT% ##
echo ##################################################################
REM Parse Region_Definition.txt file created by the Region_Burst_Definition agent.
REM Start For Loop
for /f "delims=" %%i in ('type %BURST_DEFINITION%') do (
REM Create temporary SQL statement to set default Region and execute
echo UPDATE OBIEE_Defaults SET Default_Region='%%i' WHERE User_Name='burster'; >> %SCRIPT_HOME%\update_region.sql
sqlcmd -S SERVER -d DATABASE -U USER -P PASSWORD -i "%SCRIPT_HOME%\update_region.sql"
Run the Dashboard Agent to Write Dashboard to PDF
With our default region set we can now call the Dashboard agent to write the dashboard to PDF. Notice again how I've put in a timeout to ensure that the dashboard is run and saved to PDF before we loop round to the next region in our Burst Definition file. I would recommend polling for the file to be written before moving to the next step.
REM Launch OBIEE Agent to create PDF
echo PASSWORD|saschinvoke -u bi_admin -i "/shared/Blog/Agent Burst/Sales_Burst_Agent" -m epmdb:9511
timeout 5 /nobreak
Rename the Dashboard with the Region Location. Copy to Output Folder
Here I rename the dashboard so it includes the location name. You can of course add a timestamp or anything else you want to the name/location.
REM Rename file with Region name and move to the BURST_OUTPUT folder
echo F|xcopy /Y %BURST_SOURCE%\Sales_Dashboard.pdf "%BURST_OUTPUT%\Sales_Dashboard_%%i.pdf"
REM Clean up temporary files
del /q %SCRIPT_HOME%\update_region.sql
del /q %BURST_SOURCE%\Sales_Dashboard.pdf
REM End For Loop
)
The last steps of this section clean up the temporary SQL script and the temporary Sales_Dashboard.pdf.
Finally Delete the Region Definition File
Once all the bursting has completed, the last step is to delete the file that contains our bursting definition.
REM Delete Region list file
del %BURST_DEFINITION%
echo Bursting finished %date% %time% >> %SCRIPT_HOME%\bursting.log
echo ##################################################################
echo #### Bursting complete. ##
echo ##################################################################
Complete Script
Here is the complete script. If you remove the commentary and variables there are only 14 lines of code.
@echo off
REM Set Bursting Variables
set SCRIPT_HOME=E:\OBIEE_BURSTING\Agent_Scripts
set BURST_DEFINITION=E:\OBIEE_BURSTING\Agent_Temp\Region_Definition.txt
set OBIEE_BATCH_HOME=E:\Oracle\OBIEE\Oracle_Home\bi\bifoundation\server\bin
set BURST_SOURCE=E:\OBIEE_BURSTING\Agent_Temp
set BURST_OUTPUT=E:\OBIEE_BURSTING\Agent_Output
echo ##################################################################
echo #### Create list of Regions to burst from ##
echo ##################################################################
echo Bursting started %date% %time% >> %SCRIPT_HOME%\bursting.log
REM Launch the Region_Burst_Definition agent to create the list of bursting regions
E:
cd %OBIEE_BATCH_HOME%
echo PASSWORD|saschinvoke -u bi_admin -i "/shared/Blog/Agent Burst/Region_Burst_Agent" -m epmdb:9511
timeout 5 /nobreak
echo ##################################################################
echo #### Begin bursting. Dashboards are written to %BURST_OUTPUT% ##
echo ##################################################################
REM Parse Region_Definition.txt file created by the Region_Burst_Definition agent.
REM Start For Loop
for /f "delims=" %%i in ('type %BURST_DEFINITION%') do (
REM Create temporary SQL statement to set default Region and execute
echo UPDATE OBIEE_Defaults SET Default_Region='%%i' WHERE User_Name='burster'; >> %SCRIPT_HOME%\update_region.sql
sqlcmd -S SERVER -d DATABASE -U USER -P PASSWORD -i "%SCRIPT_HOME%\update_region.sql"
REM Launch OBIEE Agent to create PDF
echo PASSWORD|saschinvoke -u bi_admin -i "/shared/Blog/Agent Burst/Sales_Burst_Agent" -m epmdb:9511
timeout 10 /nobreak
REM Rename file with Region name and move to the BURST_OUTPUT folder
echo F|xcopy /Y %BURST_SOURCE%\Sales_Dashboard.pdf "%BURST_OUTPUT%\Sales_Dashboard_%%i.pdf"
REM Clean up temporary files
del /q %SCRIPT_HOME%\update_region.sql
del /q %BURST_SOURCE%\Sales_Dashboard.pdf
REM End For Loop
)
REM Delete Region list file
del %BURST_DEFINITION%
echo Bursting finished %date% %time% >> %SCRIPT_HOME%\bursting.log
echo ##################################################################
echo #### Bursting complete. ##
echo ##################################################################
The Result
Our Sales dashboard has been saved as PDF to our output location and burst across all the regions we defined in our Burst Definition report.
Performance
Don't expect to burst out 1,000 reports per minute; there is a delay when you launch an agent via saschinvoke, and you need to wait for the dashboard to be written before moving on to the next one. However, you could use multiple users to run the process; using two user accounts could halve the time.
Remember, the object of this exercise is to burst our existing OBIEE dashboards, not to replace BI Publisher.
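The multiple-user idea above amounts to splitting the region list and running one copy of the script per account. A round-robin split is the obvious approach; sketched in Python (hypothetical helper — each parallel worker would need its own OBIEE user and its own row in the OBIEE_Defaults table):

```python
def split_regions(regions, n_workers):
    """Deal regions round-robin across n_workers burster accounts, so
    several copies of the batch script can run side by side, each
    updating the Default_Region row for its own user."""
    return [regions[i::n_workers] for i in range(n_workers)]
```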
Conclusion
Am I happy with this? Yes and no.
This is more complicated than it should be, it is not as fast as normal bursting methods, and I have to hard-code passwords in the script (although arguably, if someone is able to sniff around on your server, you've got bigger issues than password protection). You can also run into issues if agents start overlapping each other.
However, you can only work with the tools at your disposal. This is easy to implement and extremely flexible with the output: you can add emailing to the loop, you could upload to SharePoint...the options are endless.
You can even use this javascript action script in an agent to call our batch script on-demand from the OBIEE browser:
/////////////////////////////////////////////////////////////
//
// launch_burst.js
//
// Launches a batch script on the server
//
/////////////////////////////////////////////////////////////
var WshShell = new ActiveXObject("WScript.Shell");
WshShell.Run("E:\\OBIEE_BURSTING\\Agent_Scripts\\Agent_Burst.bat");
So there we have it. A more robust version of this script (error trapping, file polling, etc.) is currently running in a Production environment, bursting up to a thousand reports daily, and the customer is very happy with the result.
Please feel free to post opinions, both good and bad. Maybe this will spark you to come up with a better idea? Share and let us know!