Uncategorized

A while back, our oraclenerd sent out a question on Twitter about presentation server cache settings.

rnm1978 and I answered his distress call (emails not included; Twitter's quite useless for sending instanceconfig.xml subsets ;-)).

However, I want to make clear in this post which part of the presentation server cache is actually managed by which parameter, since there's a little gotcha in there.
So basically there are four parameters for the presentation server's instanceconfig.xml (see also John's PDF extract). By default they're not explicitly noted in the file, so you need to add them in order to change them. Here's a little sample with already modified paths:

Now normally the cache files for the presentation server reside in the tmp directory, within the respective subdirectories sawcharts, sawrptcache and sawvc, with the XML cache files lying in the tmp folder itself. Note that I cleaned out all files in the directory for the screenshot below to have a clean testing baseline:

Restarting the presentation server with the additional XML above gives me the following results:

Chart cache:

Report cache:

State pool cache:

XML cache:

XML cache empty? Aha, seems there’s an issue. Let’s check the original path:

The XML cache files are still here. Not really what we wanted. So the XMLCacheDefault CacheDirectory seems to be useless. At this point I actually created an SR on MyOracleSupport to confirm a hunch I had. Result:
“With regards the parameter , I have researched this and have not found any documentation on it, internally or published. Infact through my research it appears that it has never been the case that this parameter is used to determine the path of the nqsxxxx temp files.
Previous versions of OBI i.e. Siebel Analytics document that the temp files generated by the Analytics Web server could be redirected by using the parameter ‘TempDir’ in Instance config or as a registry variable, in conjunction with the work directory path.”

So off to the NQSConfig.ini and change the WORK_DIRECTORY_PATHS:

#WORK_DIRECTORY_PATHS = "C:\Oracle\OracleBIData\tmp";
WORK_DIRECTORY_PATHS = "C:\DataSources\Cache\tmp";
SORT_MEMORY_SIZE = 4 MB ;
SORT_BUFFER_INCREMENT_SIZE = 256 KB ;
VIRTUAL_TABLE_PAGE_SIZE = 128 KB ;

After that, restart all the services and check again where the cache files went.

OK, now we know how to get the rest of the cache files written elsewhere as well.

Cheers!


A question I've never covered on my blog yet, since it's quite basic and should normally just work whenever you import an Essbase cube:
How do I get a fact to display in the BMM layer after importing a cube?

The question came up here and basically has its origin in a bizarre cube outline. Correctly defined cubes will never face this (and hence no one has covered it yet, though I've already written about changing the measure hierarchy).

Let’s look at an import without a working “fact” (in OBIEE terms…).

No fact columns… but why? Let's open the physical cube table and check the properties of the hierarchy “Account” (my measure hierarchy).

Now that’s just wrong. The “fact” must be the measure hierarchy. Let’s change that.

Better. Now we're still missing the actual fact columns (i.e. the members of the measure hierarchy in the Essbase outline). In my case that should be Costs, Gross Sales, Net Sales, Returns and Sales.


Repeat that for the other four.

Now drag-and-drop the whole thing to your BMM layer.

Done.


First of all a little disclaimer: No, I haven’t died in the meantime. Other topics took precedence over writing blog posts.

Back on track, I recently received a question from Jeff Tam, who also works on a hybrid OBIEE-Essbase solution. Funnily enough, just this morning there was another question on this on the OTN OBIEE forum here. As a follow-up to my post here, both Jeff and the poster on OTN wanted to know the detailed steps to take to manually extend dimensional hierarchies of already imported Essbase outlines in the RPD. Oracle has their own MOS note on this issue here.

“The workaround that the customer is using now is to copy this into a notepad, correct and paste it back.” is a bit too minimalistic for my taste, so I'll go into more detail here on the concept from my previous post.

We’ll start off with a 5 generation product hierarchy within an Essbase cube that’s already in the RPD.

In Essbase, the outline for the product hierarchy has grown from 5 levels to 6. In order to represent this, we repeat the steps outlined earlier and

a) create the new physical cube column “Gen6,Product”
b) add the new physical level “Gen6,Product” to the existing “Product” hierarchy object

Now that we have the structure built in the RPD, off to correcting the external level number (which isn’t possible through the Administration Tool!). For this we save the RPD and close the Admin Tool.

Open a command window and navigate to your server\Bin folder. Then run nqudmlgen to create a txt representation of the RPD in question (paths and filenames need to be adjusted to your needs, of course):

C:\Oracle\OracleBI\server\Bin>nqudmlgen.exe -u Administrator -p Administrator -r "C:\Oracle\OracleBI\server\Repository\Showcase_002.rpd" -o "C:\Oracle\OracleBI\server\Repository\UDMLShowcase_002.txt" -8 -N

Open the resulting txt file in your trusty notepad and search for DECLARE PHYSICAL LEVEL "Sample Accounts"."LOGIC_A".."LOGIC"."Product"."Gen6,Product"

Modify the "LEVEL NUMBER EXTERNAL" line to reference the correct external level. WARNING: counting starts at 0! So Gen6,Product is

LEVEL NUMBER 5 EXTERNAL "Gen6,Product"

Save the txt file. Switch back to your command window and run nqudmlexec to recreate the RPD from the modified definitions:

C:\Oracle\OracleBI\server\Bin>nqudmlexec.exe -u Administrator -p Administrator -I "C:\Oracle\OracleBI\server\Repository\UDMLShowcase_002.txt" -O "C:\Oracle\OracleBI\server\Repository\Showcase_003.rpd" -8

After that you're done, and the new dimensional level can be accommodated normally in your BMM and presentation layers.
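Since the only thing to change in the UDML text is that one level number, the notepad step can also be scripted. A minimal sketch (my own helper, not an Oracle tool), relying on the fact that generation numbers count from 1 while external level numbers count from 0:

```python
import re

def fix_external_levels(udml_text):
    """Rewrite every 'LEVEL NUMBER n EXTERNAL "GenN,...' line so that the
    external level number is GenN - 1 (generations start at 1, levels at 0)."""
    pattern = re.compile(r'LEVEL NUMBER \d+ EXTERNAL "Gen(\d+),')
    return pattern.sub(
        lambda m: 'LEVEL NUMBER %d EXTERNAL "Gen%s,' % (int(m.group(1)) - 1, m.group(1)),
        udml_text)
```

Run it over the nqudmlgen output before feeding the file back to nqudmlexec; it leaves correctly numbered levels untouched.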

Cheers,
Christian


I stumbled upon a post from Shiv Bharti and thought I'd expand a bit on the idea. First of all, to avoid the need for the “Advanced” tab, one of the “don't do this at home, kids” parts of Answers. Secondly, because just recently there was yet another question on OTN asking for advice on how to assign / force new values to a session variable during runtime.

Let's do the example with an Essbase cube as a data source to showcase that this approach works with any data source. First, I create an initialization block against an Oracle 10gR2 source so I can write a simple dummy select from DUAL indicating CoFSM42 as my default Essbase server. CoFSM41 is a backup instance, i.e. the Essbase server I want to switch to in order to see what the backup contains in terms of data loads (in this example, 42 contains data up until today, whereas 41 is last week's backup, so it has no data for this week).


select 'CoFSM42' from DUAL

Of course you can implement more elaborate solutions with control tables holding the servers and schemas of your different environment and instances.
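For instance, the init block select could read from such a control table instead of DUAL; the table and column names below are pure assumptions for illustration:

```sql
-- Hypothetical control table; names are assumptions, not from any product:
SELECT essbase_server
  FROM env_control
 WHERE environment = 'PROD';
```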

In the connection pool of my Essbase source, I change the “Essbase Server” parameter to “VALUEOF(NQ_SESSION.Essbase_Server)”.

After starting the server, let's do a quick check of the data through Answers using the approach Shiv proposed. First, I create the request on my standard subject area, which currently points to CoFSM42 due to the session variable.

I see that the data I entered for this week is present. Using the SET VARIABLE prefix, I change the source to CoFSM41.
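For reference, the prefix goes in front of the request's logical SQL in the Advanced tab; the SELECT below is just a placeholder for the actual request:

```sql
SET VARIABLE Essbase_Server='CoFSM41';
SELECT ... -- the request's own logical SQL, unchanged
```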

Ok, that’s empty. Now off to making this switch available on the dashboard through a prompt. The prompt is a fake prompt simply unioning my two server names. Here again, you may use a control table which holds your servers.

The important bit is to choose “Request Variable” under “Set Variable” and to enter “Essbase_Server” (my session variable used in the connection pool) as the variable name. Now I combine the prompt and the request in a dashboard to see the effect the “Set Variable” has on the request I've built. Bear in mind that there is no “is prompted” specification or filter on that request apart from my week specification.

Here’s the results prompted with the current server:

And here the ones for the backup server:

So the prompt nicely switches between the different data sources for us, without the need for Answers access or, within Answers, the need to play with the query prefix. Plus, it has shown the use of prompts to change the values assigned to session variables during runtime.

Now for all those who were hoping for another OBIEE/Essbase post or are thinking about using this to switch cubes easily in their architecture:
Unfortunately this approach can NOT be used to easily switch between cubes. And who’s to blame? The substitution variables.

Essbase substitution variables, when defined on database level (“cube” in OBIEE terminology), arrive in the form of “server:application:database:variable”. E.g.: “CoFSM42:Sample:Basic:vCurQtr”.
The “CoFSM42” bit can’t be switched out using a variable since the variable name is always interpreted as a literal string. Taking the example from above, doing something like “VALUEOF(NQ_SESSION.Db_Server):Sample:Basic:vCurQtr” won’t work since the variable is then actually called “VALUEOF…”

Pity, since that would make your development and testing of different Essbase sources extremely flexible.

Cheers,
C.


UDML constantly keeps popping up in conversations, questions I receive and – as a quick search shows – on the OBIEE OTN forum. So before I go on with this post, a reminder: UDML is NOT supported as an rpd modification mechanism! Everything you do is at your own risk.

Right-o. I’d like to tie my post to the official OBIEE-Essbase modelling guide which can be found here.

Page 8, paragraph 4.1 “Subsequent Changes to the Essbase Outline” mentions the following:

“Cube structure changes (that is, adding or deleting dimensions, and levels) require either a re-import of the cube, or manual modification to the BI Server physical metadata objects to reflect changes.”

This is something quite a few people have contacted me about, and honestly, I doubt there's any case that really justifies re-importing a cube if you know your way around the Admin Tool and UDML. Even though it's an Essbase source! (I'll stick to using the term “cube” to denominate Essbase “databases” for this post.)
One thing needs to be noted though. It’s a small thing, but it basically forces you to use UDML rather than manual modification through the Admin Tool.

Let's start with a basic cube which I had already imported into my RPD while I was still developing on the Essbase side. So far – where OBIEE is concerned – it only consists of several accounts which I can analyze by time.

On the Essbase side, the cube has grown somewhat and I’ve added my “Scenario” dimension.

To get this into OBIEE, I have two possibilities: re-import of the cube or manual creation of the dimension in the physical layer. Not wanting to lose my work on the BMM and Presentation layers, I choose the latter.

Right-click the “Physical Cube Table” object and select “New Object” -> “Hierarchy”.

Then create two new “Physical Cube Columns” below the physical cube table:

Now I have the hierarchy and the two columns:

Next we create the actual hierarchy tree out of them. “New Physical Level”:

And add the column:

And the same thing for level 2 giving us this:

Now this structure is correct, usable and transformable into a corresponding business model:

However, in the background one little thing goes wrong which can give you a headache in Answers… especially since tracking down the source of the weird errors this produces is a real pain. I have to admit it took me a while to figure it out.

Let’s take both the time and the scenario hierarchy and copy+paste them into a text editor.

Looking at the top level of the exported UDML, the two hierarchies don't differ:

DECLARE HIERARCHY "Demo"."Sample".."CustomDemo"."Time" AS "Time" UPGRADE ID 2161957949 HAVING
(
"Demo"."Sample".."CustomDemo"."Time"."Gen1,Time",
"Demo"."Sample".."CustomDemo"."Time"."Gen2,Time",
"Demo"."Sample".."CustomDemo"."Time"."Gen3,Time",
"Demo"."Sample".."CustomDemo"."Time"."Gen4,Time",
"Demo"."Sample".."CustomDemo"."Time"."Gen5,Time",
"Demo"."Sample".."CustomDemo"."Time"."Gen6,Time" ) MEMBER TYPE ALL EXTERNAL "Time"
FULLY BALANCED
BELONGS TO TIME DIMENSION
DIMENSION UNIQUE NAME "Time" TYPE 1
ALIASES NOT UNIQUE
PRIVILEGES ( READ);
DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Time"."Gen1,Time" AS "Gen1,Time" UPGRADE ID 2161959167 HAVING
(
"Demo"."Sample".."CustomDemo"."Gen1,Time" )
KEY "Demo"."Sample".."CustomDemo"."Gen1,Time"
LEVEL NUMBER 0 EXTERNAL "Gen1,Time"
PRIVILEGES ( READ);

DECLARE HIERARCHY "Demo"."Sample".."CustomDemo"."Scenario" AS "Scenario" UPGRADE ID 2161960605 HAVING
(
"Demo"."Sample".."CustomDemo"."Scenario"."Gen1,Scenario",
"Demo"."Sample".."CustomDemo"."Scenario"."Gen2,Scenario" ) MEMBER TYPE ALL EXTERNAL "Scenario"
FULLY BALANCED
DIMENSION UNIQUE NAME "Scenario" TYPE 3
ALIASES NOT UNIQUE
PRIVILEGES ( READ);
DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Scenario"."Gen1,Scenario" AS "Gen1,Scenario" UPGRADE ID 2161960612 HAVING
(
"Demo"."Sample".."CustomDemo"."Gen1,Scenario" )
KEY "Demo"."Sample".."CustomDemo"."Gen1,Scenario"
LEVEL NUMBER 0 EXTERNAL "Gen1,Scenario"
PRIVILEGES ( READ);

Looking at the two respective extracts for the second level, we see the difference:

DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Time"."Gen2,Time" AS "Gen2,Time" UPGRADE ID 2161959169 HAVING
(
"Demo"."Sample".."CustomDemo"."Gen2,Time" )
KEY "Demo"."Sample".."CustomDemo"."Gen2,Time"
LEVEL NUMBER 1 EXTERNAL "Gen2,Time"
PRIVILEGES ( READ);

DECLARE PHYSICAL LEVEL "Demo"."Sample".."CustomDemo"."Scenario"."Gen2,Scenario" AS "Gen2,Scenario" UPGRADE ID 2161960614 HAVING
(
"Demo"."Sample".."CustomDemo"."Gen2,Scenario" )
KEY "Demo"."Sample".."CustomDemo"."Gen2,Scenario"
LEVEL NUMBER 0 EXTERNAL "Gen2,Scenario"
PRIVILEGES ( READ);

For the imported hierarchy “Time”, the “LEVEL NUMBER EXTERNAL” is correctly incremented and stored as “1” (and in fact represents the level number in Essbase) while for the manually created hierarchy “Scenario” the external level number stayed at “0”.
If you have hierarchies with more than 2 levels, each level from 1 to N has an external level number of “0”.
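A quick way to spot affected declarations in exported UDML is a sketch like this (a helper of my own, not an Oracle tool; it assumes the GenN,Dimension naming pattern shown above, where GenN should carry external level number N-1):

```python
import re

def find_wrong_external_levels(udml_text):
    """Return (level name, found, expected) for every PHYSICAL LEVEL whose
    external level number doesn't match its Essbase generation minus one."""
    hits = []
    for m in re.finditer(r'LEVEL NUMBER (\d+) EXTERNAL "Gen(\d+),([^"]+)"', udml_text):
        found, gen, dim = int(m.group(1)), int(m.group(2)), m.group(3)
        if found != gen - 1:
            hits.append(('Gen%s,%s' % (m.group(2), dim), found, gen - 1))
    return hits
```

Running it over the extract above would flag Gen2,Scenario (found 0, expected 1) and leave Gen2,Time alone.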

Through the Admin Tool, there is no way for you to affect the external level number, so UDML is your only choice. In all honesty, I normally write my new dimension hierarchies – which need to be reflected in the rpd due to cube changes – directly in a text editor: starting from an existing hierarchy which I copy and paste, I write the UDML to fit the Essbase outline and then adapt the external level number to match the real Essbase level number.

With that problem out of the way, there's really no cube outline change that forces you to re-import the whole thing.

So much for today. Until next time!


I realized this one while importing a rather large cube on a test machine. After the initial import of the cube definitions into the physical layer, I pulled everything over into the BMM layer and saw that dimensions I’d expect were missing completely from the business model. Checking back on the physical layer I saw the corresponding hierarchies were missing as well.

Re-importing and fumbling around didn’t resolve this while doing a cross-check import on my laptop produced a correct representation in both the physical layer and the BMM layer.

Luckily, it seems I wasn't the only one hitting this issue, since a Metalink search yielded document 872342.1. The issue is that the Essbase API doesn't find enough free ports to import the outline successfully.

Workaround: Open the registry and navigate to:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters

Modify or create the DWORD value “MaxUserPort” and set it to 65534. Then apply the changes, start the Admin Tool again and re-run the import.
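The same change can be made from an (elevated) command prompt using the standard reg.exe syntax, shown here as a sketch:

```
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters" /v MaxUserPort /t REG_DWORD /d 65534 /f
```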

Cheers,
Christian


I’m pleased to announce that the RittmanMead training days for Oracle BI are now officially open for registration.

Speakers at this event will be Mark Rittman, Venkat Janakiraman and myself.

Agenda:

Day 1
o Oracle BI and EPM architecture overview – Mark Rittman
o Oracle BI EE Data Modeling against DW and 3NF sources – Mark Rittman
o Oracle BI Delivers + Integration with Java and BI Publisher – Venkat Janakiraman
o What’s new in Oracle BI, DW and EPM from Oracle Open World – Mark Rittman

Day 2
o Oracle BI EE Data Modeling against Essbase – Venkat Janakiraman
o Leveraging MDX functions and calculations in OBIEE – Christian Berg
o Integrating Security across OBIEE and EPM – Venkat Janakiraman
o I can do this in Hyperion – how do I do it in OBIEE? – Christian Berg and Venkat Janakiraman

Day 3
o OBIEE Systems Management with OEM BI Mgmt Pack – Mark Rittman
o OBIEE Configuration Management Best Practices – Christian Berg
o ODI functionality in Oracle BI Applications – Mark Rittman
o ODI Integration with Essbase, Planning and Oracle EPM Suite – Venkat Janakiraman

Once more, here’s the link to the full event details:
http://www.rittmanmead.com/trainingdays2009/

You can find the registration page at http://www.regonline.co.uk/rmtrainingdays2009

Cheers,
Christi@n


I was just testing multiple web catalogs when I realized that it’s a bad idea to put version numbers into the web catalog name.

As an example, I have my web catalog “samplesales_paint_v1.3” sitting ready in OracleBIData\web\catalog and my instanceconfig.xml looking like this:

<?xml version="1.0" encoding="utf-8"?>
<WebConfig>
  <ServerInstance>
    <DSN>AnalyticsWeb</DSN>
    <CatalogPath>F:\OracleBIData\web\catalog\samplesales_paint_v1.3</CatalogPath>
  </ServerInstance>
</WebConfig>

Starting up the server will not load the referenced catalog, but rather create a new one from scratch: “samplesales_paint_v1”

And the Oracle BI Presentation Services Administration duly notes:

Physical Presentation Catalog Path \\?\F:\OracleBIData\web\catalog\samplesales_paint_v1\root

The sawlog0.log reads as follows:

Type: Error
Severity: 40
Time: Wed Jul 21 23:46:36 2009
File: project/webcatalog/localwebcatalog.cpp Line: 1507
Properties: ThreadID-4328
Location:
saw.catalog.local.loadCatalog
saw.webextensionbase.init
saw.sawserver
saw.sawserver.initializesawserver
saw.threads

Could not load catalog F:\OracleBIData\web\catalog\samplesales_paint_v1.3. Either it does not exist or insufficient permissions.
—————————————
Type: Warning
Severity: 40
Time: Wed Jul 21 23:46:36 2009
File: project/websubsystems/httpserverinit.cpp Line: 49
Properties: ThreadID-4328
Location:
saw.catalog.local.loadCatalog
saw.webextensionbase.init
saw.sawserver
saw.sawserver.initializesawserver
saw.threads

Creating Catalog F:\OracleBIData\web\catalog\samplesales_paint_v1.3.
—————————————

The log is incorrect on both counts. “F:\OracleBIData\web\catalog\samplesales_paint_v1.3” does exist, and the warning message may use the correct name but actually creates the folder “F:\OracleBIData\web\catalog\samplesales_paint_v1”.

XML normally accepts “.” inside element content, so I guess this is a legacy fragment from Siebel Analytics versions where the web catalog was a .webcat file. Why? Well, using this element:

<CatalogPath>F:\OracleBIData\web\catalog\paint.webcat</CatalogPath>

makes the server start with my “paint” folder 😉
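The observed behavior looks exactly like a file-extension strip, which is easy to illustrate (this merely mimics what the presentation server appears to do, not Oracle's actual code):

```python
import os.path

def apparent_catalog_folder(catalog_path):
    """Mimic the observed behavior: everything after the last dot in the
    CatalogPath is treated as a file extension and dropped."""
    return os.path.splitext(catalog_path)[0]
```

Both examples from this post reproduce nicely: “samplesales_paint_v1.3” loses the “.3”, and “paint.webcat” collapses to “paint”.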

Any comments on this are welcome.

Cheers,
Christi@n