Thursday 17 September 2009

Set up drill through from a crosstab to a detail list report

This post describes how to set up drill through from a source crosstab report to a target detail list report by clicking on the measure values within the crosstab.

Create a target detail list drill through report.

1. Create a new list report. Ensure that you include in this report (either on the report itself or in the underlying query) the same query items that you intend to use as the axes of the crosstab. For example, if your crosstab shows product name by order year, include both of these in your query.

2. Test that your report runs correctly.

3. Add two filters to your report. These act as prompts that allow the source report to supply the filtering values. They must be based on the same query items as the crosstab axes. For example (a fuller version with model paths is sketched after step 4):
product_name = ?product?
order_year = ?year?

4. Save the target drill report
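For reference, once full model paths are in place, the two detail filters from step 3 might look like the sketch below. The package and query item paths are illustrative only (they assume something like the GO Sales samples); use whichever query items your own crosstab axes are built on.

[Sales (query)].[Products].[Product name] = ?product?
[Sales (query)].[Time].[Order year] = ?year?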

Create a source crosstab drill through report

Create a new crosstab report using the same query items (the ones filtered in the target report) for your axes. The measure can be anything you like.

Select the measure cell, right-click and select 'Drill Throughs'.

Click the New button.
Click the ellipsis (…) next to Report and select the target report you just created.
Set the Action to 'Run the report'.
Click the 'pencil' button at the bottom of the window. This loads the required parameters from the target report (the filters you set up earlier).
Set the method for both parameters to 'Pass data item value', then set the appropriate data item from your crosstab query as the value to pass through to each target report filter (a sketch of the finished mapping is shown below).
Click OK a couple of times and save the report.
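At this point, just before clicking OK, the parameter mapping should look roughly like the sketch below. The parameter names follow the earlier example and the data item names are illustrative; yours will match your own crosstab query.

product : Pass data item value : Product name
year : Pass data item value : Order year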

You’re done.

Run the source report and click on a measure value to drill through. This opens the target report, passes through the axis data items for the measure intersection you selected and filters the target report accordingly.


Friday 14 August 2009

A summary of migration from Series 7 to Series 8

Migration from Series 7 to Series 8 always raises lots of questions because of the number of different versions of software in the source environment and the complexity of the reports.

Portal
In Series 8, Upfront is replaced by Cognos Connection as the portal used to access Cognos content.

Transformer & PowerPlay
Transformer models can be used as is, but they will need to be placed on the application server. Cube builds can be batched, but you will need to use cubeswap to disable and re-enable the cubes after building/copying them. Note that the authentication source may change between environments, so make sure that any security that is set up is tested.
I am told that PowerPlay reports can be migrated too, but they will need to be exported as MDX (portable) files before import; I have never done this myself, preferring to rebuild reports directly in Analysis Studio. I'm unaware whether PowerPlay reports in PPES can be migrated.

Impromptu reports
There is a formal migration route for Impromptu reports into Cognos 8. You will need to download the migration tools from Cognos themselves. In a similar manner, Impromptu catalogues are migrated into Framework Manager models. However, it should be noted that this method is rather hit and miss.

ReportNet reports
ReportNet reports should migrate directly from ReportNet to Cognos 8, although there are a number of known issues with migration and all reports should go through full regression testing prior to release. It is highly likely that reports will have at least one issue that needs to be resolved. If possible, create a Cognos 8 environment separate from your ReportNet environment; in this way reports and packages can be migrated incrementally. It is possible to have Cognos 8 and ReportNet running on the same application server, but I would advise against it.

I hope this helps. For more details please see the Cognos knowledgebase and documentation. It's all available without login from the new IBM Cognos support website.

best regards

Phil

Tuesday 28 October 2008

Granularity & Determinants in Framework Manager

Hi,

Following a question from Divya last week, this post covers granularity and determinants in Framework Manager.

Granularity is a simple concept describing the level to which data relates.
Imagine sales areas:

Level 1: Country
Level 2: Sales Area
Level 3: State
Level 4: City
Level 5: Store
Level 6: Department
Level 7: Employee

We can summarise sales volume (quantity) at each of the levels above.
The level at which we report is the grain (or granularity) of the table.

Tables or query subjects in Framework Manager often have multiple levels (grains), and you may have to join them to another table with a different grain.

An example of this is forecasts or budgets.
Using our example above, suppose we want to join our sales area table to our budgets table. Unfortunately the grain of the sales area table is employee, but the budgets have been set at store level.

In Framework Manager, if we join directly from store in the sales area table to store in the budget table, the budget value will be repeated once for every employee in that store.

New Query Subject (join on Store with no determinants set):
Sales Area.City
Sales Area.Store
Budget.Month
Budget.Sales_Target

Result:

[Screenshot: the query output, showing four rows for the Bayside store, each repeating the same Sales_Target value]

As there are four employees at the Bayside store, the model query subject is forced to return four rows, repeating the sales budget on each.

To correct this, Framework Manager has to be told what level of granularity the budget applies to. This is done using determinants.
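In SQL terms, a rough sketch of the difference is below. The table and column names are purely illustrative, and the second statement only approximates the effect of a 'Group By' determinant at the Store level; Framework Manager generates its own SQL.

-- Without determinants: the join happens at employee grain, so the budget repeats
select sa.CITY, sa.STORE, b.BUDGET_MONTH, b.SALES_TARGET
from SALES_AREA sa
join BUDGET b on b.STORE = sa.STORE

-- With a Store determinant, Sales Area is effectively collapsed to the store grain before the join
select sa.CITY, sa.STORE, b.BUDGET_MONTH, b.SALES_TARGET
from (select distinct CITY, STORE from SALES_AREA) sa
join BUDGET b on b.STORE = sa.STORE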

In our example we need to set the determinants on the Sales Area query subject.
Right-click the query subject and select the Determinants tab.

One determinant should be created for each level of granularity as shown below.
Create the determinant by clicking Add.
Rename the determinant and then drag the identifying column (the key) into the Key window (bottom left). If the level has any attributes that exist at the same level of granularity, these should be added to the Attributes window.
If the new determinant you have created is the lowest level grain (Employee in this case), the 'Uniquely Identified' check box should be ticked.
The 'Group By' check box should be ticked if the level is ever to be used for aggregating values.

Once you have created all of the determinants for the query subject, ensure that they are in the correct order (use the up and down buttons to sort them).

[Screenshot: the Determinants tab, with one determinant per level ordered from the highest grain down to Employee]

After setting the determinants click OK to save the changes.
When the model query subject is now run, only one row of data is returned:

[Screenshot: the query output after adding determinants, a single row for the Bayside store with one Sales_Target value]

I hope this helps

Phil


Wednesday 30 January 2008

Dynamic Security in ReportNet & Cognos 8

There are three methods of setting up dynamic security, all of which are done in Framework Manager.

Row level/Data Level Security
This is the simplest form of security and is achieved by using a fixed filter for each user class or group of user classes.
1. Select the query subject, right-click and select 'Specify security filters'.
2. Click the 'Add Groups' button.
3. Select the user classes you require from the LDAP/AD/NTLM namespace. (For example 'Japan users')
4. Click in the box below 'Filter' and select 'Create/Edit Embedded'. This will open a filter definition window.
5. Enter a filter definition and click OK. For example: Country_Code = 'JP'
6. Click OK again to exit the set up window.
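With more than one group, the finished 'Specify security filters' grid might end up looking something like the sketch below; the group names and model path are illustrative only:

Japan users : [Business Layer].[Countries].[Country_Code] = 'JP'
Germany users : [Business Layer].[Countries].[Country_Code] = 'DE'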

The security filter becomes active once the package that the query subject belongs to is published. At runtime, when any user in the applicable user class runs a report based on the query subject, the query results are filtered using the filter definition you specified. (In my example, all users in the user class 'Japan users' will only be able to view data with a country code of 'JP'.)

Rather than setting up this security many times, try to set it up on a common or conformed dimension query subject.

Dynamic Filtering (CSVIdentityNameList)

This security setup is similar to the first option, but is more dynamic in that the filter definition does not have to be altered every time the user class structure changes. This option is also useful where a large number of user classes are used.

This method can be used against table-held entries or against a calculated query item, and works by matching the user classes in the LDAP to an entry in the table (or query item).

1. Select the model query subject that you wish to set up the security on.
2. Right-click and select 'Edit Definition'.

3. Click on the filter button/tab (to add a new filter)
4. Add the required query item to the filter expression.
5. Add the CSVIdentityNameList macro function to the expression. The end result should look like the following sample:
([Presentation Layer].[Countries].[Country_Code] in (#CSVIdentityNameList()# ))
6. Add any additional filters using an 'OR' to ensure that Administrators and report developers are not filtered:
('Administrators' in (#CSVIdentityNameList()# ))
OR
('Report Administrators' in (#CSVIdentityNameList()# ))
OR
([Business Layer].[Countries].[Country_Code] in (#CSVIdentityNameList()# ))

7. Click OK twice to enable the filter.

The security filter becomes active once the package that the query subject belongs to is published, and at runtime the filter is applied whenever the query subject is used. The filter works because the CSVIdentityNameList function returns all user classes that the runtime user is a member of, which are then matched against the calculation or table entry. Again, this method is best used where large volumes of user classes are involved, such as account numbers.
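To see why this works, suppose the runtime user belongs to a group called 'Japan users' (an invented name). At run time the macro expands to a quoted, comma-separated list of the user's identity names (account, groups and roles), so the filter from step 5 is evaluated roughly as:

([Presentation Layer].[Countries].[Country_Code] in ('Phil', 'Japan users', 'All Authenticated Users'))

Only rows whose value matches one of those names survive, which is why the table entry or calculated query item must hold the user class names you want to filter by.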

Burst recipient (CAM ID)
This method of security only hides burst report output within Cognos Connection, but it is useful when users from many different classes access the same folder to view report output. It should be noted that users should only have read and traverse access to the folder; the report burst should be run by a more senior account such as a directory administrator or report administrator.

1. Within Framework Manager add a calculated query item to the query subject that either contains a list of the user classes or where the user class can be calculated.
2. Use a static case statement to define the calculation (an example is shown below):

CASE ([Business Layer].[Countries].[Country_Code] )
WHEN 'JP' THEN ('CAMID("EP Series7:r:authid=3771403238")')
WHEN 'MA' THEN ('CAMID("EP Series7:r:authid=3771403548")')
WHEN 'NL' THEN ('CAMID("EP Series7:r:authid=3049983260")')
WHEN 'NZ' THEN ('CAMID("EP Series7:r:authid=197856540")')
WHEN 'PO' THEN ('CAMID("EP Series7:r:authid=2680884508")')
WHEN 'RO' THEN ('CAMID("EP Series7:r:authid=3872066844")')
WHEN 'SE' THEN ('CAMID("EP Series7:r:authid=4123725084")')
WHEN 'SI' THEN ('CAMID("EP Series7:r:authid=13307164")')
WHEN 'ZA' THEN ('CAMID("EP Series7:r:authid=2999651612")')
ELSE (NULL)
END


3. Use the recipient user class CAM ID as the CASE statement result for the user class or calculation. (This can be found in Cognos Connection by selecting the properties of the user class from the directory.)
4. Publish the package that includes the altered query subject.
5. Within the report add the new query item that uses the CASE statement.
6. Change the burst options of the report so that the burst recipient is based upon the new query item. (The report should be burst so that the report output is logically linked to the user class selected, i.e. Japan users can only see the Japan output of the burst report.) Make sure that the report bursts to a directory entry.
7. Run the report and burst it.

As the administrator/directory administrator you should be able to check that the report burst correctly and that all report outputs have been created.
Check that the security works by logging in to Cognos Connection as a user belonging to a user class listed in the CASE statement. When you select the option to show multiple outputs, you should only see the output pertaining to that user class.



Monday 14 May 2007

Single or Multiple LDAP in multiple environments


Imagine the scenario:
You have a development, a test and a live environment, all set up according to theory. Each environment is standalone (for security reasons) and is only accessible over HTTP.

In this scenario you need an LDAP directory (such as Active Directory or Sun ONE) in each environment. This ticks all the boxes for security. Great - sit back with a big grin on your face.

You now start working with the environments and follow best practice: developing solutions in your development environment, migrating to your test environment to UAT the solution and then migrating again to your live environment, where everything will work perfectly - right? Wrong!

Because you have three LDAPs, your Cognos applications have three different sets of internal IDs for each LDAP-based object. The newer Cognos applications such as ReportNet and Series 8 can work around this, but if you use Analyst, Contributor, PowerPlay Enterprise Server or PowerPlay Transformer you are stuffed.

There are two methods to resolve the problem and one method to correct the architecture. First the workarounds:

Workaround 1: Because all the internal user IDs are stuffed in your thick-client applications, you have to re-import your LDAP objects (such as users and user classes) into your application and rebuild the security in the target environment. (Lots of work.)

Workaround 2: In theory you should be able to create exports from your LDAP using Cognos Access Manager and then migrate these LAE files into the target environment's LDAP. If you do this at the same time as the application migration you should be OK. Oh, and you MUST overwrite your target LDAP completely. (This doesn't feel like a comfortable option, does it? Especially when you have 1500 users happily using the system in the live environment.)

The architecture resolution:
The answer is to do away with the three LDAPs and in their place use one LDAP that is common to all three Cognos environments. By doing this you can migrate as often as you like from environment to environment without having to do any remedial work. Why? Because the internal user IDs held within the applications are always the same. The only snag is getting the server support team to open a port in each environment to allow the Cognos applications to access the common LDAP. (Don't use 389 if you can help it, as that's the default.)

If you opt for the architecture resolution you will save yourself weeks of work every year. Remember, though, that if you already have one LDAP in each environment you will have to rebuild the security at least once when you change to the single LDAP.

I hope this saves you as much time as it has saved some of my clients.

Phil Thompson
http://www.digitalviper.co.uk

Friday 23 March 2007

List Tables & Columns

The following SQL can be used to list tables and column names

SQL Server

select TABLE_CATALOG,
TABLE_SCHEMA,
TABLE_NAME,
COLUMN_NAME,
DATA_TYPE,
CHARACTER_MAXIMUM_LENGTH
from information_schema.columns


Oracle

select OWNER,
TABLE_NAME,
COLUMN_NAME,
DATA_TYPE,
DATA_LENGTH,
DATA_PRECISION
from ALL_tab_cols;
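
If you only want the columns for a single table, add a filter. 'ORDERS' below is just a placeholder for your own table name (SQL Server syntax shown; the Oracle version is similar using ALL_TAB_COLS):

select COLUMN_NAME,
DATA_TYPE,
CHARACTER_MAXIMUM_LENGTH
from information_schema.columns
where TABLE_NAME = 'ORDERS'
order by ORDINAL_POSITION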

Thursday 22 March 2007

Welcome to the Cognos Blogger

Hi and welcome to the Cognos blog.
I'm intending to use this blog to post new techniques and answers to other Cognos developers' questions. If you need help - just ask!

Cheers

Phil