Cognos Analytics 11.1.7
Microsoft Windows 2019 server


Error in cognosserver.log file on a new installation.

2021-01-28T09:31:51.384+0100 ERROR Audit.RTUsage.cms.CAM.AAA.SRVC [Thread-51] qjl2C9lhh82hwvqyyvM4MyGG2lldydCyGw49dw4l 0 NA 9300 8C38E8DDD4011089DE605BFA04DA4A10EEBABF5DFCA9E0C4BAF46FADC73864A4 qjl2C9lhh82hwvqyyvM4MyGG2lldydCyGw49dw4l_0_ AAA 5964 Logon <parameters><item name="namespace"><![CDATA[AD]]></item><item name="username"><![CDATA[admin]]></item><item name="display name"><![CDATA[admin]]></item><item name="CAMID"><![CDATA[CAMID("AD:u:8791a873dc836d499a0fc7f6000540d2")]]></item><item name="REMOTE_ADDR"><![CDATA[::1]]></item><item name="TENANTID"><![CDATA[]]></item></parameters> Success Account /directory/AD/account/admin


This is normally not an error, only the basic audit logging of a user logging in to the CA11 (ibmcognos) portal. Ideally it would be flagged as INFO rather than ERROR.

It only appears when you have BASIC logging activated in Administration.

More information:

Log files are found in the folder C:\Program Files\ibm\cognos\analytics\logs

Cognos Analytics 11.1.x
Microsoft Windows 2019 server
Microsoft SQL server

I have a new Cognos environment and want an easy way to copy the content store from the old environment to the new one. The new Cognos environment has the same or a newer version of Cognos Analytics.


To carry the security settings over, you must have exactly the same Active Directory connection setup in both the old and the new environment. Double-check in Cognos Configuration that the namespace is the same.

On the old server, check in Cognos Configuration where the deployment zip file is stored.

This is normally the folder C:\Program Files\ibm\cognos\analytics\deployment on your CA11 server.
Browse to ..ibmcognos in your web browser and log in as an administrator in Cognos Connection.

Click Manage – Administration console

Click Configuration tab
Click Content Administration and click on the export icon

Enter a name and click on Next button

Mark "Select the entire Content Store" and check "Include user account information" to carry over as much information as possible. Click Next.

Click Next

Enter a password you can remember and click OK

Click Next

Select "save and run once" and click Finish.

Click Run

Mark "View the details of this export after closing this dialog" and click OK.

Click the blue "refresh" link every 10 minutes to check whether it is finished.

Wait until the status says Finished. As long as there is no Completion time, the export is still running.
This can take 30 minutes or more, depending on the amount of data in your Content Store.

When succeeded, click Close.

When done, go to Windows File Explorer and copy the zip file from the old Cognos BI server to your new Cognos Analytics server. Place the file in the deployment folder you are going to use.

If the deployment folder in Cognos Configuration points to a file share, \\servername\sharefolder, then the Cognos Analytics service must run under a Windows service account and not Local System. Local System can only access folders on the same server.

Import the content store by loading the deployment file via Cognos Connection.

Log in to the new IBMCOGNOS portal and go to the Administration page, click Configuration - Content Administration, and click the import icon.

Select your full content store file and click Next.

Enter your password. Click OK

Click Next

Click Next

Click Next

Select "save and run once" and click Finish.

Do not run the upgrade of report specifications now. Do that at a later time, as it can take a very long time.

Click Run.

Mark "View the details of this import after closing this dialog" and click OK.

Click Refresh every 15 minutes to see if it is done. When there is a completion time, it is finished.


You may see errors in the report; note them down and search Google for more information.

If you have also changed the database server host for your AUDIT database, then you need to go into Cognos Administration - Configuration - Data source connections and update the connection to point to the new database server for your audit data source.

Click on Audit, then on "more" to the right of the test icon.
Click “Set properties”
Click “Connection” tab

Click the pencil icon to get to the data source update dialog.

Change the server name and any other values you need to change. Also update the values on the JDBC tab.
Click OK when done.
Test your data source connection.

Any special configuration you have done on the Cognos dispatcher is not part of the deployment; you have to add it manually again. Go to Cognos Administration - Configuration, click Dispatchers and Services, and click the properties icon.

Click on settings

Check all pages for values where Default is not Yes; these have been changed and may need to be entered in the new environment. Only enter values that you know you need in the new environment.

Click the blue Edit link for Advanced settings to see if there are any special settings in the environment.
See the IBM documentation topic "Configuring advanced settings for specific services".

Repeat the steps above in the new CA11 environment to get the fine-tuning you want.
Logging should in most cases be set to BASIC.

See also the IBM documentation topic "Content Manager service advanced settings".

You can also move the content store by backing up and restoring the full Content Manager database, but then you need to consider other parts, like old dispatcher entries that will follow in the move.

More information:

Cognos Analytics 11.1.7
Microsoft Windows 2016 Server
Oracle database

During an import of a large content store deployment file into a new Content Store database schema, the process takes a long time and no new objects are imported.
After some time, if you browse to the IBMCOGNOS website on the server, you do not get a response either.
To troubleshoot, start Cognos Configuration on the Cognos server, right-click your content store, and click Test.

ORA-00257: Archiver error. Connect AS SYSDBA only until resolved.
2021-01-14T INFO startup.Audit.JSM [Thread-56] NA  9300 __ JSM 5836 Run Failure Error connecting to the database, some services may not work properly. Check the logs for further details. ORA-00257: Archiver error. Connect AS SYSDBA only until resolved.


Ask the Oracle DBA to check the Oracle server and log files.
After Oracle is corrected, the Cognos Analytics (BI) service will continue to import data.
You do not need to restart Cognos; only the Oracle settings need to be adjusted.

More Information:

Cause: The archiver process received an error while trying to archive a redo log. If the problem is not resolved soon, the database will stop executing transactions. The most likely cause of this message is that the destination device is out of space to store the redo log file.
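If the DBA wants a quick starting point, queries like these against the standard Oracle data dictionary views show where the archived redo logs go and how full the destination is. This is a generic sketch, not Cognos-specific; run it as a privileged user such as SYSDBA:

```sql
-- Show where archived redo logs are written and any reported errors
SELECT dest_name, status, destination, error
FROM   v$archive_dest
WHERE  status <> 'INACTIVE';

-- Check how full the fast recovery area is (if it is the archive destination)
SELECT *
FROM   v$recovery_area_usage;

-- If the recovery area is full, the DBA can for example increase it:
-- ALTER SYSTEM SET db_recovery_file_dest_size = 50G SCOPE = BOTH;
```

Once space is available again, the archiver resumes and the Cognos import continues on its own.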

Cognos Controller 10.4.2
Microsoft Windows 2016 server

When a user makes a change and saves a value inside "Define Data Mart" in the Cognos Controller client, the program freezes.
The client is waiting for a SQL statement to finish.

select @@trancount; SET FMTONLY ON select * from ##Fa189b4c SET FMTONLY OFF exec tempdb..sp_tablecollations_100 N'.[##Fa189b4c]'

You need to restart the Cognos Controller Windows server to release the process.

Killing the Controller client program will not help; you get the same issue when you go back into the Define Data Mart dialog.

A restart of IIS appears to release the process, after which you can work again in Cognos Controller.
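Before restarting anything, you can check in SQL Server which session is doing the blocking. This is a generic query against the standard SQL Server dynamic management views, not something Cognos-specific:

```sql
-- List requests that are blocked, together with the blocking session id
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS running_sql
FROM   sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE  r.blocking_session_id <> 0;

-- A DBA can then end the blocking session (use with care):
-- KILL <blocking_session_id>;
```

Killing the blocking session in SQL Server is less disruptive than rebooting the whole Controller server, but it should be done by the DBA.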


Install a fix pack for Cognos Controller 10.4.2 that solves the problem.
An unsupported workaround is to update the XOLAP table directly in the SQL database. Ensure you have a backup of the SQL database before you make any changes.

You can run an export of an existing data mart, but not update it from inside Cognos Controller. If you want to change the Structure Version of an existing data mart definition, you need to open SQL Server Management Studio and edit the XOLAP table. There you can update some of the values that appear in the define dialog.

In Microsoft SQL Server Management Studio you can create a query like this to update definition CBI to version 200812:

UPDATE dbo.xolap
SET structversion = 200812
WHERE cubeid = 'CBI';
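Before running the UPDATE it is wise to check the current values first. This check uses only the table and columns already named in the update statement above:

```sql
-- Inspect the current definition before changing it
SELECT cubeid, structversion
FROM   dbo.xolap
WHERE  cubeid = 'CBI';
```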

More information about SQL:

The selection of forms is stored in the XOLAPFORM table.

It holds the list of forms shown in the form field of the Define Data Mart dialog.

You can check values in the database with the Cognos Controller function BROWSE DATA if you are an administrator.
Enter the table name and click the arrow to see the table's data.
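As an alternative to BROWSE DATA you can look at the same table from SQL Server Management Studio. The table name comes from the text above; filtering on cubeid is an assumption based on the XOLAP example and may not match your schema:

```sql
-- List the form selections stored for a data mart definition
-- (cubeid as a column in XOLAPFORM is an assumption; drop the WHERE if it is not there)
SELECT *
FROM   dbo.xolapform
WHERE  cubeid = 'CBI';
```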

More information:

Planning Analytics 2.0.6 TM1_version=TM1-AW64-ML-RTM-
Microsoft Windows 2016 server

Is there a better TI editor than TM1 Architect?

Yes. One option is PAW (Planning Analytics Workspace), or you can download and use Cubewise Arc for a trial period. Browse to the Cubewise website.
Click "Accept License Agreement" and click SERVER FOR WINDOWS 64 to download it.


Save the file to an empty folder like c:\arc on your TM1 server and unzip it in that folder.

Double-click the arc.exe file to start the Arc web server.

It will start Internet Explorer with the start page. Dismiss the news prompt and click the license icon.

Under the Start Trial tab, select Individual, and scroll down to accept the license by clicking "Agree and start Trial". This creates the license file in your arc folder.
Now close the arc.exe command window to stop the service.
Start a new command window as administrator.
Go to the c:\arc folder and enter this command:

arc.exe -install

This will install the program as a service. Go to Services in the Windows Control Panel and start the Arc service.

Now you can browse from your laptop to Arc on the TM1 server by entering http://tm1servername:7070
(you need to have port 7070 open in any firewalls between your server and your laptop).
Click on the TM1 instance you want to work with and you will be prompted to login.

Arc can remember your login, so next time you do not need to enter it again.
As an example of working with Arc, we will load a dimension from a text file using a TI process.
In the rare case where the dimension name contains the same character as the list separator, we need to adjust things a little.

Create a new process by right-clicking on the process line.

Enter a name, import.dim, and click Create.

Under Parameters, click on String to add the two parameters below:

Under the Data Source tab, do these steps:

Select ASCII as the type from the drop-down list.
Enter the path and file name of the file to import.
Select the comma separator (if that is what you use in the file).
Set Header records to zero.
Click Preview.

When the above looks OK, click Create Variables. Change the variable names to the names below.

You could continue with the default setup here, but we are going to change the process to read each line as a whole.

In the Data Source tab, change the separator to a character that is not used in the file, like pipe, so that each line is read as one field.
Click Preview, and in the Variables tab change the variable name to FullLine:

In prolog tab enter this code: (press CTRL+SPACE to get code help in the editor)

In Metadata tab enter this code:

You need to adjust the code to fit your files.

ASCIIOutput places txt files in your \data folder if no path is given.
Click the Save icon to save your changes to the TM1 instance.
Click the "lightning" icon to run the TI process as a test.

Enter the values and click on Execute button at the bottom of the dialog window.

Click the refresh icon and expand the Dimensions list.
Double-click the new accounttest dimension to see that the values have been imported correctly.

You can solve this in many different ways, but the Cubewise Arc editor looks nice.

More information: