Product:
Cognos Controller 10.4.2 IF7

CONTRL_UPDATE_version=CCR-AW64-ML-RTM-10.4.2000.1063-0
CONTRL_UPDATE_name=IBM Cognos Controller Update

CCRPORTAL_version=CONTRL-AW64-ML-RTM-10.4.2000.185-0
CCRPORTAL_name=IBM Cognos Controller Portal

Microsoft Windows 2019 server

Problem:
On a fresh Windows server, after installing Cognos Controller for the first time, the COM+ applications for Cognos Controller are missing in Component Services.

Solution:
Uninstall Cognos Controller from the server; this leaves a few files behind, as the picture below shows.

Then run the installation again and select everything.
This helps in many cases.

Other things to check:
Run the PowerShell command below to check that .NET Framework 4.7.2 (or later) is installed:

(Get-ItemProperty "HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full").Release -ge 461808

More information:
https://www.ibm.com/support/pages/vfp-and-vfpcom-error-messages-when-installing-controller-server-causing-frangovfpdll-not-be-registered-component-services-com-caused-apar-ph24802

https://docs.microsoft.com/en-us/dotnet/framework/migration-guide/how-to-determine-which-versions-are-installed

https://www.ibm.com/support/pages/node/6348246

Product:
Cognos Analytics 11.0.12
kit_version=11.0.12.18062512
kit_name=IBM Cognos Analytics
Microsoft Windows 2016 server

Problem:
After a restart of the servers, the Cognos BI system does not respond. Users see this message:

The IBM Cognos gateway is unable to connect to the IBM Cognos BI server. The server may be unavailable or the gateway may not be correctly configured.

Error in cognosserver.log file:

com.cognos.accman.jcam.crypto.CAMCryptoException: CAM-CRP-1093 Unable to read the contents of the keystore 'D:/Program Files/ibm/cognos/analytics/configuration/certs\CAMKeystore'. Reason: java.io.IOException: stream does not represent a PKCS12 key store

Solution:

First, try to restart IIS on the Cognos Gateway server with the command: IISRESET

If that does not help, create new cryptographic keys:
Run "C:\Program Files\ibm\cognos\analytics\bin64\cogconfigw.exe" to start Cognos Configuration.

Stop the IBM Cognos service in Cognos Configuration.

From inside Cognos Configuration, click 'File > Export As'.

Choose 'Yes' at the prompt and save the file as 'cogstartup backup.xml'; it will be stored in the analytics\configuration folder.

Close Cognos Configuration.

Move these files to a different, secure location, e.g. c:\temp\backup (they will be re-created during the cryptographic key regeneration):

· analytics/configuration/cogstartup.xml

· analytics/configuration/caSerial

· analytics/configuration/certs/CAMCrypto.status

· analytics/configuration/certs/CAMKeystore

· analytics/configuration/certs/CAMKeystore.lock

· analytics/temp/cam/freshness

Move this folder to a different, secure location, e.g. c:\temp\backup:

· analytics/configuration/csk
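The file moves above can be sketched as a small shell script. This is a sketch only: it fakes the install tree in a temporary directory so it is safe to run as-is; in a real run you would point COGNOS_HOME at your actual analytics folder (and on Windows you can do the same moves in File Explorer, the script just illustrates which files go where).

```shell
# Sketch of the key-regeneration backup, illustrated on a demo copy of the
# configuration tree. In a real run: set COGNOS_HOME to your install
# directory and skip the demo-setup section.
COGNOS_HOME=$(mktemp -d)/analytics
BACKUP=$(mktemp -d)/backup

# --- demo setup only: fake the files a real install would have ---
mkdir -p "$COGNOS_HOME/configuration/certs" \
         "$COGNOS_HOME/configuration/csk" \
         "$COGNOS_HOME/temp/cam"
touch "$COGNOS_HOME/configuration/cogstartup.xml" \
      "$COGNOS_HOME/configuration/caSerial" \
      "$COGNOS_HOME/configuration/certs/CAMCrypto.status" \
      "$COGNOS_HOME/configuration/certs/CAMKeystore" \
      "$COGNOS_HOME/configuration/certs/CAMKeystore.lock" \
      "$COGNOS_HOME/temp/cam/freshness"

# --- the actual moves listed above ---
mkdir -p "$BACKUP/certs"
mv "$COGNOS_HOME/configuration/cogstartup.xml" \
   "$COGNOS_HOME/configuration/caSerial" "$BACKUP/"
mv "$COGNOS_HOME/configuration/certs/CAMCrypto.status" \
   "$COGNOS_HOME/configuration/certs/CAMKeystore" \
   "$COGNOS_HOME/configuration/certs/CAMKeystore.lock" "$BACKUP/certs/"
mv "$COGNOS_HOME/temp/cam/freshness" "$BACKUP/"
mv "$COGNOS_HOME/configuration/csk" "$BACKUP/csk"

ls "$BACKUP"
```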

In the analytics\configuration folder, rename 'cogstartup backup.xml' to 'cogstartup.xml'.

Open Cognos Configuration, click Save, and then click Start.

To test, you can browse to http://ca11servername:9300/p2pd/servlet

Repeat all the above steps on all Cognos servers, starting with the Content Manager server and taking the Cognos Gateway servers last.

 

If you see the message "CM-CFG-5069 A serious error occurred while committing a delete operation" when starting the Cognos Analytics service, it can be a temporary error that occurs when CA11 tries to clean up its own cache. Restart the CA11 server again and see if the error goes away.

A misleading error in cognosserver.log at the first logon of any user after a Cognos restart:
"/v1/identity bi-service Can not retrieve the password from configuration."
This is solved in later versions of CA11.

More information:

https://www.ibm.com/support/pages/how-regenerate-cryptographic-keys-cognos-analytics-11

https://www.ibm.com/support/pages/cognos-gateway-unable-connect-cognos-bi-server-2

https://www.ibm.com/support/pages/cognos-gateway-unable-connect-cognos-bi-server-while-trying-logon-cam-namespace-solved-iisreset

By default, the cryptographic keys are valid for 365 days.

  • This value is configured inside Cognos Configuration
  • Specifically, browse to "Local Configuration -> Security -> Cryptography" and modify the value for: Common symmetric key lifetime in days

Each time you open Cognos Configuration and click the Save button, the clock on your 365 days is reset. Therefore, if you installed the software and did not save the configuration for 365 days, the keys would expire and you would need to regenerate them manually.

You must restart the services every so often to ensure the new keys are actually being used.

  • If you think you won't be opening and saving your configuration at any point in the next year or two, you can change the expiration date to 8 years and re-encrypt everything.

https://www.ibm.com/support/pages/apar/PI94141

Product:
Planning Analytics Workspace 2.0.
Linux Red Hat 7

Problem:
After a change or crash of PAW, the system starts but users cannot log in. They get the message "Planning Analytics Workspace is unavailable. Try again in a few minutes".
On the Linux server, run the command below to check that all containers are up; they all show the same uptime, which makes it look like PAW is working.
sudo docker ps

If you check the logs for a container with the command below, you can see whether there are any errors that give more information.

sudo docker logs share-platform

cp: failed to extend 'bootstrap.properties': No space left on device

cp: error writing 'bootstrap.properties': No space left on device

Check the dates earlier in the log file; the errors above can come from a disk that filled up with log files and crashed PAW before.

Check free disk space with the command: df -h
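To find out which files are filling the disk, you can sort the largest files under a directory; with the default json-file logging driver, the Docker container logs live under /var/lib/docker/containers. A minimal sketch, demonstrated on a temporary directory with two dummy files so it is safe to run anywhere:

```shell
# Find the largest files under a directory tree. In real use, point DIR at
# /var/lib/docker/containers (default location of the container json logs).
DIR=$(mktemp -d)                       # real use: DIR=/var/lib/docker/containers

# demo setup only: create one big and one small file
dd if=/dev/zero of="$DIR/big.log" bs=1024 count=200 2>/dev/null
dd if=/dev/zero of="$DIR/small.log" bs=1024 count=10 2>/dev/null

# human-readable sizes, largest first
du -ah "$DIR"/* | sort -rh | head -n 5
```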

sudo docker logs pa-gateway

This may show "AH01114: HTTP: failed to make connection to backend: share-platform", which tells us that it cannot talk to the share-platform container, which had a problem earlier.

Solution:

Recreate the images; they can have been corrupted by the earlier disk-space outage. This may remove the books you have inside PAW, so take a backup first by running the script /ibm/paw/scripts/backup.sh, which creates a backup file in the backup folder.

Go to the /ibm/paw/scripts folder on the Linux server. Enter the command below to stop PAW:

./paw.sh stop

Back up the config folder and its sub-folders: cp -ivR config /tmp/cognos (this copies all files in the config folder to the /tmp/cognos folder).

Kill all running containers
docker kill $(docker ps -q)
Delete all stopped containers
docker rm $(docker ps -a -q)
Delete all images
docker rmi $(docker images -q)

Then go to the PAW folder of your installation on the Linux server and run start.sh to create the images and containers from scratch.

./start.sh

Answer YES to both questions.

You can test from Linux that PAW is up by entering this command (replace the host name with your server's DNS alias):

curl -k https://planningworkspaceserver.domain.com

This will return the HTML page as plain text; check it for errors.

More Information:
https://www.ibm.com/support/pages/troubleshooting-planning-analytics-workspace-related-docker-issues

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_paw_trbl_cant_access_paw.html

https://www.ibm.com/support/pages/unable-access-ibm-planning-analytics-workspace

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/t_paw_uninstall.html

To limit the Docker log files, you can edit the daemon.json file in the folder /etc/docker.

This sample configuration limits the JSON log files to 10 megabytes each and keeps only the 5 most recent logs:

{
  "log-opts": {
    "max-size": "10m",
    "max-file": "5"
  }
}
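Before restarting Docker to apply the change, it is worth validating the file, since a syntax error in daemon.json prevents the Docker daemon from starting at all. A small sketch (using python3's json.tool; jq would work as well), shown on a temporary copy rather than the real /etc/docker/daemon.json:

```shell
# Validate daemon.json before restarting Docker. In real use, validate
# /etc/docker/daemon.json and then apply with: sudo systemctl restart docker
cat > /tmp/daemon.json <<'EOF'
{
  "log-opts": {
    "max-size": "10m",
    "max-file": "5"
  }
}
EOF

# exits non-zero (and prints an error) if the JSON is malformed
python3 -m json.tool /tmp/daemon.json >/dev/null && echo "daemon.json: valid JSON"
```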

https://ss64.com/bash/cp.html

Product:
Planning Analytics Workspace 2.0.39
Microsoft Windows 2016 server

Problem:
After a restart/upgrade of PAW, when going to the Databases view on the administration tab, some databases show "Threads blocked: unavailable".

Solution:
Clear the cache in the web browser first, or try a different web browser (e.g. Chrome).

It can also be that the account you logged in to PAW with is not an administrator in the TM1 instance; in that case the thread information will not be shown.

More information:
https://www.ibm.com/support/pages/changes-planning-analytics-workspace-database-administration-2041

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_prism_gs.2.0.0.doc/c_paw_administer_servers.html

Product:
Planning Analytics 2.0.9
TM1SERVER_APP_version=TM1SERVER_APP-AW64-ML-RTM-11.0.93.28-0
TM1SERVER_APP_name=IBM Cognos TM1 Server Application
Microsoft Windows 2019 server

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_nfg.2.0.0.doc/c_pa_nfg_introduction.html

Problem:
The company has moved from native TM1 security to Active Directory security, where every user has received a new ID in Cognos Connection. That means that inside Planning Analytics (TM1) each user is a new person with a new ID, so all private views are lost.

When the user Donald logs in with his new AD ID, he does not see his old private views in TM1 Architect.

Solution:

Stop the TM1 service for that instance.

TM1 uses the file system to store information: all private views and data are stored under a folder named after the user in \data. In the case above, that would be the folder \data\donald\.

As an example, we log in to the TM1 application RulesGuide as Admin in native mode and create a private view called testview1. This view is saved under the TM1 application data folder "rules_guide_data", under the user name and cube name.

To make testview1 public, copy the vue file from C:\Program Files\ibm\cognos\tm1_64\samples\tm1\Rules_Guide_Data\donald\Currency}vues\testview1.vue to the folder C:\Program Files\ibm\cognos\tm1_64\samples\tm1\Rules_Guide_Data\Currency}vues, so that it sits directly under the cube's view folder.

To copy a view from the native user Admin to the AD user Admin, go to the folder with the same name as the Cognos namespace, in our example AD. Under the user folder there, you have to copy both the cube name folder (in our example Currency}vues) and the vue file under it, to move the private view to the other user as a private view. The folder AD will be different in your environment, as it is the Cognos namespace name.

Start the TM1 server after you have copied the files, and you should now see the views as the other user.
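The copy steps above can be sketched on the file system like this. The layout is illustrated in a temporary demo tree; the namespace folder AD and user folder donald are the example names from above, and in a real environment you would work in the actual data directory with the TM1 service stopped.

```shell
# Sketch of moving a private TM1 view between users on the file system,
# shown on a demo copy of the data-folder layout (real paths are quoted
# in the text above; stop the TM1 instance before touching its data folder).
DATA=$(mktemp -d)/Rules_Guide_Data

# demo setup: a private view saved by the native user under the cube folder
mkdir -p "$DATA/donald/Currency}vues"
touch "$DATA/donald/Currency}vues/testview1.vue"

# make the view public: place the vue file directly under the cube's view folder
mkdir -p "$DATA/Currency}vues"
cp "$DATA/donald/Currency}vues/testview1.vue" "$DATA/Currency}vues/"

# make it private for the AD user instead: copy cube folder + vue file
# under <namespace>/<user> (namespace folder is "AD" in this example)
mkdir -p "$DATA/AD/donald/Currency}vues"
cp "$DATA/donald/Currency}vues/testview1.vue" "$DATA/AD/donald/Currency}vues/"

find "$DATA" -name '*.vue'
```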

More information:

https://blog.octanesolutions.com.au/views-integration-in-pax-and-paw

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_rest_api.2.0.0.doc/t_tm1_rest_api_cubes_and_native_views.html

 

 

Product:
Cognos Analytics 11.1.7
Microsoft Windows 2019 server
Problem:
In a multi-server environment, when the second CA11 service is started, not all the Java processes needed for DQM reports are running. There should be 3 Java processes on a Cognos Analytics server.

Cognos Configuration says that CA11 is up and running, but if you look in Cognos Administration you find that the QueryService is not working (as its Java process is not started).

Solution:
Check the cognosserver.log file; there you will find that the xqe.config.xml file is damaged.

The errors you find in the cognosserver.log file are:

ERROR com.cognos.pogo.services.DefaultHandlerService [pool-79-thread-19] NA problem reconfiguring handler queryServiceHandler
com.cognos.xqe.config.ConfigFileException: XQE-CFG-0003 Failed to load configuration file file:/D:/Program%20Files/ibm/cognos/analytics/configuration/xqe.config.xml.
at com.cognos.xqe.config.XQEConfiguration.loadConfiguration(XQEConfiguration.java:907) ~[xqeService.jar:?]

Stop the IBM Cognos service from inside Cognos Configuration.
Copy the file xqe.config.xml from the working CA11 server to the non-working CA11 server.
Start the IBM Cognos service from inside Cognos Configuration.

Wait up to 8 minutes for it to start. Then test running the reports.
If no other working Cognos installation is available, you may need to reinstall Cognos to get correct files.

More information:
https://www.ibm.com/support/pages/cleaning-temporary-java-workarea-cognos-analytics

  1. Stop Dispatcher (that in turn will stop Query service).
  2. Assuming <cognos_home> is the home directory of that specific dispatcher, delete everything from <cognos_home>\wlp\usr\servers\dataset-service\workarea\*  (i.e. keep the workarea directory, but delete everything in it, including subdirectories)
  3. Delete everything from <cognos_home>\wlp\usr\servers\cognosserver\workarea\*
  4. Delete everything from <cognos_home>\javasharedresources\*
  5. Start Dispatcher (Query service will start automatically)
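The cleanup steps above can be sketched as a script. A sketch only, demonstrated on a temporary directory so it is safe to run; in a real run, set COGNOS_HOME to your installation directory and stop the dispatcher first.

```shell
# Sketch of the workarea cleanup (steps 2-4 above), shown on a demo tree.
COGNOS_HOME=$(mktemp -d)               # real use: your Cognos install directory

# demo setup only: fake the directories a real install would contain
mkdir -p "$COGNOS_HOME/wlp/usr/servers/dataset-service/workarea/sub" \
         "$COGNOS_HOME/wlp/usr/servers/cognosserver/workarea/sub" \
         "$COGNOS_HOME/javasharedresources"
touch "$COGNOS_HOME/wlp/usr/servers/cognosserver/workarea/sub/cache.bin" \
      "$COGNOS_HOME/javasharedresources/classcache"

# keep the workarea directories themselves, delete their contents
rm -rf "$COGNOS_HOME/wlp/usr/servers/dataset-service/workarea/"* \
       "$COGNOS_HOME/wlp/usr/servers/cognosserver/workarea/"* \
       "$COGNOS_HOME/javasharedresources/"*

ls -A "$COGNOS_HOME/wlp/usr/servers/cognosserver/workarea"   # now empty
```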

https://www.ibm.com/support/pages/queryservice-status-displays-unknown-within-ibm-cognos-administration

Product:
Planning Analytics 2.0.7
Microsoft Windows 2016 server

Problem:
How do I make the TM1 log files smaller?
Can I have TM1Top data logged to a file?

Solution:

Create the tm1s-log.properties file in the same folder as your tm1s.cfg file.
log4j.appender.S1.MaxFileSize=10 MB controls how big your tm1server.log file can grow; the default is much larger.
log4j.appender.S1.MaxBackupIndex=40 is the number of files that will be kept. Increase this to the value you need.

To log TM1Top data without using the tm1top utility, add this to your tm1s.cfg file:
TopLogging=T
Then activate it and set the filename with these parameters in the tm1s-log.properties file:

log4j.logger.Top=INFO, S_Top
log4j.appender.S_Top=org.apache.log4j.SharedMemoryAppender
log4j.appender.S_Top.MemorySize=5 MB

log4j.appender.S_Top.File=tm1top.log

log4j.appender.S_Top.Format=TM1Top

Then, in the tm1s-log.properties file, adjust the size with the parameters:

log4j.appender.S_Top.MaxFileSize=10 MB
log4j.appender.S_Top.MaxBackupIndex=20

Here is a good example of a tm1s-log.properties file, copied from https://www.ykud.com/blog/cognos/tm1-cognos/tm1s-log-properties/

# main logging file
log4j.logger.TM1=INFO, S1
# it's good to have Locks details -- provides the names of the locked objects
log4j.logger.TM1.Lock.Exception=DEBUG, S1
# S1 is set to be a SharedMemoryAppender
log4j.appender.S1=org.apache.log4j.SharedMemoryAppender
# Specify the size of the shared memory segment
log4j.appender.S1.MemorySize=5 MB
# Specify the max filesize
log4j.appender.S1.MaxFileSize=10 MB
# Specify the max backup index
log4j.appender.S1.MaxBackupIndex=20
# Specify GMT or Local timezone
log4j.appender.S1.TimeZone=Local

# event logging configuration
# Needs EventLogging=T in the tm1s.cfg
# https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_eventlogging.html
log4j.logger.Event=INFO, S_Event
log4j.appender.S_Event=org.apache.log4j.SharedMemoryAppender
log4j.appender.S_Event.MemorySize=1 MB
log4j.appender.S_Event.MaxFileSize=10 MB
log4j.appender.S_Event.MaxBackupIndex=10
log4j.appender.S_Event.File=tm1event.log
log4j.appender.S_Event.Format=TM1Event
log4j.appender.S_Event.TimeZone=Local

# tm1top logging configuration
# Set up TopLogging=T in the tm1s.cfg
# https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_op.2.0.0.doc/c_pa_top_logger.html
log4j.logger.Top=INFO, S_Top
log4j.appender.S_Top=org.apache.log4j.SharedMemoryAppender
log4j.appender.S_Top.MemorySize=5 MB
log4j.appender.S_Top.MaxFileSize=10 MB
log4j.appender.S_Top.MaxBackupIndex=20
log4j.appender.S_Top.File=tm1top.log
log4j.appender.S_Top.Format=TM1Top

# Logins file -- records every time a user logs in, can be used for license evaluation or just checking user activity
log4j.logger.TM1.Login=DEBUG, S_Login
log4j.appender.S_Login=org.apache.log4j.SharedMemoryAppender
log4j.appender.S_Login.MemorySize=5 MB
log4j.appender.S_Login.MaxFileSize=10 MB
log4j.appender.S_Login.MaxBackupIndex=20
log4j.appender.S_Login.File=tm1login.log
log4j.appender.S_Login.TimeZone=Local

More Information:
https://www.ykud.com/blog/cognos/tm1-cognos/tm1s-log-properties/

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_op.2.0.0.doc/c_pa_top_logger.html 

https://www.ykud.com/posts/

https://www.wimgielis.com/tm1_articles_EN.htm

https://www.ibm.com/support/pages/write-tm1-server-debug-logging-alternate-file

Product:
Cognos Analytics 11.1.7
Microsoft Windows 2019 server

Problem:

Error in cognosserver.log file on a new installation.

2021-01-28T09:31:51.384+0100 ERROR Audit.RTUsage.cms.CAM.AAA.SRVC [Thread-51] qjl2C9lhh82hwvqyyvM4MyGG2lldydCyGw49dw4l 0 NA 192.168.1.28 9300 8C38E8DDD4011089DE605BFA04DA4A10EEBABF5DFCA9E0C4BAF46FADC73864A4 qjl2C9lhh82hwvqyyvM4MyGG2lldydCyGw49dw4l_0_ AAA 5964 Logon <parameters><item name="namespace"><![CDATA[AD]]></item><item name="username"><![CDATA[admin]]></item><item name="display name"><![CDATA[admin]]></item><item name="CAMID"><![CDATA[CAMID("AD:u:8791a873dc836d499a0fc7f6000540d2")]]></item><item name="REMOTE_ADDR"><![CDATA[::1]]></item><item name="TENANTID"><![CDATA[]]></item></parameters> Success Account /directory/AD/account/admin

Solution:

This is normally not an error, only the basic audit logging of a user logging in to the CA11 (ibmcognos) portal. It would be better not to flag it as ERROR, but rather as INFO.

It only happens when you have BASIC logging activated in Administration.


More information:

https://www.ibm.com/support/pages/apar/PH12785

https://www.ibm.com/support/pages/how-trace-authentication-issues-cognos-bi-cam-trace-aaa-trace

Log files are found in folder C:\Program Files\ibm\cognos\analytics\logs

Product:
Cognos Analytics 11.1.x
Microsoft Windows 2019 server
Microsoft SQL server

Problem:
I have a new Cognos environment and want an easy way to copy the content store from the old environment to the new one. The new Cognos environment has the same or a newer version of Cognos Analytics.

Solution:

To carry the security values over, you must have exactly the same Active Directory connection setup in both the old and the new environment. Double-check in Cognos Configuration that the namespace is the same.

On the old server, check in Cognos Configuration where the deployment zip file is stored.

This is normally the folder C:\Program Files\ibm\cognos\analytics\deployment on your CA11 server.
Browse to ..ibmcognos from your web browser. Log in as administrator in Cognos Connection.

Click Manage – Administration console

Click Configuration tab
Click Content Administration and click on the export icon

Enter a name and click on Next button

Mark "Select the entire Content Store" and check "Include user account information" to bring over the most information. Click Next.

Click Next

Enter a password you can remember and click OK

Click Next

Select "Save and run once" and click Finish.

Click Run

Mark "View the details of this export after closing this dialog" and click OK.

Click the blue "refresh" link every 10 minutes to see if it is finished.

Wait until the status says Finished; a run without a Completion time is not finished yet.
This can take 30 minutes or more, depending on the amount of data in your Content Store.

When succeeded, click Close.

When done, go to Windows File Explorer and copy the zip file from the old Cognos BI server to your new Cognos Analytics server. Place the file in the deployment folder you are going to use.

If the deployment folder inside Cognos Configuration points to a file share (\\servername\sharefolder), then the Cognos Analytics service must run under a Windows service account and not Local System; Local System can only access folders on the same server.

Import the content store by loading the deployment file via Cognos Connection.

Log in to the new IBMCOGNOS and go to the Administration page; click Configuration - Content Administration. Click the import icon.

Select your full content store file and click Next.

Enter your password. Click OK

Click Next

Click Next

Click Next

Select "Save and run once" and click Finish.

Do not run the upgrade of report specifications; do that at a later time, as it can take a very long time.

Click Run.

Mark "View the details of this import after closing this dialog" and click OK.

Click Refresh every 15 minutes to see if it is done. When you have a Completion time, it is finished.

 

You may see errors in the report; note them down and search Google for more information.

If you have also changed the database server host for your AUDIT database, go to Cognos Administration - Configuration - Data source connections. There you need to update the connection to the new database server for your audit data source.

Click on Audit, then on "more" to the right of the test icon.
Click "Set properties".
Click the "Connection" tab.

Click the pencil icon to get to the data source update dialog.

Change the server name and any other values you need to change. Also update the JDBC tab.
Click OK when done.
Test your data source connection.

Any special configuration you have made on the Cognos Dispatcher is not part of the deployment; you have to add it manually again. Go to Cognos Administration - Configuration, click Dispatchers and Services, and click the properties icon.

Click on settings

Check all pages for values where Default is not "Yes"; these have been changed and may need to be entered in the new environment. Only enter values that you know you need in the new environment.

Click the blue Edit link for Advanced settings to see if there are any special settings in the environment.
Configuring advanced settings for specific services (ibm.com)

Repeat the above steps in the new CA11 environment to get the fine-tuning you want.
Logging should in most cases be set to BASIC.

Content Manager service advanced settings (ibm.com)

You can also move the content store by backing up and restoring the full Content Manager database, but then you need to consider other parts, like old dispatchers, that will follow along in the move.

More information:

http://mail.heritagebrands.com.au/ibmcognos/documentation/en/ug_cra_10_2/c_deploying_the_entire_content_store.html

https://www.ibm.com/support/pages/what-difference-between-exporting-content-store-cognos-connection-and-doing-database-backup-content-store

https://www.ibm.com/support/pages/how-copy-entire-content-store-another-cognos-analytics-server-same-version

https://www.bspsoftware.com/products/metamanager/Download

https://www.ibm.com/support/knowledgecenter/SSEP7J_11.1.0/com.ibm.swg.ba.cognos.ug_cra.doc/c_deploying_the_entire_content_store.html

Product:
Cognos Analytics 11.1.7
Microsoft Windows 2016 Server
Oracle database

Problem:
During an import of a large content store deployment file into a new Content Store database schema, the process takes a long time and no new objects are imported.
After some time, if you browse to the IBMCOGNOS website on the server, you do not get a response either.
Troubleshoot by starting Cognos Configuration on the Cognos server, right-clicking your content store, and clicking Test.

ORA-00257: Archiver error. Connect AS SYSDBA only until resolved.
2021-01-14T INFO startup.Audit.JSM [Thread-56] NA  9300 __ JSM 5836 Run Failure Error connecting to the database, some services may not work properly. Check the logs for further details. ORA-00257: Archiver error. Connect AS SYSDBA only until resolved.

Solution:

Ask the Oracle DBA to check the oracle server and log files.
After Oracle is corrected, the Cognos Analytics (BI) service will continue to import data.
You do not need to restart Cognos; only adjust the Oracle settings.

More Information:

http://www.dba-oracle.com/sf_ora_00257_archiver_error_connect_internal_only_until_freed.htm

Cause: The archiver process received an error while trying to archive a redo log. If the problem is not resolved soon, the database will stop executing transactions. The most likely cause of this message is that the destination device is out of space to store the redo log file.

https://www.ibm.com/support/pages/export-large-content-store-database

https://www.ibm.com/support/knowledgecenter/SSEP7J_10.2.2/com.ibm.swg.ba.cognos.crn_arch.10.2.2.doc/c_adgsizethecs.html