Product:
Cognos Analytics 11.0.12
Microsoft Windows 2012 Server
TM1 10.2.2

Issue:
The IBM Cognos Windows service looks like it is started in Windows, but when you try to log in you get a message that the page cannot be shown. When starting Cognos from Cognos Configuration you get the message CFG-ERR-0106 … did not receive response from the IBM Cognos service in the time allotted. You cannot browse to http://servername:9300/p2pd/servlet
The firewall is not blocking the connection.

Possible solution:
Another piece of software that uses Java was installed on the Cognos BI server and has set JAVA_HOME, JDK_HOME and JRE_HOME as Windows environment variables. You can see this by entering SET at a CMD prompt on the server. The result is that Cognos has been using the wrong Java version.

Adjust JAVA_HOME to point to the Cognos Java folder.
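For example, as a sketch only: the path below is the default Cognos Analytics install location and the machine-wide setx call assumes you want the change for all processes; adjust both to your environment and run the commands in an elevated CMD prompt.

REM List the Java-related variables that are currently set
set JAVA
REM Point JAVA_HOME at the JRE shipped with Cognos (default install path shown)
setx JAVA_HOME "C:\Program Files\ibm\cognos\analytics\jre" /M

Restart the IBM Cognos service afterwards so that it picks up the new value.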

More information:
https://www.ibm.com/support/knowledgecenter/en/SSEP7J_10.2.1/com.ibm.swg.ba.cognos.c8pp_inst.10.2.1.doc/c_reviewthedefaultsettings.html
https://www-01.ibm.com/support/docview.wss?uid=swg21342592

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

It is recommended to keep only the valid TM1S.CFG parameters in the file, since the default values change with each version of TM1. You need to check the values and remove the unnecessary ones. https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_parametersinthetm1s.cfgfile_n1503fe.html
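As an illustration only (the server name, port numbers and data directory are made-up example values), a lean tm1s.cfg that relies on the defaults for everything else could look like this:

[TM1S]
ServerName=Planning Sample
DataDirectory=.\Data
AdminHost=localhost
PortNumber=12345
HTTPPortNumber=12354
UseSSL=T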

(The text below is a copy from this site; it contains good information)
https://cubewise.com/blog/adjusting-server-configuration-planning-analytics/

Planning Analytics or TM1 server v11 comes with a host of new features. If you are upgrading or considering upgrading to PAL, then in order to get the best performance and control the behavior of the TM1 server you should make sure that, as a minimum, you optimize your server configuration file (tm1s.cfg) to get the most out of the latest version. This article is part of our series on upgrading from TM1 to Planning Analytics. To get an overview of other areas to consider please refer to this topic.

Not all the parameters examined here are strictly brand new to v11; some have been around for a while, have changed default values, or might now be redundant and therefore require a revisit.

One of the most significant changes to TM1 server behavior between versions 10.2.2 and 11.x is that almost all server configuration parameters are now dynamic. Even parameters concerning login and security are now dynamic. Only a handful of parameters remain static or immutable for the duration of the server session (where, it must be said, static makes sense), e.g. AdminHost, ServerName, DataDirectory, PortNumber. This is a huge change which should be acknowledged, and hats off to IBM for achieving it.

tm1s.cfg parameters worth revisiting

  • HTTPPortNumber (static, default=5001) Determines the port number used by the Rest API. With the Rest API becoming the default API for the TM1 server it is important to set this parameter and not use the default value.
  • HTTPSessionTimeoutMinutes (dynamic, default=20) Determines the idle timeout for Rest sessions. We suggest matching the value of HTTPSessionTimeoutMinutes to IdleConnectionTimeoutSeconds.
  • UseSSL (static, default=T) Determines whether the server uses encrypted communication.
  • MaximumCubeLoadThreads (static, default=0) By default this parameter is disabled. In most situations Cubewise has recommended enabling. However, this recommendation changes with Planning Analytics as the new parameter MTCubeLoad should be used in its place. Having this parameter enabled can cause the server to crash during start-up in PA. MaximumCubeLoadThreads should now be set to 0 or removed from the tm1s.cfg file.
  • MTQ (dynamic, default=ALL) Multi-Threaded Query mode is now enabled by default to use all available cores (default value ALL or -1). Our recommendation however remains to set MTQ=-2 which will set MTQ to use all available cores less one which allows other operations on the server to be processed even under peak load from the TM1 server.
  • MTQQuery (dynamic, default=T) Determines whether TI processes utilize MTQ to materialize data source cube views.
  • PersistentFeeders (static, default=T) Persistent Feeders are now enabled by default allowing for faster server load times (at the cost of additional disk consumption for .feeders files and longer time to save all server data). The default value is recommended.
  • IntegratedSecurityMode (dynamic, default=1) This parameter is now dynamic, meaning for example that login method could be changed from CAM to TM1 security or vice versa while the server is in session. This has been discussed previously on our blog here.
  • UserDefinedCalculations (dynamic, default=T) We are including this parameter here because as of the latest 2.0.x PAL release user defined calculations are not supported in the new Rest based clients (Workspace, PAX) and it remains unclear whether or how UDCs will be supported in the future.
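Pulled together, a tm1s.cfg fragment (under the [TM1S] section) along the lines of the recommendations above could look like the sketch below; the port number is only an example for your environment, and MaximumCubeLoadThreads is simply left out:

HTTPPortNumber=5010
HTTPSessionTimeoutMinutes=20
UseSSL=T
MTQ=-2
MTQQuery=T
PersistentFeeders=T
IntegratedSecurityMode=1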

Brand new parameters

  • EnableNewHierarchyCreation (static, default=F) Alternate hierarchies are the banner new feature of Planning Analytics; however, they are disabled by default. To enable hierarchies this parameter must be set to T. If you plan on implementing hierarchies, make sure you also read our blog posts on the changes you need to consider to rules and TI processes, as this can have far-reaching impacts on your model.
  • MTCubeLoad (dynamic, default=F) This parameter replaces MaximumCubeLoadThreads. With MTCubeLoad cubes are still loaded in serial but multiple threads are used to load each cube. We recommend enabling this feature as this dramatically decreases server load times. The feature is disabled if MTQ=0 or MTQ=1.
  • IndexStoreDirectory (dynamic, default=DataDirectory) When MTCubeLoad is enabled this parameter defines the directory where index and bookmarks are stored to further optimize the server load phase. We recommend setting a directory other than the data directory.
  • MTFeeders (dynamic, default=F) Enables multi-threaded feeder evaluation during saving of rules or running the CubeProcessFeeders function. Enabling this function is recommended as this can significantly speed up saving of rules.
  • MTFeeders.AtStartup (dynamic, default=F) Enables multi-threaded feeder evaluation during the server load phase. Combined with MTCubeLoad this can lead to a dramatic reduction of startup times, to a level comparable to the load time achieved with Persistent Feeders. We recommend enabling this feature.
  • PreallocatedMemory.Size, PreallocatedMemory.BeforeLoad (dynamic, default=F) These parameters can be used to fine-tune the behaviour of MTCubeLoad.
  • CreateNewCAMClients (dynamic, default=T) This parameter was introduced in 10.2.2 FP5 so is not strictly new with PA but due to the recent introduction is probably still unknown to most administrators. Setting this parameter to F disables the creation of new clients in a server using CAM authentication. This can reduce locking contention experienced when new clients attempt to log in.
  • EnableSandboxDimension (dynamic, default=F) Enabling this parameter allows sandboxes to be used as a virtual dimension when querying cubes. This is a fantastic feature however should be used with caution as once enabled all Rest and MDX queries will expect sandbox to be addressed as the 1st dimension in all cube queries.
  • EnableTIDebugging (dynamic, default=F) Debug mode for TurboIntegrator is another valuable new feature which is disabled by default. We recommend enabling this parameter on all non-production servers. The default value can be left for production environments where step-through debugging is not required. However, as the parameter is dynamic it can be enabled if needed.
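Again as an illustration (the index store path is a made-up example, and EnableTIDebugging=T is meant for non-production servers as noted above), the new parameters discussed here could be added to tm1s.cfg like this:

EnableNewHierarchyCreation=T
MTCubeLoad=T
IndexStoreDirectory=D:\TM1\IndexStore
MTFeeders=T
MTFeeders.AtStartup=T
EnableTIDebugging=T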

In summary

Planning Analytics has several new features which require administrators to change the default server configuration to access them. Even if you don’t plan on implementing alternate hierarchies, we recommend using all of the new parameters listed above. When launching new features IBM has a history of first disabling them by default and changing the default behaviour in later releases. Examples of this can be seen with ParallelInteraction, PersistentFeeders and MTQ, which were all initially disabled by default but are now enabled by default. We would expect the default values for EnableNewHierarchyCreation, MTCubeLoad & MTFeeders to all switch to being True within the next few releases.

Should you want to delve even deeper into optimizing tm1s.cfg parameters the Cubewise Code blog has a good article on this topic.

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

(The text below is a copy from the IBM site)
https://www-01.ibm.com/support/docview.wss?uid=ibm10737313&myns=swgimgmt&mynp=OCSSD29G&mync=F&cm_sp=swgimgmt-_-OCSSD29G-_-F

Summary

The purpose of this document is to help with understanding and managing the memory utilization of the TM1 Application Server.

Understanding of Usage

Your TM1 Application Server (JAVA) memory is impacted by the following applications:

  • TM1Web (viewing/exporting data)
  • PMPSVC (viewing/exporting data)
  • Planning Analytics for Excel (PA for Excel, which uses the PMHub service on the TM1 Application Server)
  • Planning Analytics Workspace (if used to display PA for Excel/Perspectives reports)
  • Operations Console (obtaining server status/logging data)

As much as one would like to know what the best settings are or how much memory is required to host the TM1 Application Server, the answer does not come easily. The following questions must be taken into account, and even with answers to them, only load/stress testing can provide an accurate estimate of the resources required for your specific load:

  • How many concurrent users are expected?
  • What application(s) are users expected to use?
  • Are users doing reporting only or is data entry included?
  • How much data is to be consumed by a user?  (how many rows/columns/cells)
  • How much data is expected to be exported?
  • Do web/excel books contain more than one tab?  (is it really 1 report, or 20 reports under 1?)

Applications Settings / Governors

To get a better handle on your TM1 Application Server memory management, the safeguards/settings below should be introduced. It is recommended to implement the more restrictive safeguards first, raising the restrictions as required and where it makes sense to do so.

TM1Web (via /tm1_64/webapps/tm1web/web-inf/configuration/tm1web_config.xml):

<!-- If specified, sets a limit for the maximum number of cells in the workbook file. It will not be possible to open workbooks with a number of cells that exceeds this limit. Leaving this blank will allow all workbooks to be opened. -->
<add key="WorkbookMaxCellCount" value="500000" />

<!-- If specified, sets a threshold for limiting of exports using the number of cells. If the set threshold is exceeded when performing an export (looped exports will reach this multiplicatively), then the export will not be allowed for that user. Is applied for all users, and for Websheets and CubeViewer. Leaving this blank will allow for classic-style exporting with no cell count limitations. -->
<add key="ExportCellsThreshold" value="200000" />

<!-- MaximumSheetsForExport: Maximum number of sheets allowed to Export -->
<add key="MaximumSheetsForExport" value="50" />

<!-- Determines the maximum number of exports that can occur concurrently. If less than or equal to zero, will allow virtually unlimited concurrent exports (2147483647 max). The default value is 5. -->
<add key="MaximumConcurrentExports" value="2" />

<!-- CubeViewerRowPageSize: Number of rows to fetch in a page of cubeviewer -->
<add key="CubeViewerRowPageSize" value="100" />

<!-- CubeViewerColumnPageSize: Number of columns to fetch in a page of cubeviewer -->
<add key="CubeViewerColumnPageSize" value="20" />

Planning Analytics for Excel (via Options Menu):

  • Application Settings:
    • Auto spread consolidated input, recommended value: disabled
      •   If not required, disable this setting to prevent users from accidentally spreading
      •   If spreading is allowed, ensure users are educated so that a spread is not attempted on an intersection that spans too many cells
  • Exploration or list settings:
    •  Data display row limit, recommended value: 5000
      •   Reduce the number of rows displayed to ensure a larger than necessary query is not executed
    •  Expand member limit, recommended value: 2000
      •   Reduce the number of members expanded to ensure a larger than necessary query is not executed
  • Planning Analytics for Excel Design Recommendations:
    • Ensure that Zero Suppression is enabled where possible, to prevent larger than necessary queries
    • Do not include too many sheets/tabs in a single book, create multiple reports where possible
    • Ensure the Excel sheet does not contain many unnecessary blank/empty columns/rows

 

Tuning the JAVA Process

The TM1 Application Server sits on a WebSphere Liberty application server, which utilizes the Java platform. The details in this section can assist you with further tuning of the Java process for the TM1 Application Server; however, as mentioned earlier in this document, only a proper test/load plan can ensure a successful deployment.

To fine-tune the TM1 Application Server’s Java process, edit the jvm.options file (found in \tm1_64\wlp\usr\servers\tm1). Consider adding the following entries below the existing jvm.options settings (or updating any settings that already exist). Values should be adjusted according to the available resources; for example, don’t allocate an Xmx of 24576m if you have less than 32GB of RAM on the server.

  • -Xmx12288m
    • Xmx specifies the maximum memory allocation pool (heap) for a Java Virtual Machine (JVM)
      • Xmx is more commonly set using Cognos Configuration.  Changes to Xmx in the jvm.options file will take precedence.
      • Xmx should NOT exceed 24576m, as additional performance problems may be encountered.
  • -Xms12288m
    • Xms specifies the initial memory allocation pool on process startup.
    • Xms should reflect the same value specified by the Xmx parameter.
  • -Xmn6144m
    • Xmn sets the size of the new (nursery) heap.
    • Xmn should be set to half of the value specified by the Xmx parameter.
  • -Xgcpolicy:balanced
    • A garbage collection policy of balanced is recommended when the Xmx exceeds 8192m
    • Depending on application usage, a different garbage collection policy may prove beneficial to performance
      • If using TM1Web, a policy of balanced is recommended
      • If Planning Analytics for Excel is the only application using the TM1 Application Server, a policy of gencon may be better suited
        • Usage of the application, size of reports and number of users all come into play.  Both policies should be tested accordingly.
  • -Xverbosegclog:${bin_path}/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000
    • Enabling Verbose Garbage Collection logging is recommended
    • The performance impact of having the logging enabled is negligible
    • Should any problems arise with the JAVA process, the output of the Verbose GC logs may prove beneficial to understanding root cause

Using the settings above, the entries in the jvm.options file will look like:
-Xmx12288m
-Xms12288m
-Xmn6144m
-Xgcpolicy:balanced
-Xverbosegclog:${bin_path}/logs/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000

 

 

Additional Information

The following documents may prove useful in troubleshooting issues related to the TM1 Application Server

How to Troubleshoot a TM1 Application Server Hang
https://www-01.ibm.com/support/docview.wss?uid=swg22002262

How to Troubleshoot a TM1 Application Server Crash
http://www-01.ibm.com/support/docview.wss?uid=swg22002285

Product:
Planning Analytics 2.0.5 Workspace
Microsoft Windows Server 2016

Problem:
You get values into the TM1 application for the yearly cost of a company car, and you want to know what the monthly cost is. You have the values for the start date and the accumulated sum for the month.

Starting solution:
This shows part of the solution, to help you understand the ParseDate function.
You must expand this solution to get a fully working TI process.

You can find better solutions here:
http://www.tm1forum.com/viewtopic.php?t=9736

 

Log in to PAW (Planning Analytics Workspace) and open your application.
Click on the pencil so that it turns blue and you are in edit mode.
Right-click on Dimensions to create a new dimension.
Enter the name of the dimension (Result.Event) and click Create.
Click OK and click on 'enter new members'.
Enter StartDate and click Commit.
Click on the plus sign and select 'after selection' to add a new member to the dimension.
Enter the remaining members (BreakDate, TotalAmount and MonthAmount, which are used by the process later); they will be part of our cube to show the result.
Select BreakDate and right-click to set the format to Date; repeat for StartDate.
Click on 'show member attributes' to see the format.
Create a new dimension named Result.Log with only one member.
Right-click on Cubes to create a new cube from the above dimensions.
Mark Result.Event and Result.Log and click the arrow to move them over.
Enter Result Cube as the name of the cube, and click Create.
Click on the plus sign to add a new sheet. Select a boxed template.
Select the cube, right-click and select 'Add new view'.
Click on the box and on the icon to change the rows and columns of the view.
Right-click on Processes to create a new process. Name the process AmountCalculated.
Expand the code window so you see most of it below the cube view.
Click on the Parameters tab, and click on the plus sign to add input parameters. Here we will enter the values used to test the process. In your solution you may get these values from a file instead.
Enter the parameter names and start values (the p prefix shows that it is a parameter variable).
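From the script below, the parameters are pBreakDateA, pStartDateA and pTotalDateA (strings in yyyyMMdd format) and pTotalAmountA (a numeric value). As an example, start values for testing could be pStartDateA = 20180401, pBreakDateA = 20190101, pTotalDateA = 20181231 and pTotalAmountA = 12000; these values are only an illustration, use whatever matches your data.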
Go to the Script tab to enter your code.
Enter the code below in the Prolog, after the 'End: Generated Statements' marker.

# define the cube and dimensions to work with
sDimName = 'Result.Log';
sCubename = 'Result Cube';

# initialise variables that are only set in some branches below,
# so the debug output in the Epilog always has values
nStartmonth = 0;
nOfMonths = 0;

nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
nStartDateA = ParseDate( pStartDateA, 'yyyyMMdd', 0 );

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));

# check if it is before the break date, then divide by 12
if (nStartDateA < nBreakDateA);
nMonthAmountA = pTotalAmountA / 12;
else;

# check if the period does not fall within the same year
nStartYear = stringtonumber( subst(pStartDateA, 1, 4) );
nEndYear = stringtonumber( subst(pTotalDateA, 1, 4) );

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog3.txt', numbertostring(nStartYear), numbertostring(nEndYear), pStartDateA, pTotalDateA );

if (nStartYear <> nEndYear);
nTotalDateA = ParseDate( pTotalDateA, 'yyyyMMdd', 0 );
nResult = ((YEAR(DATE( nTotalDateA )) - YEAR(DATE( nStartDateA ))) * 12) + (MONTH(DATE( nTotalDateA )) - MONTH(DATE( nStartDateA )));

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog4.txt', numbertostring(nStartYear), numbertostring(nEndYear), numbertostring(nResult) );

# if the total spans from the start date until today you use the same formula, but nResult is the total number of months
nMonthAmountA = pTotalAmountA / nResult;
else;
# change here which values you want to use for the calculation
nStartmonth = stringtonumber( subst(pStartDateA, 5, 2) );
nEndmonth = stringtonumber( subst(pTotalDateA, 5, 2) );
nOfMonths = nEndmonth - nStartmonth + 1;
# calculate the value if the period is less than a year
nMonthAmountA = pTotalAmountA / nOfMonths;
endif;
endif;
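As a quick sanity check of the month formula above (the dates are made-up examples): with pStartDateA = 20180415 and pTotalDateA = 20190215 the calculation gives (2019 - 2018) * 12 + (2 - 4) = 10, so pTotalAmountA would be divided over 10 months.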
Then enter the code below in the Epilog.

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog2.txt', str(nStartmonth,8,2), numbertostring(nMonthAmountA), numbertostring(pTotalAmountA), numbertostring(nOfMonths));

# create a new dimension element with the date and time as the name to make it unique
sDate = timSt( now, '\Y-\m-\d \h:\i' );
DimensionElementInsertDirect( sDimName, '', sDate, 'N' );
# update the log cube with the values
CellPutN( nStartDateA+21916, sCubename, 'StartDate', sDate );
CellPutN( pTotalAmountA, sCubename, 'TotalAmount', sDate );
CellPutN( nMonthAmountA, sCubename, 'MonthAmount', sDate );
CellPutN( nBreakDateA+21916, sCubename, 'BreakDate', sDate );

# serial date 1 in Excel is in the year 1900, but TM1 serial dates start in 1960 - therefore the addition of 21916 days
Click Validate, and correct any quotation marks (') that are of the wrong format.
Save and run.
Enter your values and click OK.
The process will fail with an error if there is no /logfiles/ folder parallel to the data folder (the AsciiOutput calls write there).
Create the folder c:\Program Files\ibm\cognos\tm1_64\samples\tm1\logfiles if you use a sample TM1 application that does not have a separate log folder.
Connect to the server and check the log files to see the variable values.
Click refresh on the view to see the result.

Things to review:
nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
The code above changes the string to a serial date number, but it should be changed so that it does not use 1960 as the start date.

if (nStartDateA < nBreakDateA);
The code above checks whether the date is before a specific date; you can change this to an automatic check for 12 months.

if (nOfMonths > 12);
# if the total spans far from the start date to today you use the same formula
nMonthAmountA = pTotalAmountA / nOfMonths;
else;
The code above checks whether the difference is more than 12 months; you can enhance it to calculate the value only from January to today's date, if the total is only for this year.

# asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));
Comment out the debug AsciiOutput lines in your code, as shown above, when you no longer need them.

More information:
http://www.wimgielis.com/tm1_newdateformatter_EN.htm
https://www.exploringtm1.com/date-time-functions-tm1-10-2/
https://www.exploringtm1.com/accumulating-values-in-tm1/

Operators in TI IF Statements

Product:
Cognos BI 10.2.2 (or Cognos Analytics 11.0.x)
Microsoft Windows 2008 Server

Issue:
You do not have access to Cognos Connection. The IBM Cognos service appears to be running in Windows Services. The cogserver.log has been empty since the last restart.

Error message:
IBM Cognos gateway can not connect to IBM Cognos BI server.

Cause:
The root cause is a corruption in the \wlp\usr\servers\cognosserver\workarea folder of the WebSphere Liberty Profile (WLP).
Solution:
Stop the IBM Cognos Windows service.
Back up the “workarea” folder.
Delete the “workarea” folder at C:\Program Files\ibm\cognos\analytics\wlp\usr\servers\cognosserver\workarea
Start the IBM Cognos service again.
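If you prefer to do this from an elevated command prompt, the same steps could look like the sketch below (the service display name and the install path are the defaults and may differ in your environment; renaming the folder keeps the backup):

net stop "IBM Cognos"
ren "C:\Program Files\ibm\cognos\analytics\wlp\usr\servers\cognosserver\workarea" workarea_old
net start "IBM Cognos"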

More information:
http://www-01.ibm.com/support/docview.wss?uid=swg22011841
http://www-01.ibm.com/support/docview.wss?uid=swg21991231
http://www-01.ibm.com/support/docview.wss?uid=swg21347919

http://www-01.ibm.com/support/docview.wss?uid=swg21341113

Product:
Cognos Analytics 11.0.12  (kit_version=11.0.12.18062512)
Microsoft Windows Server 2016

Problem:
In Cognos Connection you go to the Administration console and click on System in the left column to get a view of the system health. You get an error instead of the Metrics-System tab.

Error Message:
SOAPFaultException
PRS-TRS-0902 The "XSLT" transform instruction encountered an error while processing the source at location "/cogadmin/transforms/gen-ui-markup/metrics.xslt".
PF-COM-6204 The complete error has been logged.

Possible cause:
After replacing the Oracle driver in the folder C:\Program Files\ibm\cognos\analytics\drivers with the ojdbc7.jar file, the restart of the servers was not done correctly.

Solution:
Restart the Content Manager server first.
Wait until it is finished.
Then restart the other CA (BI report) servers.
Wait until they are all up.
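A sketch of the same restart sequence from a command prompt (the service display name "IBM Cognos" is the default and may differ in your environment):

REM On the Content Manager server
net stop "IBM Cognos"
net start "IBM Cognos"
REM When the Content Manager is fully up, repeat the two commands on each report server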

More information:
http://www-01.ibm.com/support/docview.wss?uid=ibm10726007
https://www.ibm.com/support/knowledgecenter/en/SSEP7J_11.0.0/com.ibm.swg.ba.cognos.inst_cr_winux.doc/t_databaseconnectivityforcontentstore_stepsfororacle.html#DatabaseConnectivityforContentStore_StepsforOracle

Product:
Planning Analytics 2.0.5
Planning Analytics Workspace version 35
Linux Red Hat

Issue:
After setting up HTTPS for PAW you cannot browse to the PAW web page.
If you check the pa-gateway log files in the folder /ibm/paw/log/pa-gateway/ you see an error like this
AH01903: Failed to configure CA certificate chain!
in the pa-proxy-error.log file.

Solution:
Go to the /ibm/paw/config folder and check the file permissions:
ls -l
drwx------ dockeruser dockeruser 8774 Sep 26 11:03 pa-workspace.pem
If the output shows that only the user dockeruser can read the certificate file, then Apache cannot read the file and include it at startup.
You need to ensure that the group docker can read the files.
To change the owner and group of the file, enter:
chown dockeruser:docker pa-workspace.pem
To change the read permissions on the file, enter:
chmod 755 pa-workspace.pem
https://ss64.com/bash/chmod.html
https://ss64.com/bash/chown.html

Stop PAW with this command in the /ibm/paw/scripts folder:
./paw.sh stop
Start PAW with the command:
./paw.sh

Now it should work. (You need to change the paths and users to match your environment.)

Troubleshoot:
In PuTTY you can use these bash commands to check the PAW environment.
To check which Docker containers are running, enter:
docker ps
If the first one says that pa-gateway is restarting, this means that the Apache server is not up and the pa-gateway cannot load.
To go into the pa-gateway container enter:
docker exec -i -t pa-gateway /bin/bash
To read the Apache log enter:
more /var/log/apache2/error.log
Enter exit to return to Linux:
exit
To move to config folder enter:
cd /ibm/paw/config
To check content of paw.env file enter:
more paw.env
To edit paw.env file enter:
nano -w paw.env
Check that the rows below are in the paw.env file; this activates HTTPS:
export EnableSSL=true
export ServerName=paw-server-name.domain.com
To save the changes in NANO enter:
ctrl+o
To exit NANO text editor enter:
ctrl+x

 

Sometimes a container is damaged; then you can try to recreate it as below:

Stop the PAW containers with command:
./paw.sh stop
To erase pa-gateway container enter:
docker rm pa-gateway
To start PAW and recreate any missing containers, enter:
./paw.sh

https://docs.docker.com/engine/reference/commandline/rm/

How to set up HTTPS with PAW:
https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/t_paw_enable_ssl.html

Product:
Cognos Analytics 11.0.7
Microsoft Windows 2012 server

Issue:
When you browse to Cognos Connection in a new setup of CA11 you get different error messages.

Surf to: https://servername.domain.com/ibmcognos
Error: This page can't be displayed
Make sure the web address is correct
Look for the page with your search engine
Refresh the page in a few minutes

Surf to: https://servername.domain.com/ibmcognos/ibmcognos/ibmcognos
Error: server error in /ibmcognos application
the length of the URL for this request exceeds the configured maxUrlLength value.

Surf to: http://servername.domain.com/ibmcognos/bi/
Error: The website declined to show this webpage
Most likely causes:
This website requires you to log in

Surf to: https://servername.domain.com/ibmcognos/bi
Error: A blank page in Internet Explorer

Solution:
Go to Internet Information Services (IIS) Manager.
Check if you have set an HTTP redirect on the Default Web Site; remove it and try again.
Check the bindings on the Default Web Site if HTTPS is set up for a specific host name. Try with HTTPS bound to a blank host name in the IIS site binding settings.
Check if you have marked Require SSL for the Default Web Site in IIS.
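If you want to inspect or change these settings from a command prompt instead of the IIS Manager GUI, a sketch using the standard appcmd tool could look like this (the site name "Default Web Site" is assumed; check the current values before changing anything):

REM Show the HTTP redirect settings for the site
%windir%\system32\inetsrv\appcmd list config "Default Web Site" -section:system.webServer/httpRedirect
REM Disable the HTTP redirect
%windir%\system32\inetsrv\appcmd set config "Default Web Site" -section:system.webServer/httpRedirect /enabled:false
REM List the site bindings
%windir%\system32\inetsrv\appcmd list site "Default Web Site"
REM Remove the Require SSL flag
%windir%\system32\inetsrv\appcmd set config "Default Web Site" -section:system.webServer/security/access /sslFlags:None /commit:apphost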

Product:
Planning Analytics 2.0.5 (PAX)
Microsoft Windows 2016 Server

Issue:
When you select a TM1 server in the list of TM1 connections in Planning Analytics for Excel, an HTTP 503 error appears.
If you test the connection to PAW in the options box, it is OK.

Possible solution:
Inside the admin tool for PAW (Planning Analytics Workspace) the value for TM1 Application Server Gateway URI is wrong; it may point to HTTPS when it should point to HTTP.
This can happen if you have changed your setup of TM1Web (WLP) from HTTPS to HTTP to do some testing.

Copy the URL to a web browser (IE) and test if you can browse there.
An example of the URL is:
http://tm1webservername.domain.com:9510

More information:
https://www.exploringtm1.com/client-connection-to-pax/
https://www.ibm.com/communities/analytics/cognos-analytics-blog/cognosanalytics-and-planninganalytics-integration-walkthrough-part-3/

Product:
Planning Analytics 2.0.5 TM1
Microsoft Windows 2016 terminal server
TM1 package connector version 10.2.6100.8-0

Issue:
Inside TM1 architect you get a error when you run a TI process that uses the Package connector. Error happens when you start to import data to TM1 with the process.

Error Message:
Error: MetaData procedure line (0): CCognosPackage::BuildDataSource Exception: (TR3117) Unable to retrieve security objects. Please verify credentials.
– CCL-BIT-0005 A socket reported a communication error.
– CAM_Connect=0xffffffff -2113929065CAM-CRP-0321 The GSKit function ‘gsk_environment_init’ failed with the error code ‘9’

Solution:
Ensure that on the Planning Analytics server, where the TM1 applications are, you have the GSK*.dll files that come with the TM1 Package Connector version. If they have been replaced to solve other issues, you need to copy them back.

A clean new install of TM1 Package Connector on the PA 2.0.5 server can also help.

More information:

http://www-01.ibm.com/support/docview.wss?uid=swg21988056