Product:
Planning Analytics 2.0.5
Microsoft Windows 2016 server

Problem:
How do you change the number of columns shown and loaded in TM1 Web?

Suggested Solution:
Update the file tm1web_config.xml, which is in the folder
C:\Program Files\ibm\cognos\tm1_64\webapps\tm1web\WEB-INF\configuration on the TM1 Web server.
Change the lines below to numbers that work for your application:

<!-- CubeViewerRowPageSize: Number of rows to fetch in a page of cubeviewer -->
<add key="CubeViewerRowPageSize" value="250" />
<!-- CubeViewerColumnPageSize: Number of columns to fetch in a page of cubeviewer -->
<add key="CubeViewerColumnPageSize" value="100" />

In TM1 10.2 FP1, two settings controlled the number of columns and rows loaded in TM1 Web websheets:
WebsheetRowThreshold and WebsheetColumnThreshold.
http://www-01.ibm.com/support/docview.wss?uid=swg27040580

In the documentation for Planning Analytics 2.0 these settings are not found; instead you find the settings CubeViewerRowPageSize and CubeViewerColumnPageSize.
https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_modifyingtm1webconfigurationparameters_n50ce5.html

The tm1web_config.xml.new file does not have the WebsheetRowThreshold and WebsheetColumnThreshold settings.

The old parameters configured the number of rows and columns in a websheet that are completely fetched on loading. If the sheet fits roughly within the number of rows and columns specified with the new parameters, it will be fully loaded at startup.
If a sheet is larger than the values set with the parameters, Cognos TM1 Web fetches additional data on demand as you scroll beyond the buffers of the previously loaded data.

Newer Planning Analytics versions have an improved TM1 Web solution that caches values differently, so the old parameters should not be needed for the new TM1 Web.

The parameters in this file, tm1web_config.xml, control the following IBM TM1 Web features (a sample fragment follows the list):

  • View node
  • Cube Viewer page size
  • Number of sheets to export from a Cube Viewer
  • IBM TM1 Web startup and appearance settings
  • Session timeouts
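
For example, a fragment of tm1web_config.xml covering these areas could look like the lines below. The values are only examples; the CubeViewer and MaximumSheetsForExport keys are quoted elsewhere in this document, while the exact name of the session timeout key is an assumption, so verify it against your own file before relying on it.

<add key="CubeViewerRowPageSize" value="250" />
<add key="CubeViewerColumnPageSize" value="100" />
<add key="MaximumSheetsForExport" value="50" />
<add key="HttpSessionTimeout" value="20" />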

https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_ug.2.0.0.doc/c_tm1web_cfg_params_v10r2.html

Product:
Cognos Planning Analytics Workspace version 36
CentOS Linux

Problem:
How to upgrade PAW?
https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/t_paw_upgrade.html

Solution:
Download the latest version from
https://www-01.ibm.com/support/docview.wss?uid=swg27049597

First check that there is space left on your Linux box with the command df -h

Create a new folder for your new software
cd /
mkdir ibm
cd ibm
mkdir paw36

Copy the software over to the Linux box with WinSCP
https://winscp.net/eng/download.php

Place the zip file in folder /ibm/paw36.
Unzip the file with the command:
unzip ipa_workspace_local_2.0.36.1440.5.zip -d /ibm/paw36

Go to the existing installation's scripts folder (/ibm/paw/scripts/) and run a backup with the command:
./backup.sh

Copy the paw.env file from your current installation to the new installation location.
cp /ibm/paw/config/paw.env  /ibm/paw36/config/paw.env

Copy the /config/certs directory from your current installation to the new installation location.
cp /ibm/paw/config/certs/*.* /ibm/paw36/config/certs/

If you use SSL and have saved the certificate in the pa-workspace.pem file, you need to copy that from your current installation to the new installation location.
cd /ibm/paw36/config
mv pa-workspace.pem  pa-org-workspace.pem
cp  /ibm/paw/config/pa-workspace.pem  pa-workspace.pem

If you have changed the old Start.sh file, you need to add the same values to the new Start.sh file.
Open the new Start.sh in nano, add the IP address of the Linux server as ADMINTOOL_IP, and add the folder where docker-compose is located to the PATH with the line PATH=$PATH:/ibm/comp/
cd /ibm/paw36
nano Start.sh
Enter values like:
export ADMINTOOL_IP=192.168.254.11
export PATH=$PATH:/ibm/comp/
Press CTRL+O and then ENTER to save the file.
Press CTRL+X to exit.

Change the path above to the folder where docker-compose is saved.

You may need to update the access rights on the new files with the commands:
cd /ibm/paw36
chmod 744 *.*
This gives the owner full access and everyone else read access to the files.

Stop the running PAW with command:
cd /ibm/paw/scripts
./paw.sh stop

Go to the new folder with command:
cd /ibm/paw36

Start the installation by entering:
./Start.sh

Enter "y" when you are prompted to install the Docker images. Enter "y" when you are prompted to open the administration tool.

Start Internet Explorer on your laptop and surf to the Linux server at http://192.168.254.11:8888

Inside the Planning Analytics Workspace administration tool, check that all URLs to CA11 and TM1 are correct.
https://www.ibm.com/communities/analytics/cognos-analytics-blog/cognosanalytics-and-planninganalytics-integration-walkthrough-part-3/

Some values need to be entered again on the Configuration tab; enter them and save.


Click Validate.
Click Status.
Click Restart.
Wait until CPU is below 1%.
Click Refresh.

Then surf to http://pawservername.domain.com

Normally the Docker containers and images are in the folder /var/lib/docker.

The daemon.json file is in the folder /etc/docker.

NOTE: To avoid ending up with too many PAW folders, which makes it hard to know which folder is in use, you can do the opposite: rename the old folder /ibm/paw to /ibm/pawold, create the new folder /ibm/paw, copy the new zip file to that folder, and unpack it. Then copy the needed config files from /ibm/pawold to the /ibm/paw folder (as listed above), and start the installation (upgrade) from /ibm/paw/Start.sh. A sketch of this approach is shown below.
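
A minimal sketch of that rename approach, assuming the same version number, file names and folders used in the steps above, and that the new zip file has been copied to /ibm (adjust everything to your environment):

cd /ibm/paw/scripts
./paw.sh stop
cd /ibm
mv /ibm/paw /ibm/pawold
mkdir /ibm/paw
unzip /ibm/ipa_workspace_local_2.0.36.1440.5.zip -d /ibm/paw
cp /ibm/pawold/config/paw.env /ibm/paw/config/paw.env
cp /ibm/pawold/config/certs/*.* /ibm/paw/config/certs/
cp /ibm/pawold/config/pa-workspace.pem /ibm/paw/config/pa-workspace.pem
cd /ibm/paw
./Start.sh

Remember to stop the old PAW before renaming the folder (as shown first above), to only copy pa-workspace.pem if you use SSL, and to add your ADMINTOOL_IP and PATH changes to the new Start.sh before running it.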

More Information:
http://blog.thoward37.me/articles/where-are-docker-images-stored/

Product:
Planning Analytics Workspace 35
Linux Red Hat

Problem:
You have no access to PAW, and a restart of the Docker containers with the commands  ./paw.sh stop  and  ./paw.sh  did not help.

If you click "validate" inside the Workspace administration page, you get the error getaddrinfo ENOTFOUND on all the server links.

Error Message:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET.
Reason: DNS lookup failure for: wa-proxy

Workaround:
Restart the Linux box:
Log in to the Linux server with PuTTY.
Change to the root user with the command:
sudo su - root
Restart the Linux server with the command:
init 6

Try again after a few minutes.

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

It is recommended to keep only valid tm1s.cfg parameters in the file, because the default values change with each version of TM1. You need to check the values and remove the unnecessary ones. https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_parametersinthetm1s.cfgfile_n1503fe.html

(This is a copy of the text on this site; it contains good information.)
https://cubewise.com/blog/adjusting-server-configuration-planning-analytics/

Planning Analytics, or TM1 server v11, comes with a host of new features. To get the best performance and control the behavior of the TM1 server when you are upgrading, or considering upgrading, to PAL, you should make sure that as a minimum you optimize your server configuration file (tm1s.cfg) to get the most out of the latest version. This article is part of our series on upgrading from TM1 to Planning Analytics. To get an overview of other areas to consider, please refer to this topic.

Not all the parameters examined here are strictly brand new to v11; some have been around for a while, have changed default values, or might now be redundant, and therefore require a revisit. Example tm1s.cfg fragments follow each list below.

One of the most significant changes to TM1 server behavior between versions 10.2.2 and 11.x is that almost all server configuration parameters are now dynamic. Even parameters concerning login and security are now dynamic. The parameters remaining static or immutable for the duration of the server session are now only a handful (where also it must be said that static makes sense) e.g. AdminHost, ServerName, DataDirectory, PortNumber. This is a huge change which should be acknowledged and hats off to IBM for achieving this.

tm1s.cfg parameters worth revisiting

  • HTTPPortNumber (static, default=5001) Determines the port number used by the Rest API. With the Rest API becoming the default API for the TM1 server it is important to set this parameter and not use the default value.
  • HTTPSessionTimeoutMinutes (dynamic, default=20) Determines the idle timeout for Rest sessions. We suggest matching the value of HTTPSessionTimeoutMinutes to IdleConnectionTimeoutSeconds.
  • UseSSL (static, default=T) Determines whether the server uses encrypted communication.
  • MaximumCubeLoadThreads (static, default=0) By default this parameter is disabled. In most situations Cubewise has recommended enabling. However, this recommendation changes with Planning Analytics as the new parameter MTCubeLoad should be used in its place. Having this parameter enabled can cause the server to crash during start-up in PA. MaximumCubeLoadThreads should now be set to 0 or removed from the tm1s.cfg file.
  • MTQ (dynamic, default=ALL) Multi-Threaded Query mode is now enabled by default to use all available cores (default value ALL or -1). Our recommendation however remains to set MTQ=-2 which will set MTQ to use all available cores less one which allows other operations on the server to be processed even under peak load from the TM1 server.
  • MTQQuery (dynamic, default=T) Determines whether TI processes utilize MTQ to materialize data source cube views.
  • PersistentFeeders (static, default=T) Persistent Feeders are now enabled by default allowing for faster server load times (at the cost of additional disk consumption for .feeders files and longer time to save all server data). The default value is recommended.
  • IntegratedSecurityMode (dynamic, default=1) This parameter is now dynamic, meaning for example that login method could be changed from CAM to TM1 security or vice versa while the server is in session. This has been discussed previously on our blog here.
  • UserDefinedCalculations (dynamic, default=T) We are including this parameter here because as of the latest 2.0.x PAL release user defined calculations are not supported in the new Rest based clients (Workspace, PAX) and it remains unclear whether or how UDCs will be supported in the future.
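
Putting the recommendations above together, a tm1s.cfg fragment could look like the sketch below. The port number is only an example, and 1200 seconds is simply 20 minutes expressed in seconds to match HTTPSessionTimeoutMinutes; adjust all values to your own server.

HTTPPortNumber=12345
HTTPSessionTimeoutMinutes=20
IdleConnectionTimeoutSeconds=1200
UseSSL=T
MTQ=-2
MTQQuery=T
PersistentFeeders=T
IntegratedSecurityMode=1
# MaximumCubeLoadThreads is deliberately set to 0 or removed in Planning Analytics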

Brand new parameters

  • EnableNewHierarchyCreation (static, default=F) Alternate hierarchies are the banner new feature of Planning Analytics; however, they are disabled by default. To enable hierarchies this parameter must be set to T. If you plan on implementing hierarchies, make sure you also read our blog posts on the changes you need to consider for rules and TI processes, as this can have far-reaching impacts on your model.
  • MTCubeLoad (dynamic, default=F) This parameter replaces MaximumCubeLoadThreads. With MTCubeLoad cubes are still loaded in serial but multiple threads are used to load each cube. We recommend enabling this feature as this dramatically decreases server load times. The feature is disabled if MTQ=0 or MTQ=1.
  • IndexStoreDirectory (dynamic, default=DataDirectory) When MTCubeLoad is enabled this parameter defines the directory where index and bookmarks are stored to further optimize the server load phase. We recommend setting a directory other than the data directory.
  • MTFeeders (dynamic, default=F) Enables multi-threaded feeder evaluation during saving of rules or running the CubeProcessFeeders function. Enabling this function is recommended as this can significantly speed up saving of rules.
  • MTFeeders.AtStartup (dynamic default=F) Enables multi-threaded feeder evaluation during the server load phase. Combined with MTCubeLoad this can lead to dramatic reduction of startup times to a level comparable to the load time achieved with Persistent Feeders. We recommend enabling this feature.
  • PreallocatedMemory.Size, PreallocatedMemory.BeforeLoad (dynamic, default=F) These parameters can be used to fine-tune the behaviour of MTCubeLoad.
  • CreateNewCAMClients (dynamic, default=T) This parameter was introduced in 10.2.2 FP5 so is not strictly new with PA but due to the recent introduction is probably still unknown to most administrators. Setting this parameter to F disables the creation of new clients in a server using CAM authentication. This can reduce locking contention experienced when new clients attempt to log in.
  • EnableSandboxDimension (dynamic, default=F) Enabling this parameter allows sandboxes to be used as a virtual dimension when querying cubes. This is a fantastic feature however should be used with caution as once enabled all Rest and MDX queries will expect sandbox to be addressed as the 1st dimension in all cube queries.
  • EnableTIDebugging (dynamic, default=F) Debug mode for TurboIntegrator is another valuable new feature which is disabled by default. We recommend enabling this parameter on all non-production servers. The default value can be left for production environments where step-through debugging is not required. However, as the parameter is dynamic it can be enabled if needed.
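
A corresponding sketch for the brand new parameters, with the values recommended above (the IndexStoreDirectory path is just an example, and EnableSandboxDimension and EnableTIDebugging are shown with the cautious defaults you would keep on a production server):

EnableNewHierarchyCreation=T
MTCubeLoad=T
IndexStoreDirectory=D:\TM1Data\IndexStore
MTFeeders=T
MTFeeders.AtStartup=T
EnableSandboxDimension=F
EnableTIDebugging=F
# set EnableTIDebugging=T on non-production servers where step-through debugging is wanted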

In summary

Planning Analytics has several new features which require administrators to change default server configurations to access the new features. Even if you don’t plan on implementing alternate hierarchies we recommend using all of the new parameters listed above. When launching new features IBM has a history of first disabling new features by default and in later releases changing the default behaviour. Examples of this can be seen with ParallelInteraction, PersistentFeeders and MTQ which were all initially disabled by default but are now enabled by default. We would expect the default values for EnableNewHierarchyCreation, MTCubeLoad & MTFeeders to all switch to being True within the next few releases.

Should you want to delve even deeper into optimizing tm1s.cfg parameters the Cubewise Code blog has a good article on this topic.

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

(This is a copy of text from the IBM site.)
https://www-01.ibm.com/support/docview.wss?uid=ibm10737313&myns=swgimgmt&mynp=OCSSD29G&mync=F&cm_sp=swgimgmt-_-OCSSD29G-_-F

Summary

The purpose of this document is to help with understanding and managing the memory utilization of the TM1 Application Server.

Understanding of Usage

Your TM1 Application Server (JAVA) memory is impacted by the following applications:

  • TM1Web (viewing/exporting data)
  • PMPSVC (viewing/exporting data)
  • Planning Analytics for Excel (PA for Excel, which uses the PMHub service on the TM1 Application Server)
  • Planning Analytics Workspace (if used to display PA for Excel/Perspectives reports)
  • Operations Console (obtaining server status/logging data)

As much as one would like to know what the best settings are or how much memory is required to host the TM1 Application Server, the answer does not come easily.  The following questions must be taken into account, and even with answers to them, only load/stress testing can provide an accurate estimate of the resources required for your specific load:

  • How many concurrent users are expected?
  • What application(s) are users expected to use?
  • Are users doing reporting only or is data entry included?
  • How much data is to be consumed by a user?  (how many rows/columns/cells)
  • How much data is expected to be exported?
  • Do web/excel books contain more than one tab?  (is it really 1 report, or 20 reports under 1?)

Applications Settings / Governors

To get a better handle on your TM1 Application Server memory management, the below safeguards/settings should be introduced.  It is recommended to implement more restrictive safeguards, raising the restrictions as required and where it makes sense to do so.

TM1Web (via /tm1_64/webapps/tm1web/web-inf/configuration/tm1web_config.xml):

<!-- If specified, sets a limit for the maximum number of cells in the workbook file. It will not be possible to open workbooks with a number of cells that exceeds this limit. Leaving this blank will allow all workbooks to be opened. -->
<add key="WorkbookMaxCellCount" value="500000" />

<!-- If specified, sets a threshold for limiting of exports using the number of cells. If the set threshold is exceeded when performing an export (looped exports will reach this multiplicatively), then the export will not be allowed for that user. Is applied for all users, and for Websheets and CubeViewer. Leaving this blank will allow for classic-style exporting with no cell count limitations. -->
<add key="ExportCellsThreshold" value="200000" />

<!-- MaximumSheetsForExport: Maximum number of sheets allowed to Export -->
<add key="MaximumSheetsForExport" value="50" />

<!-- Determines the maximum number of exports that can occur concurrently. If less than or equal to zero, will allow virtually unlimited concurrent exports (2147483647 max). The default value is 5. -->
<add key="MaximumConcurrentExports" value="2" />

<!-- CubeViewerRowPageSize: Number of rows to fetch in a page of cubeviewer -->
<add key="CubeViewerRowPageSize" value="100" />

<!-- CubeViewerColumnPageSize: Number of columns to fetch in a page of cubeviewer -->
<add key="CubeViewerColumnPageSize" value="20" />

Planning Analytics for Excel (via Options Menu):

  • Application Settings:
    • Auto spread consolidated input, recommended value: disabled
      •   If not required, disable this setting to prevent users from accidentally spreading
      •   If spreading is allowed, ensure users are educated so that a spread is not attempted on an intersection that spans too many cells
  • Exploration or list settings:
    •  Data display row limit, recommended value: 5000
      •   Reduce the number of rows displayed to ensure a larger than necessary query is not executed
    •  Expand member limit, recommended value: 2000
      •   Reduce the number of members expanded to ensure a larger than necessary query is not executed
  • Planning Analytics for Excel Design Recommendations:
    • Ensure that Zero Suppression is enabled where possible, to prevent larger than necessary queries
    • Do not include too many sheets/tabs in a single book, create multiple reports where possible
    • Ensure the Excel sheet does not contain many unnecessary blank/empty columns/rows

 

Tuning the JAVA Process

The TM1 Application Server sits on a WebSphere Liberty application server, which uses the Java platform.  The details in this section can assist you with further tuning of the Java process for the TM1 Application Server; however, as mentioned earlier in this document, only a proper test/load plan can ensure a successful deployment.

To fine-tune the TM1 Application Server's Java process, edit the jvm.options file (found in \tm1_64\wlp\usr\servers\tm1).  Consider adding the following entries below the existing jvm.options settings (or updating any settings that may already exist).  Values should be adjusted according to the available resources; for example, do not allocate an Xmx of 24576m if you have less than 32GB of RAM on the server.

  • -Xmx12288m
    • Xmx specifies the maximum memory allocation pool (heap) for a Java Virtual Machine (JVM)
      • Xmx is more commonly set using Cognos Configuration.  Changes to Xmx in the jvm.options file will take precedence.
      • Xmx should NOT exceed 24576m, as additional performance problems may be encountered.
  • -Xms12288m
    • Xms specifies the initial memory allocation pool on process startup.
    • Xms should reflect the same value specified by the Xmx parameter.
  • -Xmn6144m
    • Xmn sets the size of the new (nursery) heap.
    • Xmn should be set to half of the value specified by the Xmx parameter.
  • -Xgcpolicy:balanced
    • A garbage collection policy of balanced is recommended when the Xmx exceeds 8192m
    • Depending on application usage, a different garbage collection policy may prove beneficial to performance
      • If using TM1Web, a policy of balanced is recommended
      • If Planning Analytics for Excel is the only application using the TM1 Application Server, a policy of gencon may be better suited
        • Usage of the application, size of reports, and number of users all come into play.  Both policies should be tested accordingly.
  • -Xverbosegclog:${bin_path}/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000
    • Enabling Verbose Garbage Collection logging is recommended
    • The performance impact of having the logging enabled is negligible
    • Should any problems arise with the JAVA process, the output of the Verbose GC logs may prove beneficial to understanding root cause

Using the settings above, the entries in the jvm.options file will look like:
-Xmx12288m
-Xms12288m
-Xmn6144m
-Xgcpolicy:balanced
-Xverbosegclog:${bin_path}/logs/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000

 

 

Additional Information

The following documents may prove useful in troubleshooting issues related to the TM1 Application Server

How to Troubleshoot a TM1 Application Server Hang
https://www-01.ibm.com/support/docview.wss?uid=swg22002262

How to Troubleshoot a TM1 Application Server Crash
http://www-01.ibm.com/support/docview.wss?uid=swg22002285

Product:
Planning Analytics 2.0.5 workspace
Microsoft Server 2016

Problem:
You get a value into the TM1 application for the yearly cost of a company car, and you want to know what the monthly cost is. You have the values for the start date and the accumulated sum for the month.

Starting solution:
This shows part of the solution, to help you understand the ParseDate function.
You must expand this solution to get a fully working TI process.

You can find better solutions here:
http://www.tm1forum.com/viewtopic.php?t=9736

 

Log in to PAW (Planning Analytics Workspace) and open your application.
Click on the pencil so it turns blue, which means you are in edit mode.
Right-click on Dimensions to create a new dimension.
Enter the name of the dimension and click Create.
Click OK and then click to enter new members.
Enter StartDate and click Commit.
Click on the plus sign and select "after selection" to add a new member to the dimension.
Enter these values; they will be part of our cube to show the result.
Select BreakDate and right-click to set its format to Date; repeat for StartDate.
Click on "show member attributes" to see the format.
Create a new dimension named Result.Log with only one member.
Right-click on Cubes to create a new cube from the dimensions above.
Mark Result.Event and Result.Log and click the arrow to move them over.
Enter Result Cube as the name of the cube, and click Create.
Click on the plus sign to add a new sheet. Select a boxed template.
Select the cube, right-click, and select Add new view.
Click on the box and on the icon to change the row and column values.
Right-click on Processes to create a new process. Name the process AmountCalculated.
Expand the code window so you see most of it below the cube view.
Click on the Parameters tab and click the plus sign to add input parameters. Here we enter the values used to test the process; in your own solution you may get these values from a file instead.
Enter the parameter names used in the script below (pBreakDateA, pStartDateA, pTotalDateA and pTotalAmountA) and give them start values (the p prefix shows that they are parameter variables).
Go to the Script tab to enter your code.
Enter the code below under Prolog End: Generated Statements

# define the cube and dimensions to work with
sDimName = 'Result.Log';
sCubename = 'Result Cube';

nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
nStartDateA = ParseDate( pStartDateA, 'yyyyMMdd', 0 );

# write variable values to a text file in the server's log folder, for debugging
asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));

# check if it is before the break date; if so, divide by 12
if (nStartDateA < nBreakDateA);
nMonthAmountA = pTotalAmountA / 12;
else;

# check if the date range does not fall within the same year
nStartYear = stringtonumber( subst (pStartDateA, 1, 4));
nEndYear = stringtonumber( subst (pTotalDateA, 1, 4));

# write variable values to a text file in the server's log folder, for debugging
asciioutput ('../logfiles/debuglog3.txt', numbertostring(nStartYear), numbertostring(nEndYear), pStartDateA, pTotalDateA );

if (nStartYear <> nEndYear);
nTotalDateA = ParseDate( pTotalDateA, 'yyyyMMdd', 0 );
nResult = ((YEAR(DATE( nTotalDateA )) - YEAR(DATE( nStartDateA ))) * 12) + (MONTH(DATE( nTotalDateA )) - MONTH(DATE( nStartDateA )));
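# Example (for illustration only): a start date of 20171115 and a total date of 20180215
# give (2018-2017)*12 + (2-11) = 3 months.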

# write variable values to a text file in the server's log folder, for debugging
asciioutput ('../logfiles/debuglog4.txt', numbertostring(nStartYear), numbertostring(nEndYear), numbertostring(nResult) );

# if the total spans more than a year from the start date, you use the same formula, but nResult is the total number of months
nMonthAmountA = pTotalAmountA / nResult;
else;
# change here which values you want to use for the calculation
nStartmonth = stringtonumber( subst (pStartDateA, 5, 2));
nEndmonth = stringtonumber( subst (pTotalDateA, 5, 2));
nOfMonths = nEndmonth - nStartmonth + 1;
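# Example (for illustration only): a start date of 20180301 and a total date of 20180701
# in the same year give nOfMonths = 7 - 3 + 1 = 5 months.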
# calculate value if less than a year
nMonthAmountA = pTotalAmountA / nOfMonths;
endif;
endif;
Then enter the code below as the Epilog:

# write variable values to a text file in the server's log folder, for debugging
asciioutput ('../logfiles/debuglog2.txt', str(nStartmonth,8,2), numbertostring(nMonthAmountA), numbertostring(pTotalAmountA), numbertostring(nOfMonths));

# create a new dimension element with the date and time as the value, to make it unique
sDate = (timSt(now, '\Y-\m-\d \h:\i'));
DimensionElementInsertDirect( sDimName, ' ', sDate, 'N' );
# update the log cube with the values
CellPutN( nStartDateA+21916, sCubename, 'StartDate', sDate );
CellPutN( pTotalAmountA, sCubename, 'TotalAmount', sDate );
CellPutN( nMonthAmountA, sCubename, 'MonthAmount', sDate );
CellPutN( nBreakDateA+21916, sCubename, 'BreakDate', sDate );

# 1 in Excel is 1900, but in TM1 that is 1960; therefore the addition of 21916 days.
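# Example (for illustration only): TM1 serial date 0 is 1960-01-01, and in Excel the
# date 1960-01-01 has serial number 21916, so adding 21916 converts the TM1 serial
# date to an Excel-style serial.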
Click Validate, and correct any quote characters that have the wrong format (curly quotes pasted from a web page must be replaced with straight single quotes).
Save and run.
Enter your values and click OK.
The process fails with an error if there is no /logfiles/ folder parallel to the data folder.
Create the folder C:\Program Files\ibm\cognos\tm1_64\samples\tm1\logfiles if you use a sample TM1 application that does not have a separate log folder.
Connect to the server and check the log files to see the variable values.
Click on Refresh in the view to see the result.

Things to review:
nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
The code above converts the string to a date serial number, but it should be changed so that it does not use 1960 as the start date.

if (nStartDateA < nBreakDateA);
The code above checks whether the date is before a specific date; you can change this to an automatic check for 12 months.

if (nOfMonths > 12);
# if the total spans far from the start date to today, you use the same formula
nMonthAmountA = pTotalAmountA / nOfMonths;
else;
The code above checks whether the difference is more than 12 months; you can enhance it to calculate the value from January to today's date only, if the total is only for this year.

# asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));
Comment out the debug asciioutput lines in your code when you are done.

More information:
http://www.wimgielis.com/tm1_newdateformatter_EN.htm
https://www.exploringtm1.com/date-time-functions-tm1-10-2/
https://www.exploringtm1.com/accumulating-values-in-tm1/

Operators in TI IF Statements

Product:
Planning Analytics 2.0.5 (PAX)
Microsoft Windows 2016 Server

Issue:
When you select a TM1 server in the list of TM1 connections in Planning Analytics for Excel, an HTTP 503 error appears.
If you test the connection to PAW in the Options box, it is OK.

Possible solution:
Inside the admin tool for PAW (Planning Analytics Workspace), the value for TM1 Application Server Gateway URI is wrong; it may point to HTTPS when it should point to HTTP.
This can happen if you have changed your TM1Web (WLP) setup from HTTPS to HTTP to do some testing.

Copy the URL into a web browser (IE) and test if you can surf there.
An example of such a URL is:
http://tm1webservername.domain.com:9510

More information:
https://www.exploringtm1.com/client-connection-to-pax/
https://www.ibm.com/communities/analytics/cognos-analytics-blog/cognosanalytics-and-planninganalytics-integration-walkthrough-part-3/

Product:
Planning Analytics 2.0.5 TM1
Microsoft Windows 2016 terminal server
TM1 package connector version 10.2.6100.8-0

Issue:
Inside TM1 Architect you get an error when you run a TI process that uses the Package Connector. The error happens when you start to import data into TM1 with the process.

Error Message:
Error: MetaData procedure line (0): CCognosPackage::BuildDataSource Exception: (TR3117) Unable to retrieve security objects. Please verify credentials.
- CCL-BIT-0005 A socket reported a communication error.
- CAM_Connect=0xffffffff -2113929065CAM-CRP-0321 The GSKit function 'gsk_environment_init' failed with the error code '9'

Solution:
Ensure that the Planning Analytics server, where the TM1 applications are, has the GSK*.dll files that come with the TM1 Package Connector version. If they have been replaced to solve other issues, you need to copy them back.

A clean new install of TM1 Package Connector on the PA 2.0.5 server can also help.

More information:

http://www-01.ibm.com/support/docview.wss?uid=swg21988056

Product:
Planning Analytics 2.0.5
PAX and PAW
Cognos Analytics 11.0.12
Microsoft Windows 2016 server

Problem:
When starting Excel and selecting Planning Analytics for Excel (TM1), you get an error after the login dialog.

Error message:
Cognos.Office.Tm1.Connections.Tm1WebException: The remote server returned an error: (503) Server Unavailable. ---> System.Net.WebException: The remote server returned an error: (503) Server Unavailable.
at System.Net.HttpWebRequest.GetResponse()
at Cognos.Office.Tm1.Connections.Tm1Request.GetResponse()
--- End of inner exception stack trace ---
at Cognos.Office.Tm1.Connections.Tm1Connection.HandleWebException(Tm1WebException x, Uri path)
at Cognos.Office.Tm1.Connections.Tm1Connection.GetResponse(Tm1Request request)
at Cognos.Office.Tm1.Connections.Tm1Connection.SendRequest(Tm1Request webRequest, AcceptReturnEnum returns)
at Cognos.Office.Tm1.Connections.Tm1Connection.SendRequest(String method, String path, String postData, AcceptReturnEnum returns, Int32 timeout)
at Cognos.Office.Tm1.Connections.Tm1Connection.Get(String path, AcceptReturnEnum file, Int32 timeout)
at Cognos.Office.Tm1.Rest.Tm1RestCapabilities..ctor(Tm1Connection connection, String server, Int32 timeout)
at Cognos.Office.Tm1.Rest.Tm1RestConnection.RefreshCapabilities(String server, String path)
at Cognos.Office.Tm1.Rest.Tm1RestServer.UpdateCapabilities(String serverName)
at Cognos.Office.Tm1.Rest.Tm1RestDataSource.UpdateCapabilities(String serverName)
at Cognos.Office.Tm1.Tm1DataSource.On_LogOn(String server)
at Cognos.Office.Tm1.Tm1DataSource.Logon(LogonType logonType, DataSourcePropertiesCollection properties)
at Cognos.Office.Framework.Communications.DataSourceManager.LogOn(String dsType, String connectionString, String serverName, IDataSource& source)

Solution:
Check the values in the PAW admin tool, especially the TM1 Application Server Gateway URI. If that is wrong, for example using HTTP instead of HTTPS when you have switched TM1Web to use SSL, you get the above error inside PAX. Surfing to PAW will still work fine.

When a user starts Excel and PAX, the first connection is to the PAW server.
Then the PAW server checks the TM1S.CFG for the value ClientCAMURI=http://caservername.domain.com/ibmcognos/bi/v1/disp
and this URL is used by PAX to log you in. After you log in, the pmhub.html file needs to list the URL to PAW to allow you to get back.
You need to check the pmhub.html file in the folders c:\Program Files\ibm\cognos\analytics\webcontent and c:\Program Files\ibm\cognos\analytics\webcontent\bi.
Open it in Notepad++ and check that the PAW server's name or DNS alias is listed in var pmhubURLs = ["https://tm1web.domain.com:9443", "https://centospaw.lab.local"];
When you are back at the PAW server in the communication route, PAX uses the PAW admin tool's value for TM1 Application Server Gateway URI before it shows that you are connected. If that value does not point to the CA11 server, you get the above error.

PAX does not use the PAW admin tool's value for IBM Cognos BI Gateway URI; instead, the value export TM1LoginServerURL="https://tm1serverhost:12345" in the paw.env file is used. The port points to the TM1 application that has HTTPPortNumber = 12345 in its tm1s.cfg file.

If the value in pmhub.html is wrong, you get an error like "Cannot redirect back to planning analytics because the host URL (http://centospaw.lab.local) has not been whitelisted."

Do not enter the port number for the PAW server in the pmhub.html file. The example below summarizes the values involved.
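
To summarize the chain above, here is a hedged example of the values involved, using the example host names from this article (your host names and port number will differ):

In tm1s.cfg on the TM1 server:
ClientCAMURI=http://caservername.domain.com/ibmcognos/bi/v1/disp
HTTPPortNumber=12345

In paw.env on the PAW server:
export TM1LoginServerURL="https://tm1serverhost:12345"

In pmhub.html (webcontent and webcontent\bi) on the CA11 server:
var pmhubURLs = ["https://tm1web.domain.com:9443", "https://centospaw.lab.local"];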

Product:
Planning Analytics 2.0.4 (TM1)
Microsoft Windows 2016 server

Problem:
TM1 Architect connects to the TM1 applications over the wrong network card on the Microsoft Windows server.

Possible solution:
The network card with the lowest metric value is selected from all the network cards to be the default network card. Use Get-NetIPInterface to see the metrics.

Start Powershell on the Windows server
Enter: Get-NetIPInterface
This will list the network interfaces on the server.
Check the values for ifIndex and InterfaceMetric (a low value indicates a higher priority).
The card with the lowest value for InterfaceMetric will be used by TM1.
Is this the correct one, or is it for a backup network?

ifIndex  InterfaceAlias               AddressFamily  NlMtu(Bytes)  InterfaceMetric  Dhcp      ConnectionState  PolicyStore
-------  --------------               -------------  ------------  ---------------  ----      ---------------  -----------
18       Ethernet 2                   IPv6           2800          20               Enabled   Connected        ActiveStore
1        Loopback Pseudo-Interface 1  IPv6           4294967295    50               Disabled  Connected        ActiveStore
18       Ethernet 2                   IPv4           2800          20               Disabled  Connected        ActiveStore
12       Ethernet                     IPv4           1500          10               Enabled   Connected        ActiveStore
1        Loopback Pseudo-Interface 1  IPv4           4294967295    50               Disabled  Connected        ActiveStore

Change it with Set-NetIPInterface, for example:
Set-NetIPInterface -InterfaceIndex <ifIndex> -InterfaceMetric <metric>
Example:
Set-NetIPInterface -InterfaceIndex 12 -InterfaceMetric 50
Set-NetIPInterface -InterfaceIndex 18 -InterfaceMetric 10
The above makes card 18 the new default.
Check the IP address of the card by entering IPCONFIG
Ethernet adapter Ethernet 2:

Connection-specific DNS Suffix . :
Link-local IPv6 Address . . . . . : fe80::a8cd:d4c4:120e:3ed1%18
Autoconfiguration IPv4 Address. . : 169.254.62.209
Subnet Mask . . . . . . . . . . . : 255.255.0.0
Default Gateway . . . . . . . . . : 25.255.255.254

Start TM1 Architect
Expand a TM1 application.
On the right side, see the IP address that will be used.

More information:
https://docs.microsoft.com/en-us/windows-server/networking/technologies/network-subsystem/net-sub-interface-metric
https://social.technet.microsoft.com/Forums/windows/en-US/cb8dac7f-5f04-42b1-8065-a95c946f6ec2/change-network-adapter-priority-order?forum=ws2016