Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

Problem:
When you run a TI process to export data to a CSV file, you get an error inside your TM1 application. You have recently moved from TM1 to Planning Analytics, and also to a new operating system: Microsoft Windows Server 2016.
The file share you try to export to is on a Linux server.
It works fine from your old Microsoft Windows 2008 server.

Error message (as shown on Windows 10 and Windows Server 2016):
You can’t connect to the file share because it’s not secure. This share requires the obsolete SMB1 protocol, which is unsafe and could expose your system to attack.
Your system requires SMB2 or higher. For more info on resolving the issue see: https://support.microsoft.com/en-my/help/4034314/smbv1-is-not-installed-by-default-in-windows

Possible Solution:
The new Windows Server 2016 has been set up to require the newer, more secure file share (SMB) protocol, and the Linux server runs an older OS version that does not support it.
Try to use a different file share that both the old Linux server and the Microsoft Windows Server 2016 server can access.

First check that there is no firewall issue; use TELNET to see if the ports are open. Run the commands below from the TM1 Windows server:
telnet linuxserver 445
telnet linuxserver 139
If the commands above give an error, check the firewall settings in the network.
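
If the TELNET client is not installed on the Windows server (it is not installed by default on Windows Server 2016), you can do a similar port test from PowerShell instead. A minimal sketch, where linuxserver is a placeholder for your file server name:

# Test that the SMB ports are reachable from the TM1 server
Test-NetConnection -ComputerName linuxserver -Port 445
Test-NetConnection -ComputerName linuxserver -Port 139

TcpTestSucceeded : True in the output indicates that the port is open.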

You can try to access the file share directly from a CMD prompt, logged in as the service account on the TM1 server, to ensure that the account has access:
net use * \\linuxserver.domain.com\filesharename

It can report “System error 64 has occurred” if you do not have correct SMB access.

Also check that the IBM TM1 instance runs under a domain service account, and not Local System. The Local System account cannot access network file shares.
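
A quick way to see which account the TM1 services run under is from PowerShell. A minimal sketch, where the name filter *tm1* is an assumption you may need to adjust to your instance names:

# List TM1-related services and the account they log on as
Get-CimInstance Win32_Service | Where-Object { $_.Name -like '*tm1*' } | Select-Object Name, StartName, State

If StartName shows LocalSystem, change the service to run under a domain service account.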

Red Hat Enterprise Linux 7.2 (which includes samba-4.2) and later comes with proper support for the SMBv2 protocol, but earlier releases of Red Hat Enterprise Linux only support SMBv1.

From the internet:
Samba comes from the Linux/UNIX world.
SMB/CIFS comes from the Windows/Microsoft world.

NOTE: when people say they have a “CIFS share”, it is better to say an “SMB share” or a “Samba share” – more on this below:
They use the same protocols to talk to each other.

Samba was originally made to emulate SMB, so that Linux PCs could share files with Windows PCs. Now Macs also have Samba, so they support SMB. So Macs, Windows, and Linux can all happily talk via Samba and SMB.

With each new version of Windows, a new SMB version comes out. The Samba team then has to update their code to support the new features in SMB.
Windows:
SMB 1 – Windows 2000
SMB 2 – Windows Server 2008 and Windows Vista SP1
SMB 2.1 – Windows Server 2008 R2 and Windows 7
SMB 3.0 – Windows Server 2012 / ? and Windows 8 / 10

To identify the SMB version in use:
On Windows 8.1 or Windows Server 2012 and later, you can use the PowerShell cmdlet Get-SmbConnection (run PowerShell as administrator).

You cannot interrogate which SMB version is in use on Windows 7.
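
For example, after you have mapped or browsed to the share, you can list the negotiated SMB dialect per connection from an administrator PowerShell prompt:

# Show which SMB dialect each active connection negotiated
Get-SmbConnection | Select-Object ServerName, ShareName, Dialect

The Dialect column shows the SMB version in use (for example 2.10, 3.02, or 3.1.1).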

SMB 1 was introduced in the DOS days, and was also called CIFS in its later version (think of it like SMB 1.1). The first versions of Samba 1.x supported SMB and CIFS.
SMB 2.0 / SMB 2.02, introduced with Windows Vista / 2008, is supported by Samba 3.6.
SMB 2.1, introduced with Windows 7 / Windows 2008 R2, is supported by Samba 4.0.0.
SMB 3.0, introduced with Windows 8 / Windows 2012, is supported by Samba 4.2.
SMB 3.02, introduced in Windows 8.1 / Windows 2012 R2, is not yet supported by any version of Samba (it is in the works, I assume).
SMB 3.11, introduced in Windows 10 / Windows 2016, is not yet supported by any version of Samba (it is in the works, I assume).

In the latest updates of Windows 10 and Windows Server 2016, support for SMB1 is automatically removed by Microsoft if SMB1 is not in use.

How to remove SMB support:
https://support.microsoft.com/en-us/help/2696547/how-to-detect-enable-and-disable-smbv1-smbv2-and-smbv3-in-windows-and

Start a PowerShell prompt (run as administrator).
1) Check which SMB versions are enabled and which are disabled:

Get-SmbServerConfiguration | Select EnableSMB1Protocol, EnableSMB2Protocol

Example output: True = enabled

2) To enable SMB 1, 2, or 3, use the following (shown for SMB2; the -EnableSMB2Protocol switch also controls SMB3):
Set-SmbServerConfiguration  -EnableSMB2Protocol  $True

3) To disable SMB 1, 2, or 3, use the following (shown for SMB2):
Set-SmbServerConfiguration  -EnableSMB2Protocol  $False

Restart the computer or server after every change.
Or do this on Windows Server 2012 R2 & 2016:
SMBv1
Detect: Get-WindowsFeature FS-SMB1
Enable: Enable-WindowsOptionalFeature -Online -FeatureName smb1protocol
Disable: Disable-WindowsOptionalFeature -Online -FeatureName smb1protocol

More information:
https://access.redhat.com/articles/3164551
https://www.rootusers.com/disable-smb-version-1-0-in-windows-server-2016/
https://www.mowasay.com/2018/08/windows-10-2016-build-1709-1803-cannot-connect-to-smb-shares/
http://www.admin-magazine.com/Archive/2017/40/SMB-3.1.1-in-Windows-Server-2016

Product:
Planning Analytics 2.0.5
Microsoft Windows 2016 server
Microsoft Excel 365 Office version 1803 click-to-run
Planning Analytics for Excel from C:\Program Files\ibm\cognos\IBM for Microsoft Office\cmplst.txt
[Main Applications]

COR_APP_version=COR_APP-AW64-ML-RTM-11.0.35.14-0
COR_APP_name=Cognos 8 Analysis for Excel
CAFES_version=CAFES-AW64-ML-RTM-10.3.0.1-0
CAFES_name=Cafes for Excel
CORREDIST_version=CORREDIST-AW64-ML-RTM-11.0.35.13-0
CORREDIST_name=IBM Planning Analytics for Excel
COI_version=COI-AW64-ML-RTM-11.0.35.7-0
COI_name=IBM Cognos COI

Problem:
When you insert a custom report or dynamic report on a sheet, you get #NAME? instead of the numbers for the formula =DBRW($A$1,$A10,$B$2,E$6,$B$4,$B$3).
Clicking Rebuild Sheet does not help.
Inserting a quick report on the sheet gives you numbers.

Solution:
Check that all the add-ins for Excel are installed.
Inside Excel, go to File – Options.

Click on Add-ins.
At Manage: Excel Add-ins, click Go.

Here you can see that the TM1 part is missing.
Click Browse and go to the folder C:\Program Files\ibm\cognos\IBM for Microsoft Office.

Select CognosOfficeTM1.xll and click OK.

Now you have the add-ins you need. Click OK.

You need both the IBM Cognos Office Reporting TM1 add-in and the IBM Framework for Office COM add-in to make PAX work.
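
If you want to check from outside Excel which XLL add-ins are registered to load at startup for the current user, you can read the Excel Options registry key from PowerShell. A minimal sketch, assuming Office 2016/365 (the 16.0 key); adjust the version number to your Office release:

# List the OPEN* values, which hold the XLL add-ins Excel loads at startup
Get-ItemProperty 'HKCU:\Software\Microsoft\Office\16.0\Excel\Options' | Select-Object OPEN*

CognosOfficeTM1.xll should appear in one of the OPEN values once the TM1 add-in is registered.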

More Information:
https://www-01.ibm.com/support/docview.wss?uid=swg22004391

Product:
Cognos Planning Analytics 2.0.6
Microsoft Windows 2012 R2 Server

Problem:
The log files for the Operations Console fill up the C: drive, where the software is installed. Can we change the log files to be created on a separate hard disk, named L:\?

Solution:
Log in to the Windows server where you run TM1WEB (and also the Operations Console).
Stop the service IBM Cognos TM1.
Cut the folder C:\Program Files\ibm\cognos\tm1_64\bin64\opsconsoledata.
Paste it at L:\.
Rename L:\opsconsoledata to L:\opslogs.
Start a command prompt as administrator.
Enter the command:

mklink /D  "C:\Program Files\ibm\cognos\tm1_64\bin64\opsconsoledata"  L:\opslogs

Start the service IBM Cognos TM1.
Now the logs are on the L: drive.
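
If you prefer PowerShell over mklink, the same directory link can be created with New-Item from an administrator PowerShell prompt (paths as in the steps above):

# Create a directory symbolic link that points the old path to the new log folder
New-Item -ItemType SymbolicLink -Path 'C:\Program Files\ibm\cognos\tm1_64\bin64\opsconsoledata' -Target 'L:\opslogs'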

NOTE: This is not a supported function, and should not be used in production environments.

Surf to opsconsole at the following web address:   http://servername:9510/pmhub/pm/opsconsole

More information:
https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/t_tm1_inst_installingopsconcole.html

https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_op.2.0.0.doc/c_tm1_ops_console_configuration_tasks.html

https://skimfeed.com/blog/symbolic-links-in-windows-for-pointing-a-folder-to-another-folder-on-an-external-hard-drive-or-ssd/

Product:
Planning Analytics 2.0.5
Microsoft Windows 2016 server

Problem:
How do you change the number of columns shown and loaded in TM1 Web?

Suggested Solution:
Update the file tm1web_config.xml, which is in the folder
C:\Program Files\ibm\cognos\tm1_64\webapps\tm1web\WEB-INF\configuration on the TM1WEB server.
Change the lines below to numbers that work for your application:

<!-- CubeViewerRowPageSize: Number of rows to fetch in a page of cubeviewer -->
<add key="CubeViewerRowPageSize" value="250" />
<!-- CubeViewerColumnPageSize: Number of columns to fetch in a page of cubeviewer -->
<add key="CubeViewerColumnPageSize" value="100" />
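
To quickly see which values are currently set, you can search the file from PowerShell. A minimal sketch using the default install path from above:

# Show the current page size settings in tm1web_config.xml
Select-String -Path 'C:\Program Files\ibm\cognos\tm1_64\webapps\tm1web\WEB-INF\configuration\tm1web_config.xml' -Pattern 'CubeViewerRowPageSize|CubeViewerColumnPageSize'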

In TM1 10.2 FP1, two settings control the number of columns and rows that are loaded in TM1 Web websheets:
WebsheetRowThreshold and WebsheetColumnThreshold.
http://www-01.ibm.com/support/docview.wss?uid=swg27040580

In the documentation for Planning Analytics 2.0 these settings are not found; instead you find the settings CubeViewerRowPageSize and CubeViewerColumnPageSize.
https://www.ibm.com/support/knowledgecenter/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_modifyingtm1webconfigurationparameters_n50ce5.html

The tm1web_config.xml.new file does not contain the WebsheetRowThreshold and WebsheetColumnThreshold settings.

The old parameters configure the number of rows and columns in a websheet that are fetched completely on loading. If the sheet fits roughly within the number of rows and columns specified by the parameters, it is fully loaded on startup.
If a sheet is larger than the values set by the parameters, then Cognos TM1 Web fetches additional data on demand as you scroll beyond the buffers of the previously loaded data.

The new Planning Analytics has an improved TM1WEB solution that caches values differently, so the old parameters above should not be needed for the new TM1WEB.

The parameters in this file, tm1web_config.xml, control the following IBM TM1 Web features.

  • View node
  • Cube Viewer page size
  • Number of sheets to export from a Cube Viewer
  • IBM TM1 Web startup and appearance settings
  • Session timeouts

https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_ug.2.0.0.doc/c_tm1web_cfg_params_v10r2.html

Product:
Cognos Planning Analytics Workspace version 36
Centos Linux

Problem:
How to upgrade PAW?
https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/t_paw_upgrade.html

Solution:
Download the latest version from
https://www-01.ibm.com/support/docview.wss?uid=swg27049597

First check that there is space left on your Linux box with the command df -h.

Create a new folder for your new software
cd /
mkdir ibm
cd ibm
mkdir paw36

Copy the software over to the Linux box with WinSCP:
https://winscp.net/eng/download.php

Place the zip file in folder /ibm/paw36.
Unzip the file with the command:
unzip ipa_workspace_local_2.0.36.1440.5.zip -d /ibm/paw36

Go to the existing installation's scripts folder (/ibm/paw/scripts/) and run a backup with the command:
./backup.sh

Copy the paw.env file from your current installation to the new installation location.
cp /ibm/paw/config/paw.env  /ibm/paw36/config/paw.env

Copy the /config/certs directory from your current installation to the new installation location.
cp /ibm/paw/config/certs/*.* /ibm/paw36/config/certs/

If you use SSL and have saved the certificate in the pa-workspace.pem file, you need to copy that from your current installation to the new installation location.
cd /ibm/paw36/config
mv pa-workspace.pem  pa-org-workspace.pem
cp  /ibm/paw/config/pa-workspace.pem  pa-workspace.pem

If you have changed the old Start.sh file, you need to add the same values in the new Start.sh file.
Open Start.sh in nano, add the IP address of the Linux server as ADMINTOOL_IP, and add the folder where docker-compose is located to the PATH with PATH=$PATH:/ibm/comp/
cd /ibm/paw36
nano Start.sh
Enter values like:
export ADMINTOOL_IP=192.168.254.11
export PATH=$PATH:/ibm/comp/
Press CTRL+O and then ENTER to save the file.
Press CTRL+X to exit.

Change the path above to the folder where docker-compose is saved.

You may need to update the permissions on the new files with the commands:
cd /ibm/paw36
chmod 744 *.*
This gives all users read access to the files.

Stop the running PAW with command:
cd /ibm/paw/scripts
./paw.sh stop

Go to the new folder with command:
cd /ibm/paw36

Start the installation by entering:
./Start.sh

Enter “y” when you are prompted to install the Docker images. Enter “y” when you are prompted to open the administration tool.

Start Internet Explorer on your laptop and surf to the Linux server at http://192.168.254.11:8888

Inside the Planning Analytics Workspace administration tool, check that all URLs to CA11 and TM1 are correct.
https://www.ibm.com/communities/analytics/cognos-analytics-blog/cognosanalytics-and-planninganalytics-integration-walkthrough-part-3/

Some values need to be entered again on the Configuration tab; do that and save them.


Click Validate.
Click Status.
Click Restart.
Wait until CPU is below 1%.
Click Refresh.

Then surf to HTTP://pawservername.domain.com

Normally the Docker containers and images are in the folder /var/lib/docker.

The daemon.json file is in the folder /etc/docker.

NOTE: To avoid having too many PAW folders, which makes it hard to know which folder is in use, you can do the opposite: rename the old folder /ibm/paw to /ibm/pawold, create a new folder /ibm/paw, copy the new zip file to that folder and unpack it, then copy the needed config files from /ibm/pawold to the /ibm/paw folder (as listed above), and finally start the installation (upgrade) from /ibm/paw/Start.sh.

More Information:
http://blog.thoward37.me/articles/where-are-docker-images-stored/

Product:
Planning Analytics Workspace 35
Linux Red Hat

Problem:
You have no access to PAW, and a restart of the Docker containers with the commands ./paw.sh stop and ./paw.sh did not help.

If you click Validate inside the Workspace administration page, you get the error getaddrinfo ENOTFOUND on all the server links.

Error Message:
Proxy Error
The proxy server received an invalid response from an upstream server.
The proxy server could not handle the request GET.
Reason: DNS lookup failure for: wa-proxy

Workaround:
Restart the Linux box:
Log in to the Linux server with PuTTY.
Change to the root user with the command:
sudo su - root
Restart the Linux server with the command:
init 6

Try again after a few minutes.

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

It is recommended to have only valid TM1S.CFG parameters in the file, since the default values change with each version of TM1. You need to check the values and remove the unnecessary ones. https://www.ibm.com/support/knowledgecenter/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/c_parametersinthetm1s.cfgfile_n1503fe.html

(This is a copy of text from this site – it contains good information)
https://cubewise.com/blog/adjusting-server-configuration-planning-analytics/

Planning Analytics, or TM1 server v11, comes with a host of new features. In order to get the best performance and control the behavior of the TM1 server when upgrading (or considering an upgrade) to PAL, you should make sure that, as a minimum, you optimize your server configuration file (tm1s.cfg) to get the most out of the latest version. This article is part of our series on upgrading from TM1 to Planning Analytics. To get an overview of other areas to consider, please refer to this topic.

Not all the parameters examined here are strictly brand new to v11; some have been around for a while, have changed default values, or might now be redundant, and therefore require a revisit.

One of the most significant changes to TM1 server behavior between versions 10.2.2 and 11.x is that almost all server configuration parameters are now dynamic. Even parameters concerning login and security are now dynamic. The parameters remaining static or immutable for the duration of the server session are now only a handful (where also it must be said that static makes sense) e.g. AdminHost, ServerName, DataDirectory, PortNumber. This is a huge change which should be acknowledged and hats off to IBM for achieving this.

tm1s.cfg parameters worth revisiting

  • HTTPPortNumber (static, default=5001) Determines the port number used by the Rest API. With the Rest API becoming the default API for the TM1 server it is important to set this parameter and not use the default value.
  • HTTPSessionTimeoutMinutes (dynamic, default=20) Determines the idle timeout for Rest sessions. We suggest matching the value of HTTPSessionTimeoutMinutes to IdleConnectionTimeoutSeconds.
  • UseSSL (static, default=T) Determines whether the server uses encrypted communication.
  • MaximumCubeLoadThreads (static, default=0) By default this parameter is disabled. In most situations Cubewise has recommended enabling. However, this recommendation changes with Planning Analytics as the new parameter MTCubeLoad should be used in its place. Having this parameter enabled can cause the server to crash during start-up in PA. MaximumCubeLoadThreads should now be set to 0 or removed from the tm1s.cfg file.
  • MTQ (dynamic, default=ALL) Multi-Threaded Query mode is now enabled by default to use all available cores (default value ALL or -1). Our recommendation however remains to set MTQ=-2 which will set MTQ to use all available cores less one which allows other operations on the server to be processed even under peak load from the TM1 server.
  • MTQQuery (dynamic, default=T) Determines whether TI processes utilize MTQ to materialize data source cube views.
  • PersistentFeeders (static, default=T) Persistent Feeders are now enabled by default allowing for faster server load times (at the cost of additional disk consumption for .feeders files and longer time to save all server data). The default value is recommended.
  • IntegratedSecurityMode (dynamic, default=1) This parameter is now dynamic, meaning for example that login method could be changed from CAM to TM1 security or vice versa while the server is in session. This has been discussed previously on our blog here.
  • UserDefinedCalculations (dynamic, default=T) We are including this parameter here because as of the latest 2.0.x PAL release user defined calculations are not supported in the new Rest based clients (Workspace, PAX) and it remains unclear whether or how UDCs will be supported in the future.

Brand new parameters

  • EnableNewHierarchyCreation (static, default=F) Alternate hierarchies are the banner new feature of Planning Analytics however are disabled by default. To enable hierarchies this parameter must be set to T. If you plan on implementing hierarchies make sure you also read our blog posts on the changes you need to consider to rules and TI processes as this can have far reaching impacts on your model.
  • MTCubeLoad (dynamic, default=F) This parameter replaces MaximumCubeLoadThreads. With MTCubeLoad cubes are still loaded in serial but multiple threads are used to load each cube. We recommend enabling this feature as this dramatically decreases server load times. The feature is disabled if MTQ=0 or MTQ=1.
  • IndexStoreDirectory (dynamic, default=DataDirectory) When MTCubeLoad is enabled this parameter defines the directory where index and bookmarks are stored to further optimize the server load phase. We recommend setting a directory other than the data directory.
  • MTFeeders (dynamic, default=F) Enables multi-threaded feeder evaluation during saving of rules or running the CubeProcessFeeders function. Enabling this function is recommended as this can significantly speed up saving of rules.
  • MTFeeders.AtStartup (dynamic default=F) Enables multi-threaded feeder evaluation during the server load phase. Combined with MTCubeLoad this can lead to dramatic reduction of startup times to a level comparable to the load time achieved with Persistent Feeders. We recommend enabling this feature.
  • PreallocatedMemory.Size, PreallocatedMemory.BeforeLoad (dynamic, default=F) These parameters can be used to fine-tune the behaviour of MTCubeLoad.
  • CreateNewCAMClients (dynamic, default=T) This parameter was introduced in 10.2.2 FP5 so is not strictly new with PA but due to the recent introduction is probably still unknown to most administrators. Setting this parameter to F disables the creation of new clients in a server using CAM authentication. This can reduce locking contention experienced when new clients attempt to log in.
  • EnableSandboxDimension (dynamic, default=F) Enabling this parameter allows sandboxes to be used as a virtual dimension when querying cubes. This is a fantastic feature however should be used with caution as once enabled all Rest and MDX queries will expect sandbox to be addressed as the 1st dimension in all cube queries.
  • EnableTIDebugging (dynamic, default=F) Debug mode for TurboIntegrator is another valuable new feature which is disabled by default. We recommend enabling this parameter on all non-production servers. The default value can be left for production environments where step-through debugging is not required. However, as the parameter is dynamic it can be enabled if needed.
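
To tie this together, here is an illustrative tm1s.cfg fragment that reflects the parameters discussed above; the server name, ports and directories are made-up examples, not values to copy as-is:

# Static parameters (example values)
ServerName=Planning
AdminHost=tm1adminhost
DataDirectory=D:\TM1\Planning\data
PortNumber=12345
HTTPPortNumber=12346
UseSSL=T
PersistentFeeders=T
EnableNewHierarchyCreation=T

# Dynamic parameters (example values)
MTQ=-2
MTQQuery=T
MTCubeLoad=T
IndexStoreDirectory=D:\TM1\Planning\indexstore
MTFeeders=T
MTFeeders.AtStartup=T
IntegratedSecurityMode=1
HTTPSessionTimeoutMinutes=20
IdleConnectionTimeoutSeconds=1200
# EnableTIDebugging is recommended for non-production servers only
EnableTIDebugging=T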

In summary

Planning Analytics has several new features which require administrators to change default server configurations to access the new features. Even if you don’t plan on implementing alternate hierarchies we recommend using all of the new parameters listed above. When launching new features IBM has a history of first disabling new features by default and in later releases changing the default behaviour. Examples of this can be seen with ParallelInteraction, PersistentFeeders and MTQ which were all initially disabled by default but are now enabled by default. We would expect the default values for EnableNewHierarchyCreation, MTCubeLoad & MTFeeders to all switch to being True within the next few releases.

Should you want to delve even deeper into optimizing tm1s.cfg parameters the Cubewise Code blog has a good article on this topic.

Product:
Planning Analytics 2.0.5
Microsoft Windows Server 2016

(this is a copy of text from IBM site)
https://www-01.ibm.com/support/docview.wss?uid=ibm10737313&myns=swgimgmt&mynp=OCSSD29G&mync=F&cm_sp=swgimgmt-_-OCSSD29G-_-F

Summary

The purpose of this document is to help with understanding and managing the memory utilization of the TM1 Application Server.

Understanding of Usage

Your TM1 Application Server (JAVA) memory is impacted by the following applications:

  • TM1Web (viewing/exporting data)
  • PMPSVC (viewing/exporting data)
  • Planning Analytics for Excel (PA for Excel, which uses the PMHub service on the TM1 Application Server)
  • Planning Analytics Workspace (if used to display PA for Excel/Perspectives reports)
  • Operations Console (obtaining server status/logging data)

As much as one would like to know what the best settings are or how much memory is required to host the TM1 Application Server, the answer does not come easy. The following questions must be taken into account, and even with answers to them, only load/stress testing can provide an accurate estimate of the resources required for your specific load:

  • How many concurrent users are expected?
  • What application(s) are users expected to use?
  • Are users doing reporting only or is data entry included?
  • How much data is to be consumed by a user?  (how many rows/columns/cells)
  • How much data is expected to be exported?
  • Do web/excel books contain more than one tab?  (is it really 1 report, or 20 reports under 1?)

Applications Settings / Governors

To get a better handle on your TM1 Application Server memory management, the below safeguards/settings should be introduced.  It is recommended to implement more restrictive safeguards, raising the restrictions as required and where it makes sense to do so.

TM1Web (via /tm1_64/webapps/tm1web/web-inf/configuration/tm1web_config.xml):

<!-- If specified, sets a limit for the maximum number of cells in the workbook file. It will not be possible to open workbooks with a number of cells that exceeds this limit. Leaving this blank will allow all workbooks to be opened. -->
<add key="WorkbookMaxCellCount" value="500000" />

<!-- If specified, sets a threshold for limiting of exports using the number of cells. If the set threshold is exceeded when performing an export (looped exports will reach this multiplicatively), then the export will not be allowed for that user. Is applied for all users, and for Websheets and CubeViewer. Leaving this blank will allow for classic-style exporting with no cell count limitations. -->
<add key="ExportCellsThreshold" value="200000" />

<!-- MaximumSheetsForExport: Maximum number of sheets allowed to Export -->
<add key="MaximumSheetsForExport" value="50" />

<!-- Determines the maximum number of exports that can occur concurrently. If less than or equal to zero, will allow virtually unlimited concurrent exports (2147483647 max). The default value is 5. -->
<add key="MaximumConcurrentExports" value="2" />

<!-- CubeViewerRowPageSize: Number of rows to fetch in a page of cubeviewer -->
<add key="CubeViewerRowPageSize" value="100" />

<!-- CubeViewerColumnPageSize: Number of columns to fetch in a page of cubeviewer -->
<add key="CubeViewerColumnPageSize" value="20" />
Planning Analytics for Excel (via Options Menu):

  • Application Settings:
    • Auto spread consolidated input, recommended value: disabled
      •   If not required, disable this setting to prevent users from accidentally spreading
      •   If spreading is allowed, ensure users are educated so that a spread is not attempted on an intersection that spans too many cells
  • Exploration or list settings:
    •  Data display row limit, recommended value: 5000
      •   Reduce the number of rows displayed to ensure a larger than necessary query is not executed
    •  Expand member limit, recommended value: 2000
      •   Reduce the number of members expanded to ensure a larger than necessary query is not executed
  • Planning Analytics for Excel design recommendations:
    • Ensure that zero suppression is enabled where possible, to prevent larger than necessary queries
    • Do not include too many sheets/tabs in a single book; create multiple reports where possible
    • Ensure the Excel sheet does not contain many unnecessary blank/empty columns/rows

 

Tuning the JAVA Process

The TM1 Application Server sits on a WebSphere Liberty application server, which utilizes the Java platform. The details in this section can assist you with further tuning of the Java process for the TM1 Application Server; however, as mentioned earlier in this document, only a proper test/load plan can ensure a successful deployment.

To fine-tune the TM1 Application Server’s Java process, edit the jvm.options file (found in \tm1_64\wlp\usr\servers\tm1). Consider adding the following entries below the existing jvm.options settings (or updating any settings that may already exist). Values should be adjusted according to available resources; for example, don’t allocate an Xmx of 24576m if you have less than 32 GB of RAM on the server.

  • -Xmx12288m
    • Xmx specifies the maximum memory allocation pool (heap) for a Java Virtual Machine (JVM)
      • Xmx is more commonly set using Cognos Configuration.  Changes to Xmx in the jvm.options file will take precedence.
      • Xmx should NOT exceed 24576m, as additional performance problems may be encountered.
  • -Xms12288m
    • Xms specifies the initial memory allocation pool on process startup.
    • Xms should reflect the same value specified by the Xmx parameter.
  • -Xmn6144m
    • Xmn sets the size of the new (nursery) heap.
    • Xmn should be set to half of the value specified by the Xmx parameter.
  • -Xgcpolicy:balanced
    • A garbage collection policy of balanced is recommended when the Xmx exceeds 8192m
    • Depending on application usage, a different garbage collection policy may prove beneficial to performance
      • If using TM1Web, a policy of balanced is recommended
      • If Planning Analytics for Excel is the only application using the TM1 Application Server, a policy of gencon may be better suited
        • Usage of the application, size of reports, and number of users all come into effect. Both policies should be tested accordingly.
  • -Xverbosegclog:${bin_path}/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000
    • Enabling Verbose Garbage Collection logging is recommended
    • The performance impact of having the logging enabled is negligible
    • Should any problems arise with the JAVA process, the output of the Verbose GC logs may prove beneficial to understanding root cause

Using the settings above, the entries in the jvm.options file will look like:
-Xmx12288m
-Xms12288m
-Xmn6144m
-Xgcpolicy:balanced
-Xverbosegclog:${bin_path}/logs/verbosegc_%Y%m%d.%H%M%S.%pid.log,1,20000

 

 

Additional Information

The following documents may prove useful in troubleshooting issues related to the TM1 Application Server

How to Troubleshoot a TM1 Application Server Hang
https://www-01.ibm.com/support/docview.wss?uid=swg22002262

How to Troubleshoot a TM1 Application Server Crash
http://www-01.ibm.com/support/docview.wss?uid=swg22002285

Product:
Planning Analytics 2.0.5 workspace
Microsoft Server 2016

Problem:
You get values into the TM1 application for the yearly cost of a company car, and you want to know what the monthly cost is. You have the values for the start date and the accumulated sum for the month (for example, an accumulated amount of 12 000 over 6 months gives a monthly cost of 2 000).

Starting solution:
This shows part of the solution, to help you understand the ParseDate function.
You must expand this solution to get a fully working TI process.

You can find better solutions here:
http://www.tm1forum.com/viewtopic.php?t=9736

 

Log in to PAW (Planning Analytics Workspace) and your application.
Click on the pencil to make it blue, so you are in edit mode.
Right-click on Dimensions to create a new dimension.
Enter the name of the dimension and click Create.
Click OK and click on Enter new members.
Enter StartDate and click Commit.
Click on the plus sign and select “after selection” to add a new member to the dimension.
Enter the remaining members used below (BreakDate, TotalAmount, MonthAmount) – they will be part of our cube to show the result.
Select BreakDate and right-click to set the format to Date; repeat for StartDate.
Click on Show member attributes to see the format.
Create a new dimension named Result.Log with only one member.
Right-click on Cubes to create a new cube from the dimensions above.
Mark Result.Event and Result.Log and click the arrow to move them over.
Enter Result Cube as the name of the cube, and click Create.
Click on the plus sign to add a new sheet. Select a boxed template.
Select the cube, right-click, and select Add new view.
Click on the box and on the icon to change the rows and columns values.
Right-click on Processes to create a new process. Name the process AmountCalculated.
Expand the code window so you see most of it below the cube view.
Click on the Parameters tab, and click on the plus sign to add input parameters. Here we enter the values used to test the process (in your solution you may get these values from a file instead).
Enter the parameter names and start values (the p prefix shows that it is a parameter variable); the code below uses pBreakDateA, pStartDateA, pTotalDateA and pTotalAmountA.
Go to the Script tab to enter your code.
Enter the code below in the Prolog, after the Generated Statements section:

# define the cube and dimensions to work with
sDimName = 'Result.Log';
sCubename = 'Result Cube';

nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
nStartDateA = ParseDate( pStartDateA, 'yyyyMMdd', 0 );

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));

# check if it is before the break date, then divide by 12
if (nStartDateA < nBreakDateA);
nMonthAmountA = pTotalAmountA / 12;
else;

# check if the period does not fall within the same year
nStartYear = stringtonumber( subst (pStartDateA, 1, 4));
nEndYear = stringtonumber( subst (pTotalDateA, 1, 4));

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog3.txt', numbertostring(nStartYear), numbertostring(nEndYear), pStartDateA, pTotalDateA );

if (nStartYear <> nEndYear);
nTotalDateA = ParseDate( pTotalDateA, 'yyyyMMdd', 0 );
nResult = ((YEAR(DATE( nTotalDateA )) - YEAR(DATE( nStartDateA )))*12) + (MONTH(DATE( nTotalDateA )) - MONTH(DATE( nStartDateA )));

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog4.txt', numbertostring(nStartYear), numbertostring(nEndYear), numbertostring(nResult) );

# if the total spans from the start date to today, you use the same formula but nResult is the total number of months
nMonthAmountA = pTotalAmountA / nResult;
else;
# change here which values you want to use for the calculation
nStartmonth = stringtonumber( subst (pStartDateA, 5, 2));
nEndmonth = stringtonumber( subst (pTotalDateA, 5, 2));
nOfMonths = nEndmonth - nStartmonth + 1;
# calculate the value if the period is less than a year
nMonthAmountA = pTotalAmountA / nOfMonths;
endif;
endif;
Then enter the code below in the Epilog:

# write variable values to a txt file in the server's log folder for debugging
asciioutput ('../logfiles/debuglog2.txt', str(nStartmonth,8,2), numbertostring(nMonthAmountA), numbertostring(pTotalAmountA), numbertostring(nOfMonths));

# create a new dimension element with the date and time as value, to make it unique
sDate = timSt(now, '\Y-\m-\d \h:\i');
DimensionElementInsertDirect( sDimName, '', sDate, 'N' );
# update the log cube with the values
CellPutN( nStartDateA+21916, sCubename, 'StartDate', sDate );
CellPutN( pTotalAmountA, sCubename, 'TotalAmount', sDate );
CellPutN( nMonthAmountA, sCubename, 'MonthAmount', sDate );
CellPutN( nBreakDateA+21916, sCubename, 'BreakDate', sDate );

# 1 in Excel is the year 1900, but in TM1 the serial date base is 1960 - therefore the addition of 21916 days.
Click Validate, and correct any quote characters that are of the wrong format.
Save and run.
Enter your values and click OK.
You get an error if there is no /logfiles/ folder parallel to the data folder.
Create the folder C:\Program Files\ibm\cognos\tm1_64\samples\tm1\logfiles if you use a sample TM1 application that does not have a separate log folder.
Connect to the server and check the log files to see the variable values.
Click Refresh on the view to see the result.

Things to review:
nBreakDateA = ParseDate( pBreakDateA, 'yyyyMMdd', 0 );
The code above changes the string to a serial date number, but it should be changed so that it does not use 1960 as the start date.

if (nStartDateA < nBreakDateA);
The code above checks if the date is before a specific date; you can change this to an automatic check for 12 months.

if (nOfMonths > 12);
# if the total spans from the start date to today, you use the same formula
nMonthAmountA = pTotalAmountA / nOfMonths;
else;
The code above checks if the difference is more than 12 months; you can then enhance it to calculate the value from January to today's date only, if the total is only for this year.

# asciioutput ('../logfiles/debuglog1.txt', str(nBreakDateA,8,2), str(nStartDateA,8,2), numbertostring(pTotalAmountA));
Comment out the debug asciioutput lines in your code when you are done.

More information:
http://www.wimgielis.com/tm1_newdateformatter_EN.htm
https://www.exploringtm1.com/date-time-functions-tm1-10-2/
https://www.exploringtm1.com/accumulating-values-in-tm1/


Product:
Planning Analytics 2.0.5 (PAX)
Microsoft Windows 2016 Server

Issue:
When you select a TM1 server in the list of TM1 connections in Planning Analytics for Excel, an HTTP 503 error appears.
If you test the connection to PAW in the Options box, it is OK.

Possible solution:
Inside the admin tool for PAW (Planning Analytics Workspace), the value for TM1 Application Server Gateway URI is wrong; it may point to HTTPS when it should point to HTTP.
This can happen if you have changed your setup of TM1WEB (WLP) from HTTPS to HTTP to do some testing.

Copy the URL to a web browser (IE) and test if you can surf there.
An example of such a URL is:
http://tm1webservername.domain.com:9510
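
You can also test the URI from PowerShell on the server; a minimal sketch, where the host name is a placeholder for your TM1WEB server:

# Check that the TM1 Application Server Gateway URI answers over plain HTTP
Invoke-WebRequest -Uri 'http://tm1webservername.domain.com:9510' -UseBasicParsing | Select-Object StatusCode

If a status code comes back, the WLP server answers on HTTP; if the address only responds over https://, change the Gateway URI in the PAW admin tool to match.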

More information:
https://www.exploringtm1.com/client-connection-to-pax/
https://www.ibm.com/communities/analytics/cognos-analytics-blog/cognosanalytics-and-planninganalytics-integration-walkthrough-part-3/