Product:

Planning Analytics 2.0.9.3

Microsoft Windows 2019 server

Issue:

How do you move the data folder from server A to server B and keep the TM1 Websheets that are created in applications?

 

Solution:

The TM1 Websheets you publish from TM1 Perspectives are stored under the data folder, in a folder named }Externals.

Take a backup of that folder (for example, D:\TM1 Data\tm1instancename\Data\}Externals).

Copy the data folder from server A to server B.

Put back the }Externals folder, and your previous websheets will be available as before.

More Information:

  • TM1 copies and saves uploaded files on the TM1 server in the directory <server_data_dir>\}Externals.
  • When a file is uploaded to the TM1 server, the file name is appended with a date/time stamp. For example, if you upload the file US Budget.xls to the TM1 server, the file is saved as US Budget.xls_20040702193054.xls.
  • When you delete an uploaded file from a TM1 application, TM1 deletes the copy of the uploaded file from the }Externals directory. The original file outside of TM1, from which the uploaded file was copied, is not deleted.

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=application-behavior-files-uploaded-tm1-server 

https://www.ibm.com/support/pages/http-403-forbidden-error-accessing-tm1-web-url-cognos-connection-or-external-site 

 

Product:

Planning Analytics Workspace 75

Microsoft Windows 2019 server

Issue:

When you are in PAW administration, click on databases, and select one to see the threads, you get an error message.

“There has been an issue processing this request. Please try again later.”

Possible solution:

Change the HTTPPortNumber in the tm1s.cfg file to a port that is open in the network firewall between the TM1 server and the PAW server.

PortNumber=12450
HTTPPortNumber=12550

If the Tm1s.cfg file does not contain the PortNumber parameter, the TM1 server uses port 5000.

When you install a new TM1 server, the default HTTP port number is 12354. Valid port values are between 5000 and 49151.

If the Tm1s.cfg file does not contain the HTTPPortNumber parameter, you cannot use the OData v4 compliant REST API.

Alternatively, open more ports in the firewall between the servers.
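For reference, the relevant part of a tm1s.cfg file might look like this (the port values are only examples; use ports that are open in your firewall):

```
[TM1S]
PortNumber=12450
HTTPPortNumber=12550
```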

 

More information:

https://www.ibm.com/support/pages/planning-analytics-administration-paa-unable-display-status-tm1-databases-agent-status-reflected-unreachable-within-paa-console 

https://community.ibm.com/community/user/businessanalytics/discussion/warning-paw-creatededited-ti-are-not-backwards-compatible-with-architect 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=cmpal-parameters-in-tm1scfg-fileplanning-analytics-engine-configuration-parameters

Product:

Planning Analytics 2.0.9.13

Microsoft Windows 2019 server

Issue:

How do you restrict a user group to see only some TM1 Websheets, and not all, in TM1 Web for a TM1 instance?

Solution:

Inside TM1 Architect, you create the Applications and add websheets, files, or views to them. Create one folder (application) for normal users and another for admin users, then drag the views into the different folders and publish the websheets to them. This gives you two folders under the main application folder.

Right-click on your first application and select New – Application; this gives you a new folder.

Then right-click on the top folder and select Security – Security Assignments.

On the admins folder, set NONE for every group except the administrator group. This ensures that only members of the admin group have access to the folder and any content inside it, such as websheets or links.

On the left you see the objects, such as the folders, and the columns show the groups you have defined. If you use CAM security, you can add the CAM groups as columns. Select the cell for the user group and folder you want to change security for, and set NONE if there should be no access. Click OK.

By creating applications (folders) and setting security on them, you can add views and websheets to a folder, and they will inherit the same access as the folder.

Remember to make the folder and TM1 Websheets public after you have added them to the application.

 

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=developers-organizing-objects-in-tm1-applications

Creating a TM1 Contributor Model

http://web.archive.org/web/20180425055420/http://tm1up.com/10-tm1-applications.html#more-1345

Product:

Planning Analytics 2.0.9.13

Microsoft Windows 2019 server

Issue:

How do you easily create a parameter cube, where you can store common values such as file paths or server names that change when you move the TM1 application between servers?

Suggested solution:

Create a TI process with this Prolog – update the values to your needs:

#--------------------------------------------------------------------
# process to create a simple parameter cube
#--------------------------------------------------------------------

CubeName1 = 'SYS.ParameterCube';
DimName1 = 'sys.value' ;
DimName2 = 'sys.function' ;

# create dimension
IF ( DimensionExists ( DimName1 ) = 0 ) ;
DimensionCreate ( DimName1 ) ;
ENDIF ;

IF ( DimensionExists ( DimName2 ) = 0 ) ;
DimensionCreate ( DimName2 ) ;
ENDIF ;

#add elements to dimension
Elname1 = 'ExportFileArea';
Elname2 ='String';
Elname3 ='Number';
Elname4 = 'Explanation' ;
ElType1 = 's'; 
ElType2 = 'n';

DimensionElementInsert (DimName1, '' ,ElName2, ElType1 ) ;
DimensionElementInsert (DimName1, '' ,ElName3, ElType2 ) ;
DimensionElementInsert (DimName1, '' ,ElName4, ElType1 ) ;
DimensionElementInsert (DimName2, '' ,ElName1, ElType1 ) ;

# then only add extra rows for other functions
# Elname5 = 'Servername';
# DimensionElementInsert (DimName2, '' ,ElName5, ElType1 ) ;

# create a cube
IF( CubeExists( CubeName1 ) = 0 ) ;
CubeCreate ( CubeName1 , DimName2 , DimName1 ) ;
ENDIF ;


# add default values to the matrix
sFirstPath = '\\servername\filesharename\' ;
sExplain = 'The path to be used in beginning of filepath used in file exchange' ;

CellPutS ( sFirstPath , CubeName1 , Elname1 , Elname2 ) ;
CellPutS ( sExplain , CubeName1 , Elname1 , Elname4 ) ;

 

 

Run the TI process to create the dimensions and cube.

Then, in the TI process that should use the parameters, include code like this:

# use the values to write a file

# define variables
CubeName1 = 'SYS.ParameterCube';
Elname1 = 'ExportFileArea';
Elname2 ='String';
vFile = 'magicfile.txt';

# get value from parameter cube
vReplacePath = CellGetS ( CubeName1 , Elname1, Elname2 ) ;

# create the new path and file name
vPath = vReplacePath | 'data\' ;
vFilePath = vPath | vFile ;

# write to the file
AsciiOutput ( vFilePath , 'This is the text written to the file' );

 

This code can be improved – that is up to you.
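The sys.value dimension also contains a Number element, so the same cube can hold numeric parameters. A minimal sketch, assuming the cube above has been created and a hypothetical BatchSize element is added to sys.function:

```
# add a numeric parameter row (run once, e.g. in the create process above)
DimensionElementInsert ( 'sys.function' , '' , 'BatchSize' , 's' ) ;

# store a numeric value against the Number element
CellPutN ( 5000 , 'SYS.ParameterCube' , 'BatchSize' , 'Number' ) ;

# read it back in any process that needs it
nBatchSize = CellGetN ( 'SYS.ParameterCube' , 'BatchSize' , 'Number' ) ;
```

Because the Number element in the last dimension is numeric, the cell holds a numeric value, so CellPutN and CellGetN apply here instead of the string functions.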

More Information:

https://exploringtm1.com/asciioutput-tm1-function-use-syntax/ 

https://www.wimgielis.com/tm1_clearingbigchunksofdata_EN.htm 

https://exploringtm1.com/dimensionelementinsert-tm1-function-use-syntax-and-examples/ 

https://exploringtm1.com/cellputs-tm1-function-use-syntax/ 

Product:

Planning Analytics 2.0.9.3

Microsoft Windows 2019 server

Issue:

In the tm1server.log file there are references to the wrong ODBC connection, like this:

TM1.Process Process “second.process” run from process “main.process” by user “AD/donald”
TM1.SQLAPI Checking Driver Capabilities for database “TEST”
TM1.SQLAPI Driver supports SQLFetchScroll

Possible solution:

Because the Data Source tab points to the TEST ODBC connection, a test connection is made and written to the log file.

Change the ODBC name in the Data Source tab to the expected ODBC connection, for example PROD.

You can still have the TI process change the ODBC connection to the correct one on the Prolog tab, with code like this:

#--------------------------------------------------
# Set source with values from variables you have defined before
#--------------------------------------------------
DataSourceType='ODBC';
DatasourceNameForClient=sODBCConnection;
DatasourceNameForServer=sODBCConnection;
DatasourceUsername=sUser;
DatasourcePassword=sPassword;
DatasourceQuery = sDataSourceQuery ;


#-----------------------------------------------------------------------------
# open the connection to the database ODBC connection
#-----------------------------------------------------------------------------
ODBCOpen(sODBCconnection, sUser , sPassword );

 

# this will change the ODBC connection, so the Metadata and Data tabs use the new ODBC source.

The DatasourceNameForServer variable sets the ODBC source used by the process when it is run from a chore.
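If you open a connection yourself with ODBCOpen in the Prolog, it is good practice to close it when the process is finished. A minimal sketch, using the sODBCConnection variable from above:

```
#--------------------------------------------------
# Epilog: close the ODBC connection opened in the Prolog
#--------------------------------------------------
ODBCClose ( sODBCConnection ) ;
```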

More Information:

https://www.ibm.com/support/pages/tm1sqlapi-01000microsoftodbc-driver-manager-cursor-library-not-used-load-failed

https://www.ibm.com/docs/en/cognos-tm1/10.2.2?topic=variables-datasourcenameforserver 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=tv-turbointegrator-local-variables 

Product:

Planning Analytics 2.0.9.3

Microsoft Windows 2019 server

Issue:

You have created a TM1 Websheet in Excel (TM1 Perspectives) with an Action button that runs a process.

This works fine in TM1 Web, but not in TM1 Application Web (the old Contributor). When you click the button icon in the web page (inside a Contributor session), nothing happens.

Solution:

To be able to click websheet buttons in TM1 Application Web, you first need to take ownership.

Click on the Take Ownership icon.

Then the buttons in the websheet will work in TM1 Application Web (the Contributor session).

When you are done, release the ownership so others can access that company node.

 

 

More information:

https://www.ibm.com/docs/en/cognos-tm1/10.2.2?topic=applications-ownership-bouncing-releasing 

Adding or editing data in the web client allows you to submit information to your datastore. To modify data, your system administrator must grant you access.

 

Data that you can edit has a white background. Read-only data has a gray background. If you are not the current owner, the data opens in a read-only view. To start adding or editing data, or to click a button, click Take Ownership.

You can edit data only if it has a workflow state of Available or Reserved. The icons indicate the workflow state.

Ownership availability for a particular node can be changed depending on how the parent node is opened. For example, contributors and reviewers who open the parent node in IBM® Cognos® Insight are not able to take ownership of the node. See the TM1® Performance Modeler documentation and the Cognos Insight documentation for details on ownership and nodes.

After taking ownership, use the Release icon to release the data so other people can use it. In Cognos TM1 Application Web, you must submit all nodes at the level at which you take ownership, and you can only release ownership at the level where you took ownership.

 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=applications-working-data

You can insert an Action button into a worksheet so users can run a TurboIntegrator process and/or navigate to another worksheet. Users can access these buttons when working with worksheets in Microsoft Excel with TM1, or with Websheets in TM1 Web.

The Action buttons also work in TM1 Application Web, even though this is not stated in the IBM documentation.

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=developers-using-tm1-action-buttons-build-worksheet-applications 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=excel-action-buttons 

https://www.ibm.com/docs/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_inst.2.0.0.doc/tm1_inst.pdf 

Product:

Planning Analytics 2.0.9

Microsoft Windows 2019 server

Issue:

After changes in the Windows registry to prevent the use of TLS 1.0 and TLS 1.1 communication, the ODBC driver to SQL Server does not work.

ODBCOpen ( vSource, vClient, vPassword );  does not work.

The error can be: DCOM was unable to communicate with the computer SQLserver using any of the configured protocols; requested by PID

SQLState: 01000
SQL Server Error:1
Microsoft ODBC SQL Server Driver DBNETLIB ConnectionOpen SECCreateCredentials()
Connection Failed
SQLState: 08001
SQL Server Error:18
SSL Security Error

Solution:

Change the ODBC driver from Microsoft SQL Server ODBC Driver version 10.00.14393 to a newer one, such as Microsoft SQL Server Native Client version 11.00.7462.

Back up the registry values under [HKEY_LOCAL_MACHINE\SOFTWARE\WOW6432Node\ODBC\ODBC.INI\ODBC Data Sources] so you can restore them if needed.

Go to Control Panel – Administrative Tools – ODBC Data Sources (32-bit) to add the new driver with the same name and database selected.

 

You can run the commands below to disable TLS 1.0 and TLS 1.1 on the server:

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server" /v Enabled /t REG_DWORD /d 0 /f

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Server" /v DisabledByDefault /t REG_DWORD /d 1 /f

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client" /v Enabled /t REG_DWORD /d 0 /f

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.0\Client" /v DisabledByDefault /t REG_DWORD /d 1 /f

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server" /v Enabled /t REG_DWORD /d 0 /f 

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Server" /v DisabledByDefault /t REG_DWORD /d 1 /f

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client" /v Enabled /t REG_DWORD /d 0 /f 

reg add "HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Protocols\TLS 1.1\Client" /v DisabledByDefault /t REG_DWORD /d 1 /f

 

More Information:

https://thesecmaster.com/how-to-enable-tls-1-2-and-tls-1-3-on-windows-server/

https://support.microsoft.com/en-us/topic/kb3135244-tls-1-2-support-for-microsoft-sql-server-e4472ef8-90a9-13c1-e4d8-44aad198cdbe

https://support.microsoft.com/en-us/topic/october-20-2020-kb4580390-os-build-17763-1554-preview-ac4799c9-838f-8665-a968-0f19b6cb1049

https://think.unblog.ch/en/how-to-use-tls-1-2-and-tls-1-3-on-windows-server/ 

https://support.site24x7.com/portal/en/kb/articles/how-to-check-if-tls-1-2-is-enabled

Product:
Planning Analytics 2.0.9

Issue:

What are the shortcut keys in TM1 Architect?

Partial solution:

CTRL+I  = indent (tab text)

CTRL+S = save the TI process

CTRL+Z = undo last action

 

Use PAW to get a better user experience. https://pmsquare.com/analytics-blog/2022/5/5/new-cube-viewer-set-editor-in-planning-analytics

To find where the keys are on your keyboard, use these layouts:

United Kingdom

USA

More Information:

https://en.wikipedia.org/wiki/QWERTY

https://revelwood.com/ibm-planning-analytics-tips-tricks-rule-editor-keyboard-shortcuts/

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=r-data-spread-keyboard-shortcuts

https://www.wimgielis.com/tm1_articles_EN.htm

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=features-keyboard-navigation

Product:

Planning Analytics 2.0.9.13 TM1_version=TM1-AW64-ML-RTM-11.0.913.10-0

Microsoft Windows 2019 server

Microsoft SQL Server 2012 Native Client ODBC driver version 11.2.5058.0

Issue:

You cannot open a connection to a Microsoft SQL database via the ODBCOpen command. You can open the connection in the TM1 data source tab and read the data, but not with an ODBCOpen command on the Prolog tab. When you test the ODBC connection in Windows, it works fine with your SQL login.

You get a similar error whether you try a 32-bit or a 64-bit ODBC connection.

In Windows event log you get this error:

Faulting application name: tm1odbcproxy32.exe, version: 0.0.0.0, time stamp: 0x62586410
Faulting module name: tm1odbcproxy32.exe, version: 0.0.0.0, time stamp: 0x62586410
Exception code: 0xc0000005
Fault offset: 0x000058f8
Faulting process id: 0xfc8
Faulting application start time: 0x01d93a5fe5481324
Faulting application path: C:\Program Files\ibm\cognos\tm1_64\bin64\tm1odbcproxy32.exe
Faulting module path: C:\Program Files\ibm\cognos\tm1_64\bin64\tm1odbcproxy32.exe
Report Id: cadfaa7a-2357-4370-9ecf-3c6500029586
Faulting package full name:
Faulting package-relative application ID:

Solution:

The connection needs to use Unicode. Use ODBCOpenEx instead.

The format is: ODBCOpenEx ( dataset name, dataset client name, client password, use-Unicode-interface flag )

https://www.ibm.com/docs/en/cognos-tm1/10.2.2?topic=functions-odbcopenex

On the TM1 Architect data source tab, you get the same error if you uncheck the "Use Unicode" checkbox.

In the Prolog, enter this code:

 

#--- debug setup
sDEBUGfile = 'debugfile3.txt';
sDEBUGpath = 'c:\temp\';
sDEBUG1 = sDEBUGpath | sDEBUGfile ;

#-- setup the database connection
sODBCname = 'windows2016';
sUser = 'donald';
sPassword = 'Password!';

#-- open the connection to the database with unicode by adding the 1 parameter
ODBCOpenEx(sODBCname, sUser, sPassword,1);

#-- create the sql statement - adjust for your test database
sSQL = 'TRUNCATE TABLE AdventureWorksLT2019.dbo.Lista';

ASCIIOutput ( sDEBUG1, sSQL, 'check sql syntax');

#-- execute the SQL statement
ODBCOutput (sODBCname , ( sSQL) );

You can open the SQL ODBC connection in the Prolog, even when other data sources are in use.

sODBCname = 'windows2016'; should be the name of the ODBC connection you want to use; note that it can be case-sensitive.

Try changing the ODBC driver in Windows Control Panel on your Windows server; a newer ODBC driver can help.
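The open Unicode connection can also run other SQL statements via ODBCOutput. A sketch assuming the same example table as above, with a hypothetical Name column; CHAR(39) produces the single-quote character needed around SQL string literals:

```
#-- build an INSERT statement from TI variables
sName = 'TestRow';
sSQLinsert = 'INSERT INTO AdventureWorksLT2019.dbo.Lista (Name) VALUES (' | CHAR( 39 ) | sName | CHAR( 39 ) | ')' ;

#-- execute it on the open connection
ODBCOutput ( sODBCname , sSQLinsert ) ;

#-- close the connection when done
ODBCClose ( sODBCname ) ;
```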

 

 More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=tv-turbointegrator-local-variables

https://www.bi4all.pt/en/news/en-blog/cognos-tm1-dynamic-management-of-data-sources-and-connections-through-configuration-cubes/

https://www.tm1forum.com/viewtopic.php?t=11978

https://www.tm1forum.com/viewtopic.php?t=10947

https://www.tm1forum.com/viewtopic.php?t=16234

https://quebit.com/askquebit/IBM/how-to-speed-up-tm1-odbcoutput/

Product:

Planning Analytics 2.0.9x

Issue:

When is the Metadata tab executed?

Solution:

If the data source view contains calculated cells, you may need to add this to the Prolog:

ViewExtractSkipCalcsSet ( sCubeName, sViewName, 0 );

so that the metadata is processed with the calculated values.

The Metadata and Data tabs are only processed if the data source returns records with values in the variables.

 

From Alan Kirk at https://www.tm1forum.com/viewtopic.php?t=15670

These are the roles of the tabs in a TI process:

Prolog
This runs ONCE, before the data source (if any) is connected to, read or even looked at by the process.
You can use it to:

  • Create a data source by defining a view and its subsets;
  • Check for the existence of a file and run a batch file to rename or move it;
  • Change the data source from a view to a text file to a dimension subset or whatever you need it to be;
  • Dynamically assign a different data source (a different file name, a new view name or whatever) to the process at run time;
  • Define any constants;
  • Write information to a log file or to the server log;
  • Insert new elements to a dimension if you have ones that do not come from your data source;
  • Pretty much anything that you need to do before the process even looks at the data source, if any.
  • The fact that it runs before you connect to the data source is what allows you to change the data source on this tab as mentioned above using functions like DataSourceType and DatasourceNameForServer.

There is one proviso with this; one thing that you can’t do within a TI itself is to change its variable list or the data types. Normally you would define those by using an example file or view at the time that you write the process.

Metadata
This loops through the data source (if any) ONCE. That is, for every record in the data source, the values in each column will be read into the corresponding process variables, then the code on the metadata tab will be executed, then the process will move onto the next row of the data source and the whole process repeats.

The purpose of the tab is to create any metadata (cubes, dimensions etc) that you will need to store the data that you upload on the Data tab.

When you use functions like DimensionElementInsert, changes are made to a copy of the dimension.

After the last record has been processed on the Metadata tab, the real dimension will be overwritten by the copy. If you did any insertions on the Prolog tab, these will also be added at that point.

Typically you will be using element names from your data source’s variables to do the element insertion. If you have a hard coded element name as you have in your example code, the TI will add the element on the first pass if it needs to, and spend every other pass saying “Nope, it’s already there, Nope, it’s already there, Nope, it’s already there, etc”. This is not what we call “optimum code efficiency”. That’s why insertions like that are generally done on the Prolog tab.

There are also newer functions like DimensionElementInsertDirect which will push the elements straight into the dimension without creating a copy. Information about such functions will be found in the Reference Guide.

IMPORTANT NOTE FOR NEW PLAYERS: If you don’t have a data source, or if you have a data source which has no records, then nothing that you have written in the Metadata tab will ever be executed. Ever.

Data
This will again loop through each row in the data source one at a time, assigning the values in each column to variables, and doing whatever you tell it to. This may be loading values into a TM1 cube, or it may be writing values from a TM1 cube to a text file, or to another database. If you are loading values into a cube it’s assumed that you have created any necessary elements in the Metadata tab.

Note that any attributes of elements (alias names, etc) are regarded as being data for the purposes of this exercise and need to be written on the Data tab (unless you used the Direct functions mentioned above).

Epilog
This is run AFTER the last Data record is processed. It is usually used to clean up whatever needs cleaning up, and maybe writing results into a control cube, according to taste.

 

More Information:

https://blogs.perficient.com/2015/04/29/ibm-cognos-tm1-updating-metadata-in-ti-submit-time-explore/

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=tf-odbc-turbointegrator-functions

https://exploringtm1.com/viewextractskiprulevaluesset/

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=vmtf-viewextractskipcalcsset