Planning Analytics TM1_version=TM1-AW64-ML-RTM-11.0.913.10-0

Microsoft Windows 2019 server

How do I give a message to TM1 Web users that they will always see?


Edit the CommonMessages.js and home.jsp files in Notepad++.

The first is the name of the tab in the web browser – here you can add text, like 1984 in our example, by editing the file home.jsp in folder C:\Program Files\ibm\cognos\tm1web\webapps\tm1web. Change row 47 to the text you want:

<title>IBM Cognos TM1 Web 1984</title>

Save the file and refresh your web browser to see the change.

The second value is in a different folder for each language. The language setting in your web browser determines which text you will see.

Go to folder C:\Program Files\ibm\cognos\tm1web\webapps\tm1web\scripts\tm1web\common\nls\sv for the Swedish language.

Edit row 378 or 379 in the file CommonMessages.js to your message text:

OFFERING_NAME:"Cognos® TM1® Web (please use chrome)",

Save the file and refresh the web browser. If you do not see the text, go into the settings and change the language for the web browser.

For the English language, you may need to edit the Tm1WebAppCam.js file directly, in folder C:\Program Files\ibm\cognos\tm1web\webapps\tm1web\scripts\tm1web\standalone. Search for the word cognos to find where to edit the file.


Third, you may want the TM1 Web users not to see the cube views at all. This is done by a change in the tm1web_config.xml file in folder

C:\Program Files\ibm\cognos\tm1web\webapps\tm1web\WEB-INF\configuration. Edit line 25 to this:

<add key="NavTreeDisplayServerView" value="N" />

Save the file and restart the service “IBM Planning Analytics Spreadsheet Services”.

To change the background picture in TM1WEB you may need to edit the file C:\Program Files\ibm\cognos\tm1web\webapps\tm1web\scripts\tm1web\themes\carbon\standalone\images\login\PA_Header.svg.

These changes are not supported by IBM.

More information:

Planning Analytics
Microsoft Windows 2019 server

How do you get the month before today?


In TM1 TI code you can write something similar to this in the prolog:

## sMonth = 1 ;
## remove the sMonth = line below and test with the line above

sMonth = MONTH ( TODAY ) ;
sYear = YEAR ( TODAY ) ;

IF ( sMonth = 1 ) ;
   pMonth = numbertostring ( 12 ) ;
   pYear = sYear - 1 ;
ELSE ;
   pMonth = numbertostring ( sMonth - 1 ) ;
   pYear = sYear ;
ENDIF ;

IF ( LONG ( pMonth ) = 1 ) ;
   p2Month = '0' | pMonth ;
ELSE ;
   p2Month = pMonth ;
ENDIF ;

pVersion = (p2Month) | ' - ' | numbertostring(pYear) ;

ASCIIOUTPUT ( 'd:\temp\debugtestdate.txt', numbertostring (sMonth) , numbertostring(sYear) , (p2Month) , numbertostring(pYear) ) ;

Running the process will write the values to a test file so you can check the result.


If you want to have the year in 4 digits, you can add it like this:

IF ( sMonth = 1 ) ;
   pMonth = numbertostring ( 12 ) ;
   pYear = '20' | numbertostring ( sYear - 1 ) ;
ELSE ;
   pMonth = numbertostring ( sMonth - 1 ) ;
   pYear = '20' | numbertostring ( sYear ) ;
ENDIF ;



More Information:

Dates and Time in TM1

Microsoft SQL Azure database

How do you create a new database from a BACPAC file in SQL Azure?

The Azure portal only supports creating a single database in Azure SQL Database and only from a .bacpac file stored in Azure Blob storage.

Depending on how your firewall and network endpoints are set up in SQL Azure, you can run into different problems.

Error: “The ImportExport operation failed because of invalid storage credentials” can mean that you need to click “use private link” for the subscription.

You need to select the backup BACPAC file from your blob storage.
Select the pricing tier (edition and service objective) you want for the new SQL Azure database.
Enter a name that is easy to tell apart from the other databases.

The ImportExport operation with Request Id 'xxxxx-xxxxx' failed due to 'The server principal "donald" is not able to access the database "sqldatabasename-new" under the current security context.
Cannot open database "sqldatabasename-new" requested by the login. The login failed.

An error like the above can mean that you do not have access to the database – the user account is not an ADMIN on the SQL Azure server.

Another common issue is that the BLOB storage does not allow access from the SQL Azure database.
When doing the import above, you need to monitor both Private Endpoint Connections and SQL Azure private access, to approve the access requests that the import process will create.

You may also need to add the SQL server as a resource type. But it may still fail if you have not given the BLOB storage network access to the SQL server.


If you know you have access from your computer to the SQL Azure server, then you can upload the BACPAC from your C:\temp folder with a PowerShell script.

.\SqlPackage.exe /Action:Import /SourceFile:"C:\temp\sqldatabasfilesavedname.bacpac" /TargetConnectionString:",1433;Initial Catalog=databasename_test_restore;Authentication=Active Directory Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;User;Password=xxxxxxx" /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S4

First you need to install the DacFramework.msi from

Then start PowerShell as administrator, go to the folder C:\Program Files\Microsoft SQL Server\160\DAC\bin, and run the above PowerShell script after you change the names to match yours.

Parameters used in script:

.\SqlPackage.exe /Action:Import
= starts the process and sets the action we do – in this case import
/SourceFile:"C:\temp\sqldatabasfilesavedname.bacpac"
= the location and filename of the file to restore from
Data Source=<servername>,1433;
= the protocol, name and port of the SQL server the script should use
Initial Catalog=databasename_test_restore;
= the new name of the database in SQL Azure
Authentication=Active Directory Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
= parameters so that you log in with Active Directory
User ID=<account>;
= the name of an administrator – this account should not exist inside the database
Password=xxxxxxx;
= the password for the account
/p:DatabaseEdition=Standard
= the edition of SQL Azure database you want in the subscription
/p:DatabaseServiceObjective=S4
= the number of DTUs your new database starts with

When it does the import, it starts with this – check the server name line to ensure it is correct:

Importing to database ‘databasename_test_restore’ on server ‘,1433’.
Creating deployment plan
Initializing deployment
Verifying deployment plan
Analyzing deployment plan
Importing package schema and data into database
Updating database
Importing data
Processing Import.
Disabling indexes.


When done, it should say something like this:

Successfully imported database.
Changes to connection setting default values were incorporated in a recent release. More information is available at
Time elapsed 0:13:32.42


If you get the error below, it can be that you are on VPN and need to be at the office instead, to be allowed to connect to the Azure SQL server.

The ImportExport operation with Request Id ‘xxxxx-xxxx’ failed due to ‘An error occurred while communicating with the SQL Server using AdPassword-login: AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access ‘xxxxx-xxxxxx’

If you get the error below, the account you gave already exists in the BACPAC file. You need to remove the user from the database and do a new extract of the data to a BACPAC.

Error SQL72014: Framework Microsoft SqlClient Data Provider: Msg 15063, Level 16, State 1, Line 1 The login already has an account under a different user name.
Error SQL72045: Script execution error.  The executed script:

You may need to remove logins for the user account used in the import, where you see they exist at the server level (in the master database).

To see the server accounts, run the query below on the master database:

SELECT A.name as userName, B.name as login, B.Type_desc, default_database_name, B.* 
FROM sys.sysusers A 
    FULL OUTER JOIN sys.sql_logins B 
       ON A.sid = B.sid 
WHERE islogin = 1 and A.sid is not null

More Information: 

SqlPackage /a:import /tcs:"Data Source=<serverName>;Initial Catalog=<migratedDatabase>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P6

SqlPackage /a:Import /sf:testExport.bacpac /tdn:NewDacFX / /ua:True /tid:""



Microsoft Azure SQL database


How do you create a BACPAC – a file that contains both data and table structure (metadata) from the database – that you can import into another database later?

If you are going to import the BACPAC into SQL Azure later, the user account doing the import cannot exist inside the database. Please remove that user from inside the database first.


From inside SSMS (SQL Server Management Studio), right-click the database and select Tasks – Export Data-tier Application.

You can enter a filename and save the file to your computer. (Downloading a database from Azure can take some time.)

It is faster to save the file to Azure BLOB storage (if they are in the same location); then you select “Save to Microsoft Azure”.
Select a Storage Account, from your subscription.
Enter a name for the file, or keep the default.
Leave the temporary file as is.
Click next…

Or to save the file to disk, keep the “save to local disk” settings.

Click next and finish to start downloading the file to your local disk.

A DAC is a logical database management entity that defines all of the SQL Server objects associated with a user’s database. A BACPAC includes the database schema as well as the data stored in the database.

More Information: 

SQLPackage utility to export Azure SQL Databases


Microsoft SQL Azure database


Cannot delete a user in the database from SSMS.

The database principal owns a schema in the database, and cannot be dropped. (Microsoft SQL Server, Error: 15138)


Inside SSMS, check which schemas the user owns with this command (replace donald with the login you want to check):

select * from information_schema.schemata
where schema_owner = 'donald'

Change the schema stage to be owned by DBO instead, with this command:
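The command itself did not survive here; assuming the schema is named stage, as the text suggests, a sketch:

```sql
-- Transfer ownership of the schema "stage" (assumed name) to dbo
ALTER AUTHORIZATION ON SCHEMA::[stage] TO [dbo];
```

Repeat for every schema the query above returns for that owner.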



After the schema has been changed to a different owner, the user can be dropped in SSMS.

More Information:

To list all users in the database, run:

 SELECT DB_NAME(DB_ID()) as DatabaseName, * FROM sys.sysusers


Planning Analytics TM1_version=TM1-AW64-ML-RTM-11.0.913.10-0

Microsoft Windows 2019 server


How do you make a drill process?


Inside the sample application proven_techniques, log in as ADMIN and create a drill from cube “Drivers Alloc” to cube “employeedetails” (not the best example, as they do not contain much information).

First right-click on cube “Drivers Alloc” and select Drill – Create Drill Process.

Click Next, then select a cube view as source and select any view from the target cube.

Click Finish, and enter a name like “TOemployees”. The hidden process }Drill_TOemployees is created.
Right-click on the cube “Drivers Alloc” and select Drill – Create Drill Assignment Rule.

Enter the rule below, to make all cells drillable via the TI process TOemployees:

[]=s:'TOemployees' ;

Then save it.

Then open the TI process to edit it. The first tab is automatically populated from the cube the drill was created on.

In prolog add this code:

#****Begin: Generated Statements***
#****End: Generated Statements****

ByPass = 'YES';

# set the name of the target cube where the view is build
pCube = 'employeedetails' ;

# Specify dimension name
DimName1= 'employees' ;
DimName2 = 'budget' ;
DimName3 = 'departments' ;
DimName4 = 'employeedetails' ;

#Set Constants for Inserting into View

# Element1 = 'Total Employees' ; we should get only lowest member by MDX
Element2 = 'Budget' ;

# set the parameter we get from the drill function to make a selection in the view
Element3 = Receivers ;

# Element4 = 'Current Salary' ; we get all members by not including it in the code

# set the view name to a unique value
sView = 'Drill__' | pCube ;
sSubset = sView;

If( ViewExists( pCube, sView ) = 1 );
   ViewDestroy( pCube, sView );
EndIf;
ViewCreate( pCube, sView );

# delete the subset if it exists
If( SubsetExists( DimName1 , sSubset ) = 1 );
   SubsetDestroy( DimName1 , sSubset );
EndIf;

# add all leaf elements to the dimension
sMDX = '{TM1FILTERBYLEVEL( {TM1SUBSETALL( [ ' | DimName1 | ' ] )}, 0)}' ;
SubsetCreateByMDX( sSubset, sMDX );

ViewSubsetAssign( pCube, sView, DimName1, sSubset );

# only include one element of the dimension in the view
If( SubsetExists( DimName2 , sSubset ) = 1 );
   SubsetDeleteAllElements( DimName2 , sSubset );
Else;
   SubsetCreate( DimName2 , sSubset );
EndIf;
SubsetElementInsert( DimName2, sSubset, Element2, 1 );

ViewSubsetAssign( pCube, sView, DimName2, sSubset );

If( SubsetExists( DimName3 , sSubset ) = 1 );
   SubsetDeleteAllElements( DimName3 , sSubset );
Else;
   SubsetCreate( DimName3 , sSubset );
EndIf;
SubsetElementInsert( DimName3, sSubset, Element3, 1 );

ViewSubsetAssign( pCube, sView, DimName3, sSubset );

# as dimname4 does not have a total, 
# we leave it out and get all values in the view for that dimension.

# set what dimensions that should be in row and columns
ViewRowDimensionSet( pCube, sView, DimName1 , 1 );
ViewColumnDimensionSet( pCube, sView, DimName4 , 1 );
ViewTitleDimensionSet( pCube, sView, DimName3 );

# set standard values to exclude some cells
ViewExtractSkipCalcsSet( pCube, sView, 1 );
ViewExtractSkipRuleValuesSet( pCube, sView, 1 );
ViewExtractSkipZeroesSet( pCube, sView, 1 );

In the epilog add this code:

IF ( ByPass @= 'NO' ) ;

#****Begin: Generated Statements***
#****End: Generated Statements****

ENDIF ;




Click on the lowest cell and select Drill from the popup menu.

The view built by the TI process is shown – the only value that changes is the receivers dimension, as that parameter is used to build the view.

You can build a better solution by checking out the links below.

More Information:

Microsoft Azure Data Factory
How do you schedule the ADF main pipeline to run every hour?


Inside ADF, under Factory Resources, select your pipeline named “main”.
Click on the Trigger (1) icon to get the drop-down that shows “New/Edit”, and select that. (The number shows how many triggers you have running.)

Click the arrow at Choose trigger to get the New option to select.

Enter a name for your trigger.
Edit the start date to some time the next day – it is easiest to just type the numbers in this field. The trigger will start at the time specified, so set 1:00:00 AM to make it run at every whole hour. If you set it to 1:30:00, it will run at 1:30, then 2:30, and so on.
Select the time zone you are in – otherwise it will run at an unexpected time for you.

Set the Recurrence to 1 hour, if you want it to run every hour.
Note: make sure it is marked as Started.
Then click OK.

You need to click OK one more time.

You need to click on the Publish All button in ADF, to make the change be uploaded and activated.

To check your trigger the next day, go to the Monitor icon and Trigger runs. Here you get a list of the last 24 hours’ trigger runs and whether they were successful.

To edit an existing trigger, go to the pipeline and again select the Trigger drop-down and the New/Edit link.

Click on your trigger name to bring it up in the Edit dialog. Here you can change the name and the time it will run – select a start date and time in the future.

Ensure it is started, before you press OK.
Then you have to Publish your changes again to ADF, to make it happen.

As Azure changes the layout all the time, the dialogs may look different when you read this.

More Information: 

Azure Data Factory (ADF)

Cannot connect to SQL server from ADF in the same subscription.

Cannot connect to SQL Database: ‘‘, Database: ‘databasename‘, Reason: Connection was denied since Deny Public Network Access is set to Yes. To connect to this server,
1. If you persist public network access disabled, please use Managed Virtual Network IR and create private endpoint.;;
2. Otherwise you can enable public network access, set “Public network access” option to “Selected networks” on Auzre SQL Networking setting.


Promote yourself to Owner in the subscription and ADF.
Inside ADF you first need to ensure that the integration runtimes use a Managed Virtual Network. Create a new integration runtime setup:
Select Azure, Self-Hosted.
Select Azure.
Set the region to your needs, and click Create.

When this uses a Managed Virtual Network, you can go on and create the Linked Service.
Select Azure SQL database.
In the connect-via drop-down, select the integrationruntime2 created above, which has the managed virtual network.
Select your Azure Subscription.
Select your database server name from the drop-down.
Select the database name.
Select the authentication type “System Assigned Managed Identity”.
Click Test, and if it is OK, click Create.

Then, inside SSMS, you may add the ADF managed identity to the database with commands similar to this:

In master database:
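The master-database command was not preserved here; since the linked service uses a System Assigned Managed Identity, the step is typically something like this sketch (adf_user is a placeholder for your Data Factory name, matching the user-database commands below):

```sql
-- Create a server login for the ADF managed identity (name is a placeholder)
CREATE LOGIN [adf_user] FROM EXTERNAL PROVIDER;
```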



In user database:

CREATE USER [adf_user] FROM LOGIN [adf_user]

ALTER ROLE [db_owner] ADD MEMBER [adf_user]



More Information: 


Planning Analytics Workspace 88
Microsoft Windows 2019 server


How do you set up the SMTP mail function?


First, check that port 25 is open from the TM1 server to the SMTP EXCHANGE server.

Check that the SMTP EXCHANGE server accepts mail from the TM1 server – talk to the e-mail people at your company.

Update the bootstrap.properties file located in folder D:\Program Files\ibm\cognos\tm1_64\paa_agent\wlp\usr\servers\kate-agent


Set these properties for unencrypted mail internally at the company:








After the changes you need to restart the Windows service: IBM Planning Analytics Administration Agent.

Then go to PAW and the administration agent tab and click the Test email button – enter your mail address at the company.

Then you can set up Alerts, so you get an email when the disk is full.

To give the share book function an email option, you need to update the paw.ps1 file on the PAW server with information like this:
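The exact lines are not shown here; based on the environment variables listed later in this post, the additions to paw.ps1 look something like this sketch (user name, password and server are placeholders):

```powershell
# Enable the email option for shared books (all values are placeholders)
$env:ENABLE_EMAIL="true"
$env:EMAIL_SMTP_URL="smtp://<MyUserName>:<MyPassword>@<your-smtp-server>:25"
$env:EMAIL_FROM="<an address the account above may send from>"
```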



You need to stop and start PAW for the changes in paw.ps1 to take effect. Run .\paw.ps1 stop, then .\paw.ps1, in the scripts folder.

If you get errors, check the log file (app.log) in folder D:\PAW88\log\share-app on the PAW server.

The error [ERR_TLS_CERT_ALTNAME_INVALID]: Hostname/IP does not match certificate’s altnames can mean that the EMAIL_SMTP_URL server name does not match the DNS altnames. Change to another server name.

From Declan, TM1 champion:

Setup on the PAW server

The obvious starting points are to gather details of your SMTP server and the port it uses (for sending email) – it can often have multiple ports available, in which case find out the “preferred” one. Generally speaking:

  • 587 – for TLS
  • 465 – For SSL
  • 25 – This might work (it does for Office 365 at least) but the general consensus is not to use it, as it’s often blocked due to SPAM.

Also, gather the relevant account’s email and password.

Relevant Material

Based on googling the examples provided in the documentation:

I believe that node.js is being used to send the emails, and it has documentation covering most of the parameters used in the examples (e.g. ?Secure=true).

This doesn’t cover the tls.rejectUnauthorized=false that is used in the example, but looking that up suggests that in the event that your server can’t be verified it will go ahead and send the email anyway… so that sounds like a parameter that I would be cautious about using!

What Worked


  • $env:EMAIL_SMTP_URL="smtp://<MyUserName>:<MyPassword>"
  • $env:EMAIL_SMTP_URL="smtps://<MyUserName>:<MyPassword>"

In all cases:

  • $env:ENABLE_EMAIL="true"
  • $env:SENDGRID_API_KEY=""
  • $env:EMAIL_FROM="<An email address that the credentials used in $env:EMAIL_SMTP_URL have access to send from>"

Planning Analytics
Microsoft Windows 2019 server

How do you make a TM1 TI process that reads values from a file and adds them as aliases to a dimension?


Create a text file with the element names of the dimension in the first column, and the text you want to have in the alias in the second column.

Above we have two aliases that we want to add to each element.

Create a new TM1 TI process and load the file similar to this (you have to upload the file first, and then in the TI code change the location from where the file is read).

In the data source tab, enter the variable names and change them to String type.

Enter the parameters you need, to make the process work with any dimension and alias name you may need in the future. When you run the process, you change the filename, path, etc. to the values that are correct for you.

Enter a script similar to the above. You can add code to check if the alias already exists, so you do not delete the attribute and instead only update the elements listed in the text file. Run the process and enter your filename.
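As a sketch of such a script – the variable names vElement and vAlias and the parameters pDim and pAlias are assumptions, matching a file with the element in the first column and the alias text in the second:

```
# Prolog: create the alias attribute if it does not already exist
IF ( DIMIX ( '}ElementAttributes_' | pDim , pAlias ) = 0 ) ;
   AttrInsert ( pDim , '' , pAlias , 'A' ) ;
ENDIF ;

# Data tab: write the alias value for each element read from the file
AttrPutS ( vAlias , pDim , vElement , pAlias ) ;
```

Because the attribute is only created when missing, existing alias values for elements not in the file are left untouched.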

The alias attributes are listed in the dimension.

You can view an icon indicating the attribute type in column headers. Click Settings icon and enable the Attribute type option to see the attribute type on column headers. Disable this option to hide the attribute type on column headers.

More Information: 

Accomplish Attribute Update in Cognos TM1 Smartly! 

AttrPutS in TM1: How to Use and Syntax