Product:

Cognos Analytics 12.1.1
Microsoft Windows 2019 server

Issue:

During an over-the-top upgrade of CA, you get an error that folders are in use: c:\program files\ibm\cognos\analytics\cgi-bin

Solution:

Stop the IBM Cognos service, and also stop the World Wide Web Publishing Service.

Then try to run the installation again.

 

More information:

Before the upgrade, back up the content store database and export the content to a zip archive from Cognos.
Export the Cognos Configuration to an xml file.

During installation, the [Cognos Install]\deployment and the [Cognos Install]\data\search folders will be moved twice, which can cause the installer to appear as though it has stopped working. As these folders are often very large, it is advisable to move them to a location outside the Cognos Analytics installation folder prior to applying the update, then move them back after the update has completed, prior to restarting Cognos Analytics.
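If you want to script the temporary move of these folders, a minimal Python sketch could look like this (the folder names match the ones above; the install and backup paths are examples only – always verify them against your own installation layout):

```python
import shutil
import tempfile
from pathlib import Path

def move_aside(src_root: Path, dest_root: Path,
               folders=("deployment", "data/search")):
    """Move the large folders out of the source root; call again with
    the roots swapped to move them back after the upgrade."""
    for rel in folders:
        src = src_root / rel
        if src.exists():
            dest = dest_root / rel
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(src), str(dest))

# Demo with temporary directories (a real run would use the Cognos paths):
root = Path(tempfile.mkdtemp())
install, backup = root / "analytics", root / "backup"
(install / "deployment").mkdir(parents=True)
(install / "data" / "search").mkdir(parents=True)
move_aside(install, backup)
```

Run it once before the update, then call it again with the roots swapped (move_aside(backup, install)) after the update, before restarting Cognos Analytics.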

 

Upgrade steps on Windows:

1. Stop the Cognos Analytics Service and close down Cognos Configuration. If you have installed an optional Gateway, please stop the Apache or IIS webserver services.

2. Launch the downloaded installation file and follow the wizard.

3. Choose your Language and click Next.

4. Choose what you want to install – for an upgrade this will be IBM Cognos Analytics. Click Next.

5. Choose to Accept the license and click Next.

6. Choose the location. This must be the location of the Cognos Analytics instance you want to upgrade; also confirm the shortcut folder name. Click Next.

7. Click Yes to confirm that you are installing in the same location and overwriting a previous installation.

8. Click Install at the summary screen.

9. When complete, click Done to finish the upgrade.

10. Open Cognos Configuration – you will be prompted that older versions of configuration files were found and that they have been upgraded to the latest version. Click OK and save your configuration.

11. Repeat the steps for all instances in your distributed environment before starting the Cognos Analytics services and the webserver.

Cognos 11.2.4 End of Support: Why You Must Upgrade to Cognos 12 Before April 2026

https://www.ibm.com/docs/en/cognos-analytics/12.0.x?topic=analytics-upgrading-cognos

Product:

Planning Analytics 2.1.14
Microsoft Windows 2022 server

Issue:

How do you replace \ with / in a string in a TM1 TI process?

Solution:

Enter the code below in your Prolog:

vTemp = '';
vChar = '';
k = 1;
WHILE( k <= LONG(pFilePath) );
  vChar = SUBST(pFilePath, k, 1);
  IF( vChar @= '\' );
    vChar = '/';
  ENDIF;
  vTemp = vTemp | vChar;
  k = k + 1;
END;
NewFilePath = vTemp;

The converted content of pFilePath is stored in the NewFilePath variable; pFilePath itself is unchanged.
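For reference, the same character-by-character loop can be sketched in Python (a sketch for understanding only – the TI code above is what runs in TM1; note that TI's SUBST is 1-based while Python indexing is 0-based):

```python
def to_forward_slashes(p_file_path: str) -> str:
    """Mimic the TI Prolog loop: walk the string one character at a
    time and replace every backslash with a forward slash."""
    v_temp = ""
    k = 0                                 # Python is 0-based
    while k < len(p_file_path):           # WHILE( k <= LONG(pFilePath) )
        v_char = p_file_path[k]           # vChar = SUBST(pFilePath, k, 1)
        if v_char == "\\":                # IF( vChar @= '\' )
            v_char = "/"
        v_temp = v_temp + v_char          # vTemp = vTemp | vChar
        k = k + 1
    return v_temp                         # NewFilePath = vTemp

print(to_forward_slashes(r"D:\Tm1 Data\budget\file.csv"))
```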

More information:

WHILE Function: How to Use, Syntax, and Examples

https://cubewise.com/functions-library/tm1-function-for-ti-while/ 

Product:
Planning Analytics 2.1.14
Microsoft Windows 2022 Server

Issue:

How do you call RUSHTI.EXE from a TI process when using SSO with IntegratedSecurityMode=5 (CAM)?

https://github.com/cubewise-code/rushti

Solution:

Run the TM1 admin service (IBM TM1 Admin Server x64) and the TM1 application service (IBM TM1 Server x64 – budget) with a Windows domain account that has rights inside your TM1 application (for example ADMIN). ExecuteCommand runs as the user who runs the TM1 service.
Go to Windows services and, for your TM1 service, change the log on from Local System to a domain account (a service account whose password does not change).

To call RUSHTI from a TI process enter something like this:

sCMD = 'd:\rushti\rushti run --tm1-instance "Budget" --workflow "load data fast" --max-workers 8 --mode opt' ;
ExecuteCommand(sCMD, 1);

 

https://github.com/cubewise-code/rushti/releases

To install RUSHTI.EXE, download the zip file and unpack it to a folder on your server (e.g. d:\rushti).

Installation

  1. Download rushti-windows.zip  (to start with, take this file https://github.com/cubewise-code/rushti/releases/download/v2.0.0/rushti-windows.zip )
  2. Extract to your desired location (keeps directory structure intact)
  3. Copy config/config.ini.template to config.ini and configure your TM1 connections
  4. (Optional) Copy config/settings.ini.template to settings.ini for custom defaults
  5. Run rushti.exe from command line

Note: This build uses PyInstaller’s onedir mode for fast cold starts.
Keep all files in the extracted directory together – the exe requires the bundled libraries.

The files in the _internal folder are required and must not be removed or replaced.

The config.ini file contains the information needed to connect to your TM1 instance. Change the host names, ports, etc. for your environment:

# The section name (e.g., [Budget]) is used as the instance identifier
# in task files and command-line arguments.
# For SSO set username to blank etc
# Copy this file to config.ini and update with your actual TM1 server details.
# the name of the tm1 application you should work with
[Budget]
# TM1 server hostname or IP address
address=tm1servername.domain.com
# REST API port (check your tm1s.cfg HTTPPortNumber setting)
port=12387
namespace=AD
gateway=http://caservername.domain.com:80/ibmcognos/bi/v1/disp
user=""
password=""
# Set to True if password is base64 encoded
decode_b64=False
# Set to True if TM1 server uses HTTPS (check UseSSL in tm1s.cfg)
ssl=True
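Since the section name doubles as the instance identifier, you can sanity-check a config.ini with Python's standard configparser before pointing RushTI at it (a sketch only; the sample values below are placeholders):

```python
import configparser

# Hypothetical snippet of a RushTI config.ini (section name = instance id):
sample = """
[Budget]
address=tm1servername.domain.com
port=12387
ssl=True
user=
password=
"""

cfg = configparser.ConfigParser()
cfg.read_string(sample)
print(cfg.sections())                     # the instance identifiers
print(cfg.getboolean("Budget", "ssl"))    # ssl=True parses as a boolean
```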

 

Settings.ini is the file for system defaults, such as activating the collection of run data to a database. It can look like this:

# RushTI Settings Configuration
# ==============================
#
# This file configures RushTI behavior settings, separate from TM1 connection
# settings in config.ini.
#
# Settings Precedence (highest to lowest):
# 1. CLI arguments (e.g., --max-workers 8)
# 2. JSON task file settings section
# 3. This settings.ini file
# 4. Built-in defaults
#
# To use this file:
# 1. Copy to settings.ini in the directory called config
# 2. Uncomment and modify the settings you need
# 3. Delete or leave commented any settings you want to use defaults for
# ------------------------------------------------------------------------------
# [defaults] - Common execution settings
# ------------------------------------------------------------------------------
[defaults]
# Maximum number of parallel workers
# Valid range: 1-100 (a good starting point is the number of CPU cores on the server)
# Default: 4
max_workers = 8

# Number of retries for failed process executions
# Valid range: 0-10
# Default: 0
# retries = 0

# Output file path for execution results summary CSV
# Set to a file path to create a summary CSV (e.g. rushti.csv)
# Leave empty or omit to skip creating the summary CSV
# Default: (empty - no CSV created)
result_file = rushti.csv

# Execution mode: 'norm' (normal) or 'opt' (optimized with dependencies)
# Default: norm
mode = opt

# ------------------------------------------------------------------------------
# [optimization] - Automatic task ordering optimization
# ------------------------------------------------------------------------------
[optimization]

# Enable automatic task optimization during execution.
# When enabled, ready tasks are sorted by estimated runtime (longest first)
# to maximize parallel efficiency. Dependencies are always preserved.
#
# Requires: [stats] enabled = true (needs historical data for estimates)
# Override: Use --no-optimize flag to disable for a specific run
#
# Default: false
# enabled = false

# ------------------------------------------------------------------------------
# [logging] - Enhanced logging settings / you must adjust this file to make it work
# ------------------------------------------------------------------------------
[logging]
# level = INFO
# file = /log/rushti.log

# ------------------------------------------------------------------------------
# [tm1_integration] - TM1 integration for reading taskfiles and logging results
# ------------------------------------------------------------------------------
[tm1_integration]

# Push execution results to TM1
# When enabled, the results CSV is uploaded to TM1 files after each run
# as: rushti_{workflow}_{run_id}.csv
#
# To set up TM1 integration:
# 1. Run: python rushti.py build --tm1-instance tm1srv01
# This creates the required dimensions and cube automatically.
# 2. Set push_results = true and configure default_tm1_instance below
#
# Default: false
push_results = true

# Automatically load results into TM1 cube after push
# When enabled (and push_results = true), calls }rushti.load.results TI process
# on the target TM1 instance after uploading the results CSV.
# The TI process must exist on the target instance.
#
# Default: false
# auto_load_results = false

# Default TM1 instance for reading taskfiles and writing results
# Used when --tm1-instance is specified without an instance, and for auto-upload
# Must be defined in config.ini; use the section name from the config file.
default_tm1_instance = Budget

# Default cube name for task definitions and results
# The build command creates this cube with dimensions:
# - rushti_workflow: Workflow identifiers
# - rushti_task_id: Task sequence (1-5000 default elements)
# - rushti_run_id: "Input" for definitions, timestamps for results
# - rushti_measure: Task field measures
#
# Default: rushti
default_rushti_cube = rushti

# ------------------------------------------------------------------------------
# [stats] - SQLite stats database for execution history
# ------------------------------------------------------------------------------
[stats]

# Enable the stats database for storing execution history
# The stats database stores execution statistics for:
# - Optimization features (EWMA runtime estimation)
# - TM1 cube logging data source
# - Historical analysis via 'rushti db' commands
# Default: false
enabled = true

# Path to the SQLite database file
# Relative paths are resolved from the application directory
# Default: data/rushti_stats.db
db_path = data/rushti_stats.db

# Number of days to retain execution history
# Valid range: 1-365
# Default: 90
retention_days = 90
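The precedence order documented in the comments at the top of settings.ini (CLI arguments > JSON task file > settings.ini > built-in defaults) can be illustrated with a small Python sketch (resolve_setting is a hypothetical name for illustration, not RushTI code):

```python
def resolve_setting(name, cli_args=None, task_file=None, settings_ini=None):
    """Illustrate the documented RushTI precedence:
    CLI args > JSON task file settings > settings.ini > built-in defaults."""
    builtin_defaults = {"max_workers": 4, "retries": 0, "mode": "norm"}
    for source in (cli_args or {}, task_file or {},
                   settings_ini or {}, builtin_defaults):
        if name in source:
            return source[name]
    return None

# settings.ini sets max_workers = 8; with no CLI flag that value wins:
print(resolve_setting("max_workers", settings_ini={"max_workers": 8}))  # -> 8
```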


To create the RUSHTI cube, you run the command:

rushti build --tm1-instance "Budget"

If you are not using a cube for the tasks, you can use a TXT file. See the link:
https://code.cubewise.com/open-source/tm1py/rushti/

To create an HTML file of the run information, enter a command like the one below at the command prompt:

rushti tasks visualize  --tm1-instance "Budget"  --workflow "load data fast"  --output dag.html

Open the dag.html file in your web browser (with JavaScript enabled).

In the RUSHTI cube, you can use the predecessors column to specify which process must finish before this process starts. For example, if process 3 lists process 2 as a predecessor, process 3 starts only when process 2 has finished.

In the process field, enter the name of the TI process to run.

In the parameters field, when you have more than one parameter, they can be formatted like this:

{ "pVersion": "Prognos 1", "pYear": "2025" , "pMonth": "03" }

There is a colon (:) between the prompt and the value, and a comma (,) between the parameters. Alternatively, the simpler format shown below also works (note the space before each comma):

pVersion=Budget , pYear=2026 , pMonth=01

If you have a path in your parameter, you need to enter / (forward slash) instead of \, so it should look like this:

{ "pNewCube": "thecubename", "pFilePath": "D:/Tm1 Data/budget/Export/to_other/read/Budget_2026_01.csv" }
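The path rule above can be sketched as a small hypothetical Python helper (rushti_params is not part of RushTI; it only illustrates building the JSON parameter string with the forward slashes the task definitions expect):

```python
import json

def rushti_params(params: dict) -> str:
    """Hypothetical helper: serialize TI parameters as the JSON string
    used in the RushTI parameters field, converting Windows
    backslashes in string values to forward slashes."""
    cleaned = {k: (v.replace("\\", "/") if isinstance(v, str) else v)
               for k, v in params.items()}
    return json.dumps(cleaned)

print(rushti_params({"pNewCube": "thecubename",
                     "pFilePath": "D:\\Tm1 Data\\budget\\Export\\Budget_2026_01.csv"}))
```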

You need to create an element in the Rushti_workflow dimension for your task settings; this is what the rushti command references so it knows which task list in the cube to execute, e.g. --workflow "load data fast".

 

 

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=pctf-executecommand

https://code.cubewise.com/blog/introducing-the-tm1py-universe/

Full documentation is available at cubewise-code.github.io/rushti/docs

https://code.cubewise.com/downloads/

https://code.cubewise.com/blog/run-processes-in-parallel-using-only-connection/

Product:
Microsoft Windows 2022 server

Issue:

How do you list the users who have access to a folder from a script?

Solution:

Start a PowerShell window and enter something like this:

$path = "C:\Your\Folder\Path"
(Get-Acl $path).Access |
    Select-Object IdentityReference, FileSystemRights, AccessControlType, IsInherited |
    Format-Table -AutoSize

More information:

 

Product:
Mirantis Docker
Microsoft Windows 2022 server

Issue:

How do you update the .lic file on a Docker installation on a Windows server?

Solution:

Buy a new license (Mirantis Container Runtime with LabCare Support) from Mirantis, one per PAW server. You will get a mail with a link that downloads the MCR license file directly to your computer.

Then rename the file from something like MCR 3xNode LabCare.lic to docker.lic.

Copy the docker.lic file to your docker folder – normally that is D:\docker on your Planning Analytics Workspace (PAW) server.

 

To check that you have a valid license, enter the command below in CMD:

docker info --format '{{.ProductLicense}}'

You should get a text like this ‘Valid until 2026-04-09 for sf0643587264358638568230756MAC’ if all is fine.
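If you want to script this check, the 'Valid until YYYY-MM-DD ...' text can be parsed and compared against today's date. The helper below is a hypothetical sketch, not a Mirantis tool:

```python
import re
from datetime import date

def license_expired(product_license: str, today: date) -> bool:
    """Parse the 'Valid until YYYY-MM-DD ...' text that
    docker info --format '{{.ProductLicense}}' prints,
    and compare the date against today. (Hypothetical helper.)"""
    m = re.search(r"Valid until (\d{4}-\d{2}-\d{2})", product_license)
    if not m:
        return True  # nothing parsable -> treat the license as invalid
    return date.fromisoformat(m.group(1)) < today
```

You could feed it the output of the docker info command from a scheduled job and alert before the expiry date.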

 

More Information:

https://docs.mirantis.com/mcr/25.0/single/index.html#install-the-license

You can update the license in your cluster at any time during the week before your license expires.

As per Mirantis' terms of service – "Users are not authorized to run MCR without a valid license." – but even if you do not replace the license after the 10th, the docker service will not be forcefully stopped and there won't be any interruptions to your applications and workloads.

You download Mirantis license (.lic) files from the Mirantis CloudCare Portal (or from a purchase confirmation email, depending on the product):

For MKE / MSR / MKE 4k:

  1. Find the email from Mirantis Support with a subject similar to "Welcome to Mirantis' CloudCare Portal" and log in as instructed.
  2. In the top navigation bar, click Environments.
  3. Click the Cloud Name associated with the license you need.
  4. Scroll down to License Information and click the License File URL.
  5. In the new tab, click View file to download the .lic file.

For MCR (Mirantis Container Runtime):

  • If you bought via the Mirantis Store, the license file is available from the link in your purchase confirmation email. Otherwise, contact Mirantis Sales/Support to get the file.

For Mirantis Container Runtime (MCR) / Mirantis Docker Engine, the license file must be placed in the daemon data directory and named docker.lic.

On Windows, the default daemon data directory is %ProgramData%\docker, so the file goes there as %ProgramData%\docker\docker.lic (or into your custom Docker data root, such as D:\docker on a PAW server).


By default, MCR automatically records and transmits data to Mirantis for monitoring and analysis purposes. The data collected provides the Mirantis Customer Success Organization with information that helps us to better understand the operational use of MCR by our customers. It also provides key feedback in the form of product usage statistics, which enable our product teams to enhance Mirantis products and services.

When antivirus and antimalware software products scan files in use by MCR, these files can lock in a way that causes Docker commands to hang or causes orphaned snapshots to leak disk space. To circumvent these problems, you can add the Docker data directory to the software’s exclusion list, which is by default /var/lib/docker on Linux systems and %ProgramData%\docker on Windows Server systems. As a result of this action, though, viruses or malware in local Docker images, writable layers of containers, or volumes will go undetected.

Product:
FileBeat 8.19.3

Microsoft Windows 2022 server

Issue:

Have installed the Filebeat service in Windows to collect log files to Elastic, but it does not read any files. And when I try to stop the service, it hangs.

After you stop the filebeat service with Task Manager, you need to erase the lock file in the folder C:\ProgramData\filebeat\ to make it read the yml file at the next start of the service.

Solution:

Check the filebeat.yml file. The YAML format is sensitive to spaces and indentation.

In this case there was a row:
ignore_older: '7d'
that made the filebeat service stop.

It only supports minutes and hours, so you need to enter it like this:
ignore_older: '168h'

The ignore_older: '168h' setting checks the timestamp of the file and makes Filebeat skip files that were last modified more than 7 days ago.
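A quick sketch of the day-to-hour conversion (a hypothetical helper, not part of Filebeat):

```python
def days_to_filebeat_duration(days: int) -> str:
    """ignore_older only accepts units such as m (minutes) and h (hours),
    so express days as hours: 7 days -> '168h'."""
    return f"{days * 24}h"

print(days_to_filebeat_duration(7))  # -> 168h
```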

The filebeat.yml file is in folder C:\Program Files\Filebeat on windows.

Below is an example of a filebeat.yml file for use with TM1 log files. Note that YAML is indentation-sensitive: every nested line must be indented with spaces for it to work.

# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: filestream
  id: tm1server
  enabled: true
  paths:
    - D:/TM1 folder/Logs/tm1server.log
  fields_under_root: true
  fields:
    event:
      dataset: audit.plain

- type: filestream
  id: tm1s2
  enabled: true
  ignore_older: '168h'
  paths:
    - D:/TM1 folder/Logs/tm1s2*.log
  exclude_lines:
    - '^#'
  include_lines:
    - 'AD'
  fields_under_root: true
  fields:
    event:
      dataset: audit.plain

# ---------------------- beats state ----------------------
- type: filestream
  id: beats-logs
  enabled: true
  paths:
    - C:/ProgramData/filebeat/logs/filebeat*.ndjson
  include_lines:
    - 'Non-zero metrics in the last 30s'
  fields_under_root: true
  fields:
    event:
      dataset: beats.state
  processors:
    - dissect:
        tokenizer: '%{}"@timestamp":"%{event.start}"'
        field: message
        target_prefix: ""
        ignore_failure: true

setup.template.settings:
  index.number_of_shards: 1

fields:
  system:
    env: prod
    id: SystemTM1
fields_under_root: true
max_procs: 1

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~

output.logstash:
  hosts: ["elasticservername.domain.com:9999"]
  ssl.enabled: true
  ttl: 5m
  pipelining: 0
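The dissect tokenizer in the beats-logs input ('%{}"@timestamp":"%{event.start}"') skips everything up to the literal "@timestamp":" and captures the value until the closing quote. A rough Python equivalent of that extraction, for understanding only (this is not Filebeat code):

```python
import re

def extract_event_start(message: str):
    """Rough equivalent of the dissect tokenizer: find the
    '"@timestamp":"' literal and capture up to the next quote."""
    m = re.search(r'"@timestamp":"([^"]*)"', message)
    return m.group(1) if m else None

print(extract_event_start('{"x":1,"@timestamp":"2025-01-01T00:00:00Z"}'))
```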



More Information:

Filebeat sends log files to Logstash or directly to Elasticsearch.

## Getting Started

To get started with Filebeat, you need to set up Elasticsearch on
your localhost first. After that, start Filebeat with:

./filebeat -c filebeat.yml -e

This will start Filebeat and send the data to your Elasticsearch
instance. To load the dashboards for Filebeat into Kibana, run:

./filebeat setup -e

For further steps, visit the guide: https://www.elastic.co/guide/en/beats/filebeat/8.19/filebeat-installation-configuration.html

## Documentation

Visit https://www.elastic.co/guide/en/beats/filebeat/8.19/index.html for the full Filebeat documentation.

## Release notes

https://www.elastic.co/guide/en/beats/libbeat/8.19/release-notes-8.19.3.html

https://www.elastic.co/beats/filebeat

https://www.elastic.co/downloads/beats/filebeat

https://github.com/elastic/beats

Product:

Microsoft Power Bi dataflow

Issue:

You get an error on all dataflows: Error: An authentication problem is preventing the dataflow from being refreshed. Please have the owner of the dataflow sign out of Power BI and sign back in and try again.

or an error message like {"code":"DMTS_OAuthTokenRefreshFailedError","details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"AADSTS135010: UserPrincipal doesn't have the key ID configured} from the dataset that uses the dataflow.

This can happen if you change your Windows PIN code login from Microsoft and use the same account to run the dataflow.

Solution:

Go to the dataflow in your workspace in https://app.powerbi.com/

Go to settings

Click on data source credentials

Click on edit credentials

If the value is correct, click Sign in without making a change.

Confirm the login prompt you will get from your company.

Go to the workspace.

Try to refresh the dataflow again.

 

More information:

This error indicates that the cached credentials for the dataflow owner have expired, changed, or are no longer authorized to access the data source, causing a Power BI refresh failure. To resolve it, the owner must sign out of the Power BI service, clear browser cache, sign back in, and re-enter credentials in the dataflow settings.
Actionable Steps to Resolve:
  • Re-authenticate Dataflow Credentials: The dataflow owner should go to the Power BI Service, locate the dataflow, go to Settings > Data source credentials, and click “Edit credentials” to re-authenticate.
  • Sign Out/In: Sign out of app.powerbi.com completely, clear your browser cache/cookies, and sign back in to refresh token authentication.
  • Check Data Source: If using a Gateway, ensure the credentials in the gateway settings are valid.
  • Verify Permissions: Ensure the user who created the dataflow still has permissions to the underlying data sources (SQL, SharePoint, etc.).

 

https://www.fourmoo.com/2019/02/13/why-is-my-powerbi-dataflow-refresh-failing/ 

Product:
Cognos Analytics
Microsoft Windows 2019 server

Issue:

Does the cert expire in 2026 on old Cognos installations?

Solution:

Upgrade to a later version of Cognos. The new versions of Cognos Analytics and Planning Analytics have updated cert files from IBM.

“IBM will provide updates SSL certificates as part of the next Planning Analytics Local 2.0.9 release. The best course of action is to configure your own SSL certificates for use with any version of TM1. You really do not need to wait until IBM provides updates certificates.”

If you have older versions of the TM1 client software and CA11 services, you may need to replace the certificates.

 

You can check the date of the cert in Cognos Analytics (CA11) with the IKEYMAN.EXE program: open the camkeystore and check the date on your certificates.
Note that some certificate end dates are updated when you do a save from inside the Cognos Configuration program.

Open Cognos Configuration (as administrator), right-click the root element –> "Test", then "Save" (re-save) and restart the service. Besides updating the cryptographic keys, this also lets you inspect the certificate details during the process.

To check the TM1 cert, copy the C:\Program Files\ibm\cognos\tm1_64\bin64\ssl\ibmtm1.arm file to a c:\temp folder.
Rename the file to ibmtm1.crt. Right-click on the ibmtm1.crt file and select Open.

There you see the certificate end date for your version of TM1.

This document describes the process to renew and import updated certificates into Cognos Analytics, where Third-Party certificate authority has been enabled.

Objective

The purpose of this document is to provide a straightforward process for obtaining and importing updated certificates into Cognos Analytics in order to minimize downtime and reduce the need for a support case.
Depending on the certificate authority used by your organization, certificate renewal can typically be managed in one of two ways:
  1. The certificate authority allows you to resubmit your original Certificate Signing Request (CSR) to obtain a renewed certificate.
  2. The certificate authority requires that you present a new Certificate Signing Request (CSR) to obtain a renewed certificate.
Both of these methods can be achieved without repeating the original configuration steps to enable Third-Party certificate authority, and, with appropriate planning and backups in place, can be managed within a planned maintenance schedule.

Environment

Cognos Analytics 11.1.x and 11.2.x
All Operating System platforms
Third-Party certificate authority enabled

Steps

Part 1 – Back up the Cognos Analytics configuration.
  1. Stop Cognos Analytics
  2. Open Cognos Configuration
  3. Export the Configuration to plain text by clicking File –> Export As (recommended file name: decrypt.xml)
  4. Close Cognos Configuration
  5. Take a backup of the ‘configuration’ folder (recommended name: configuration-existingcerts)
Part 2 – Generate a Certificate Signing Request (CSR) if required.
  1. Open iKeyman (located in <COGNOS_HOME>\ibm-jre\jre\bin)
  2. Click Key Database File –> Open
  3. Navigate to <COGNOS_HOME>\configuration\certs
  4. Choose the CAMKeystore file and click Open
  5. Set the Key Database Type to PKCS12S2 and click OK
  6. Enter the keystore password (default is: NoPassWordSet) and click OK
  7. Click the current ‘encryption’ certificate, and click “Re-create Request”
  8. Provide a filename, and click OK
  9. Exit iKeyman
  10. Submit the generated CSR to your certificate authority for signing, and obtain your new certificate.
NOTE: If you need to have Cognos Analytics running while the certificate request is being processed, you can achieve this by renaming the ‘configuration’ folder to ‘configuration-csk’ and copying the ‘configuration-existingcerts’ folder back to ‘configuration’
Part 3 – Receiving the updated certificate
  1. Stop Cognos Analytics
  2. Ensure that the correct ‘configuration’ folder is in-place (if you used Step 2 of this process, ensure that the ‘configuration-csk’ folder is renamed to ‘configuration’
  3. Open iKeyman
  4. Open the <COGNOS_INSTALL>\configuration\certs\CAMKeystore file
  5. Click ‘Receive’
  6. Locate the newly received certificate, select, and click OK. You should receive a “Validation Successful” message
  7. Close iKeyman
  8. Open Cognos Configuration
  9. Save the configuration
  10. Start Cognos Analytics

 

 

More Information:

https://community.ibm.com/community/user/question/use-our-own-certificats 

For Planning Analytics, these are the files used for certificates:

\bin64\ssl\ibmtm1.arm is the default certificate and it does not expire until 2035. ibmtm1.arm has been in use for a few years now, and even in 2020, its expiration was 2035. The “applixca” files in that folder are just there for some historical/nostalgic reason in my opinion, but I’m always focused on the latest releases, so there is that.

Inside of the \bin64\ssl\ibmtm1.kdb keystore, ibmtm1.arm has already been imported as “ibmtm1_server” and has been set as the default personal certificate. The “keystore” is what the components of PA will use to access the stored certificates and ibmtm1.sth is an encrypted password that PA uses to have access into the ibmtm1.kdb keystore while it is running.

New file \bin64\config.tm1admsrv.json  replaces the Cognos Configuration node that accepted TM1 Admin server settings. It defaults to using the \bin64\ssl\ibmtm1.kdb keystore and referring to the server certificate label ibmtm1_server that represents the imported ibmtm1.arm certificate IBM provided. A fresh install should be all set using ibmtm1.arm inside of ibmtm1.kdb.

You may have a \bin\tm1api.config or \bin64\tm1api.config where Architect and Perspectives have been installed on the users machines. This text file just refers to where Architect/Perspectives can find the keystore if it has been moved from \bin64\ssl\ibmtm1.kdb, if it is at a networked shared location, or it has been renamed. You likely don’t have this tm1api.config file if you never used custom certificates, but I mentioned it just in case. See link but ignore the stale top half. Most of the old Architect and Perspectives SSL Options are deprecated, so the top half is stale.

If you want to see the inside of the \bin64\ssl\ibmtm1.arm cert yourself, make a copy of it, and rename the copy to ibmtm1.crt. Then right-click on it and select “Open”…not “Install”. Microsoft will show you the date, etc. 2035.

Custom certificates for PA are not impossible, but not simple to implement.

 

NEW for 2.0.9.21/2.1.8/2.1.9 default \bin64\config.tm1admsrv.json file:

{
    "tm1AdminNonSSLPortNumber": 5495,
    "tm1AdminSSLPortNumber": 5498,
    "tm1AdminHTTPPortNumber": 5895,
    "tm1AdminHTTPSPortNumber": 5898,
    "tm1AdminSupportNonSSLClients": false,
    "tm1AdminKeyFile": "./ssl/ibmtm1.kdb",
    "tm1AdminKeyStashFile": "./ssl/ibmtm1.sth",
    "tm1AdminKeyLabel": "ibmtm1_server",
    "tm1AdminTLSCipherList": [],
    "tm1AdminFIPSOperationMode": 2,
    "tm1AdminSupportPreTLSv12Clients": false,
    "tm1AdminNIST_SP800_131A_MODE": false,
    "tm1AdminIPVersion": "IPv4",
    "tm1AdminActivityInterval": 10,
    "tm1AdminInactivityTimeout": 10,
    "tm1AdminRESTAPIToken": ""
}

From the online documentation:

applixca.der
The original default certificate in DER format used for Java™ certificate stores.
applixca.pem
The original root authority certificate.
ibmtm1.arm
The default certificate file.
ibmtm1.crl
The certificate revocation list.
ibmtm1.kdb
The key database file, which contains the server certificate and trusted certificate authorities.
ibmtm1.rdb
The requested key pair and the certificate request data.
ibmtm1.sth
The key store, which contains passwords to the key database file.
tm1ca_v2.der
The updated default certificate.
tm1ca_v2.pem
The updated default root authority certificate.
tm1store
The Java certificate store containing the public root authority certificate.

https://www.ibm.com/support/pages/how-renew-third-party-ca-certificates-cognos-analytics 

https://community.ibm.com/community/user/discussion/what-happens-when-configurationscerts-expire 

https://community.ibm.com/community/user/groups/community-home/digestviewer/viewthread?GroupId=3067&MessageKey=5a98b967-835b-482b-998c-20bba9d68119&CommunityKey=8fde0600-e22b-4178-acf5-bf4eda43146b&tab=digestviewer&hlmlt=QT 

https://www.tm1forum.com/viewtopic.php?t=16287

Product:
Planning Analytics Workspace 2.1.14
Microsoft Windows 2022 server

Issue:

How do I easily copy one book in PAW to another TM1 instance?

Solution:

Copy the JSON code between the TM1 applications.

Go to the book and press CTRL+Q+' (on a Swedish keyboard, that is the * key left of the Enter key).

This opens up the JSON code that describes the book. It can be copied to Notepad for editing, or directly to another PAW book.

 

More Information:

https://www.acgi.com/blog/copying-tabs-across-books-in-ibm-planning-analytics-workspace

https://community.ibm.com/community/user/blogs/george-tonkin/2023/02/18/copying-a-tab-from-one-paw-book-to-another

https://www.ibm.com/docs/da/planning-analytics/2.0.0?topic=books-create-book

Product:

Power BI Dataflow

Issue:

After the user account (owner) is changed, the dataflow is not updated. (This only applies when you use a Power BI gateway to access an on-prem SQL database.)

 

Solution:

On the dataflow, you have gone to settings and changed the owner to another person (by pressing the Take Over button).

On the dataflow tab go to EDIT

Click EDIT on one table.

You get an error message that it cannot contact the SQL server.

Click on OPTIONS icon.

Click on project and data load, and change the data gateway to one in the dropdown list.

Click OK.

Now the table should be refreshed automatically, and you can save and close the dataflow.

 

More Information:

https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-configure-consume

https://learn.microsoft.com/en-us/power-query/dataflows/data-sources?tabs=power-bi-service

https://data-marc.com/2022/08/02/what-you-must-know-when-building-power-bi-dataflows-routing-over-the-on-premises-data-gateway/