Product:
Microsoft SQL Server

Issue:
How do you make a view list only the data rows from the last 3 years?

Solution:

You need a column with the date in your fact table. If the date column in your fact table is an int, you have to join it with a date conversion table or use CAST/CONVERT.

We have used a date table (DM.dimdate).

The key_dimdate is an integer, and the date column uses the SQL date data type. The date data type makes it easy to compare against a date condition in SQL.
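The filter logic can be sketched in Python before writing the SQL: an integer key like 20240131 is split into a date, which is then compared against a cutoff 3 years back. This is just an illustration of the comparison, with made-up keys:

```python
from datetime import date

def key_to_date(key_dimdate: int) -> date:
    # a key like 20240131 becomes date(2024, 1, 31)
    s = str(key_dimdate)
    return date(int(s[:4]), int(s[4:6]), int(s[6:8]))

def within_last_years(d: date, years: int, today: date) -> bool:
    # mirrors: DATEADD(year, -years, GETDATE()) < f.[Date]
    # (note: replace() would fail if today is Feb 29)
    cutoff = today.replace(year=today.year - years)
    return cutoff < d

keys = [20200115, 20230601, 20241231]
recent = [k for k in keys if within_last_years(key_to_date(k), 3, date(2025, 1, 1))]
```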

Create a SQL query similar to this:

SELECT a.[Customer]
,a.[Account]
,a.[Order]
,a.[key_dimDate]
,a.[key_dimVersion]
,a.[Amount]
,a.[startDate] as [startDate]
,a.[endDate] as [endDate]
,a.[IsActive] as [IsActive]
FROM [DM].[facttable] a
INNER JOIN [DM].[dimDate] f ON a.[key_dimDate] = f.[key_dimDate]
WHERE 1=1
AND a.[IsActive] = 'Y'
-- AND DATEADD(year, -3, GETDATE()) < f.[Date] -- will list 3 years
AND DATEADD(quarter, -13, GETDATE()) < f.[Date]  -- will list 39 months

 

If you get an error like "Column 'id' in field list is ambiguous", then you have missed setting the alias letter in front of some of the column references in the SQL query.

If you are getting the error "Arithmetic overflow error converting expression to data type datetime.", then f.[Date] in the SQL above is an int, and you have to convert it to a date.

Using only CONVERT(DATETIME, [key_dimDate], 103) can give an overflow error, because SQL Server treats an int source as a number of days since 1900-01-01 (the style 103 only applies to character sources).
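The two readings of the integer can be illustrated in Python; the same out-of-range error shows up when 20240101 is read as a day offset:

```python
from datetime import date, datetime, timedelta

BASE = date(1900, 1, 1)  # SQL Server's epoch when converting int to datetime

def int_as_day_offset(n: int) -> date:
    # what CONVERT(DATETIME, n) effectively does: n days after 1900-01-01
    return BASE + timedelta(days=n)

def int_as_yyyymmdd(n: int) -> date:
    # the intended reading: parse the digits as a calendar date
    return datetime.strptime(str(n), "%Y%m%d").date()

# int_as_yyyymmdd(20240101) gives 2024-01-01, while
# int_as_day_offset(20240101) raises OverflowError: 20 million days is far past year 9999
```

In T-SQL the usual fix is to go via a string first, for example CONVERT(date, CAST([key_dimDate] AS varchar(8)), 112).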

Change the SQL to reflect your columns and tables.

 

More Information:

https://www.w3schools.com/sql/func_sqlserver_convert.asp 

https://www.sqlshack.com/sql-server-functions-for-converting-string-to-date/ 

https://www.w3schools.com/sql/func_sqlserver_dateadd.asp 

Let's say you need to add five months to the current date; use this:

SELECT * FROM YourTable
WHERE YourDate < DATEADD(month, 5, GETDATE())

I used the function GETDATE() to get the current date and time.

If you need to subtract some time, just pass a negative value as the second parameter:

SELECT * FROM YourTable
WHERE YourDate < DATEADD(month, -5, GETDATE())
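One detail of DATEADD(month, …) worth knowing is that it clamps the day when the target month is shorter (January 31 plus one month lands on the last day of February). A small Python helper that mimics this behaviour, as an illustration only:

```python
import calendar
from datetime import date

def add_months(d: date, n: int) -> date:
    # mirrors T-SQL DATEADD(month, n, d): shift the month and clamp the
    # day to the last day of the target month (Jan 31 + 1 month -> Feb 29/28)
    total = d.year * 12 + (d.month - 1) + n
    year, month0 = divmod(total, 12)
    month = month0 + 1
    day = min(d.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)
```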

https://koz.tv/sql-query-for-todays-date-minus-year-month-day-or-minute/ 

https://www.mssqltips.com/sqlservertip/2509/add-and-subtract-dates-using-dateadd-in-sql-server/ 

https://www.sqlshack.com/how-to-add-or-subtract-dates-in-sql-server/ 

 

To see the SQL Server Agent job log for a time period, try this in SQL Server (it will not work in Azure SQL Database):

SELECT
@@SERVERNAME as 'Server', j.name as 'Job Name',
jh.run_date as 'Run Date', jh.run_status as 'Job Status'
FROM msdb.dbo.sysjobs j
LEFT OUTER JOIN (
SELECT ROW_NUMBER() OVER(PARTITION BY jh.job_id ORDER BY jh.run_date DESC) AS row_num, jh.*
FROM msdb.dbo.sysjobhistory jh
WHERE
jh.step_id = 0 AND
jh.run_date >= CONVERT(varchar(8), DATEADD(DAY, -7, GETDATE()), 112) AND
jh.run_date <= CONVERT(varchar(8), GETDATE() + 1, 112)
-- ORDER BY jh.run_date DESC
) AS jh ON jh.job_id = j.job_id
WHERE j.enabled = 1 AND jh.run_status = 0 -- run_status 0 = failed
ORDER BY j.name, jh.run_date;
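sysjobhistory stores run_date as a yyyymmdd integer and run_time as an hhmmss integer, and run_status is a numeric code. A Python sketch of how those columns decode (sample values are made up):

```python
from datetime import datetime

RUN_STATUS = {0: "Failed", 1: "Succeeded", 2: "Retry", 3: "Canceled", 4: "In Progress"}

def decode_run(run_date: int, run_time: int) -> datetime:
    # run_date is yyyymmdd and run_time is hhmmss, both stored as int
    # (a run at 00:05:03 is stored as 503, so zero-pad before parsing)
    return datetime.strptime(f"{run_date:08d}{run_time:06d}", "%Y%m%d%H%M%S")
```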

 

https://www.w3schools.com/sql/sql_join.asp 

Here are the different types of the JOINs in SQL:

  • (INNER) JOIN: Returns records that have matching values in both tables
  • LEFT (OUTER) JOIN: Returns all records from the left table, and the matched records from the right table
  • RIGHT (OUTER) JOIN: Returns all records from the right table, and the matched records from the left table
  • FULL (OUTER) JOIN: Returns all records when there is a match in either left or right table

Product:

Linux Mint
Microsoft Windows 10

Problem:

How make a bootable usb to install a OS to old computer?  Old computer only have CDROM drive, and modern OS does not fit on that. They need GB of space.

Solution:

Download the ISO you need – for example Linux Mint from here:

https://www.linuxmint.com/download.php 

Download software to create the USB drive, like:

https://etcher.balena.io/#download-etcher 

https://rufus.ie/en/ 

How to do it:

https://youtu.be/764JLB13GjE?si=Gby6iZeC79FGZLll 

Start the program balenaEtcher.

Select the iso file to use.

Select the USB stick to use.

Click on Flash to create the usb media.

If you get an error, reboot your laptop and try again.

Ensure that the computer you are going to install Linux on is set, in the BIOS boot setup, to start from the USB stick before the hard drive.

https://www.zdnet.com/article/how-to-install-linux-on-an-old-laptop/ 

https://itsfoss.com/install-linux-mint/ 

 

For old computers with a 32-bit processor, you need to install the 32-bit version of Linux.

https://www.linuxmint.com/edition.php?id=308 

https://www.debugpoint.com/32-bit-linux-distributions/

 

What does the name of a Windows OS ISO file mean?

This post is not made to list every possible filename; it only summarizes what others have written about the subject. The ISO codes may vary slightly between products and editions.

We believe that J_CPRA_X64FRE is Windows 10 Pro 64 bit.

The coding does not seem to be formally or comprehensively documented, but you can assemble hints from various scattered observations around the web.

J_CPRA_X64F looks like a short-form version designed to fit in the legacy 11-character disk label from the old FAT volume label limits. J is the Windows 10 release. CPRA is the Pro edition, where "C" is for "client" (as opposed to "server"), "PR" is for "Professional", and "A" is the variant of the Professional edition (most times there is only one). X64 is the CPU architecture. F is short for FRE, a free (final retail) build, as opposed to a CHK (checked) build used for debugging. (source: https://forums.whirlpool.net.au/archive/2468594)
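Based only on the informal convention described above, a label like J_CPRA_X64FRE could be decoded mechanically. This parser and its lookup tables are illustrative guesses assembled from the observations in this post, not an official Microsoft format:

```python
import re

# Hypothetical decoder for labels like J_CPRA_X64FRE: release letter,
# edition code, architecture, build type -- per the informal convention above.
RELEASES = {"G": "Windows 7", "H": "Windows 8", "J": "Windows 10"}
EDITIONS = {
    "CPRA": "Client Professional",
    "CENA": "Client Enterprise",
    "CEDA": "Client Educational",
}

def parse_label(label: str) -> dict:
    release, edition, rest = label.split("_")
    m = re.match(r"(X86|X64)(FRE|F|CHK)$", rest)
    return {
        "release": RELEASES.get(release, "unknown"),
        "edition": EDITIONS.get(edition, "unknown"),
        "arch": m.group(1),
        "build": "free (retail)" if m.group(2) in ("FRE", "F") else "checked (debug)",
    }
```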

The typical full ISO file name is longer, including the language and region, and sometimes the target medium. For example, JM1_CCSA_X64FRE_EN-US_DV9 was the Windows 10 Technical Preview install DVD (dual layer, 8.5 GB, aka DVD9) and JM1_CCSA_X64FRE_EN-US_DV5 is a Windows 10 install DVD (single layer, 4.7 GB, aka DVD5). EN-US is US English. X64 is for the x86 64-bit version. JM1 is a pre-release of "Redstone" (Windows 10); CCSA is the "Windows Technical Preview Edition."

Other Windows 10 edition examples include:

CENA (Client, ENterprise, “A”)
CEDA (Client, EDucational, “A”)

You can see some historic product-to-volume-label mappings at https://support.microsoft.com/en-us/help/889713/how-to-determine-the-channel-that-your-copy-of-windows-server-2003-was

and also

Original CD/DVD Volume Labels for Windows

Some of the Windows 7 media labels can be found at Get Windows 7 SP1-U Media Refresh MSDN/TechNet ISO (Download or Convert) « My Digital Life. and Official Windows 7 SP1 ISO from Digital River « My Digital Life.

Windows 7 releases started with the letter “G”.

Windows 8 releases started with the letter “H”. (Windows 8 Enterprise x86 Volume Licensing ISO Leaked « My Digital Life)

Windows 10 uses the letter “J”. I would presume that “I” was skipped to avoid confusion with “1”.

https://www.quora.com/How-do-I-determine-what-version-of-Windows-installation-is-on-my-USB-drive-like-J_CPRA_X64F 

The two Windows 7 ISO file names can be explained like this:
——–
Image

  1. Language of the OS. Always two letters (with one exception). Example: en, en-gb, cs, da, no, sv.
  2. Minor version build of the OS.
  3. Compile date of the OS (not of the ISO). Indicates YYMMDD-HHMM.
  4. Platform architecture and target. x86 = 32-bit 8086-based, x64 = 32-bit 8086-based with 64-bit AMD64 extensions. fre = Free, for end user. chk = Checked, debug version. Examples: x86fre, x64chk.
  5. SKU version. Examples: enterprise, enterprisen, professional, starter.
  6. Full language tag of the OS. Must match (1). Examples: en-us, en-gb, sv-se.
  7. Volume license identifier. Optional.
  8. Upgrade identifier. Optional.
  9. Original equipment manufacturer identifier. Optional.
  10. Matches (5).
  11. Volume label of the mounted ISO volume. Contains various codes to identify language, SKU, volume/OEM and media.

——–
Image

  1. Language of the OS. Always two letters (with one exception). Example: en, en-gb, cs, da, no, sv.
  2. Product name. Examples: windows_7, windows_8, windows_server_2012.
  3. SKU version. Examples: enterprise, enterprisen, professional, starter.
  4. Integrated service pack level. Optional.
  5. Platform architecture and target. x86 = 32-bit 8086-based, x64 = 32-bit 8086-based with 64-bit AMD64 extensions. chk = Checked, debug version. Examples: x86, x64chk.
  6. Storage media
  7. Update flag. The ISO was updated with some critical patch.
  8. Unique MSDN/Technet image number ID.

——–
Image

  1. Language of the OS. Always two letters. Example: en, cs, da, no, sv.
  2. ?
  3. Single or double layer DVD. DVD5 = 4.7GB
  4. Product name. Win = Windows.
  5. SKU version. Examples: Pro, Pro KN, Ent.
  6. "With". Means it's an integrated installation with a service pack. Combined with (7) for full service pack level information. Optional.
  7. Integrated service pack level. Optional.
  8. Platform target. 32BIT = x86, 64BIT = x64.
  9. Language of the OS.
  10. Updated ISO. Base ISO with added KB update.
  11. Microsoft Licensing Fulfillment
  12. Microsoft Part Number as it appears on the physical installation media.

https://www.betaarchive.com/forum/viewtopic.php?t=26026 

More information:

https://itsfoss.com/watch-tv-channels-on-ubuntu-or-linux-mint/ 

Downloads

https://kodi.tv/ 

https://www.tecmint.com/linux-media-center-distros/ 

https://www.computerhope.com/history/processor.htm 

Have fun!

Product:
PowerBI Portal service

Problem:
How to import a CSV file to a dataflow from a SharePoint area.

If you use the SharePoint connection, you may get an error like this:

an exception occurred: DataSource.Error: Microsoft.Mashup.Engine1.Library.Resources.HttpResource: Request failed:
OData Version: 3 and 4, Error: The remote server returned an error: (404) Not Found. (Not Found)
OData Version: 4, Error: The remote server returned an error: (404) Not Found. (Not Found)

Solution:

There exist different solutions to this issue; the one that works for you can depend on how your company has set up the security.

Go to your power bi portal https://app.powerbi.com/home?experience=power-bi

Open up your workspace area where you have administrator rights.

Click NEW DATAFLOW

Click Add new tables

Select csv file, not SharePoint file.

From your SharePoint folder, where you have stored your file, copy the link.

Then edit the link in Notepad: remove /:x:/r/ between .sharepoint.com/ and /teams, and also remove everything after the ?.

Then you get a "clean" URL path that will work. Like this (replace with your company info):

https://company.sharepoint.com/teams/powerbiworkspacename/foldername/General/enkel.csv
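The link cleanup can be sketched in Python; the /:x:/r/ marker is what Excel-file share links insert (other file types use markers like /:w:/r/ or /:b:/r/), and everything after ? is the share token:

```python
def clean_sharepoint_link(url: str) -> str:
    # drop the share token (everything after "?") ...
    url = url.split("?", 1)[0]
    # ... and strip the file-type marker SharePoint inserts after the host
    for marker in ("/:x:/r/", "/:w:/r/", "/:b:/r/"):
        url = url.replace(marker, "/")
    return url
```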

Paste the adjusted URL into the link-to-file field.

You may need to set the data gateway to "none".

Enter "Organizational account" as the authentication kind. If your SharePoint is part of your company, you will be prompted with your company Azure login, unless you are already logged in to Azure in your web browser.

If all works, you get a preview of the file. Change the file origin to ensure that the special characters in the file are handled correctly. 65001: Unicode (UTF-8) will support Swedish characters.

Click on transform data.

You will now have a view similar to the Power BI Desktop transform view, where you can change the data before it is loaded into the cache.

The code is similar to below:

Csv.Document(Web.Contents("https://company.sharepoint.com/teams/powerbiworkspacename/foldername/General/enkel.csv"), [Delimiter = ",", Columns = 2, Encoding = 65001, QuoteStyle = QuoteStyle.None])

More Information:

https://www.linkedin.com/pulse/analytics-tips-connecting-data-from-sharepoint-folder-diane-zhu

https://learn.microsoft.com/en-us/power-query/connectors/sharepoint-folder 

https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-create 

https://www.phdata.io/blog/how-and-when-to-use-dataflows-in-power-bi/ 

Dataflow Gen2 (fabric) is indeed an enhancement over the original Dataflow. One of the key improvements is the ability to separate your Extract, Transform, Load (ETL) logic from the destination storage, providing more flexibility. Gen2 also comes with a more streamlined authoring experience and improved performance.

For a more detailed comparison, you can refer to this link:

Differences between Dataflow Gen1 and Dataflow Gen2 – Microsoft Fabric | Microsoft Learn

Datamart primarily utilizes data streaming technology to import data into Azure SQL Server. Datamart then automatically generates and links datasets. You can then actually create data streams that connect to the Datamart, which can be used for DirectQuery or import if the Advanced Compute Engine is enabled.

For a more detailed comparison, you can refer to this link:

Introduction to datamarts – Power BI | Microsoft Learn

Product:
Cognos Controller 11.0.1200
Windows Server 2022

Issue:

When you press the Excel icon in Cognos Controller, Excel starts but there is no Controller toolbar inside.

Solution:

Check that you have two lines for Cognos Controller under Options – Add-ins.

If not, ensure that one is not disabled.

1. Launch Excel

2. Click “File – Options”

3. Click “Add-ins”:

4. Change “Manage” to “Disabled Items” and click ‘Go’:

5. Highlight the Controller add-in (“cognos controller link for Microsoft excel (adxloader.Controller.ExcelLink.dll)”), and click “Enable“:

6. Test.

If that does not help, try to add the add-in manually.

More information:

https://www.ibm.com/support/pages/missing-menu-item-controller-excel-excel-add-not-visible-active-caused-com-add-inside-disabled-items 

 

Product:
Planning Analytics Workspace 95
Microsoft Windows 2019 server

Issue:

In the PAW administration page, for the agents, the version number is not showing the PAA agent version installed.

Version 2.0.93.1280

When you have installed PAA_Agent version 95, by a command similar to this:

UpdatePAAAgent.bat “D:\Program Files\ibm\cognos\tm1_64”

Solution:

This value comes from the file D:\Program Files\ibm\cognos\tm1_64\paa_agent\paaAgentCache\serversInfo.json

Test erasing the folder D:\Program Files\ibm\cognos\tm1_64\paa_agent\paaAgentCache in your lab environment before you install the paa-agent, to see if that solves the issue. In most cases this does not work, and you have to reinstall PAL.

Our guess is that the different versions of the PAA Agent use different scripts/folders and therefore this file is not correctly updated.

 

 

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.1.0?topic=components-planning-analytics-administration-agent-local-only 

 

 

Product:
Microsoft Azure

Issue:
You get a mail that some of your subscription roles expire in a few days, like "Your Storage File Data Privileged Contributor role in the prod subscription will expire in 1 day(s)".

Solution:

The exact steps depend on how your Azure accounts are set up and handled by your organisation, but it should be similar to this:

(you need to be owner of the subscription to be able to extend the role for yourself)

Go to azure  Home – Microsoft Azure

Search for PIM.  Click on Privileged Identity Management

Expand tasks and click on my roles.

Go to Azure resources, and for the subscriptions you need to extend, go to the Owner line and activate yourself as owner.

Click on Activate. After someone has approved, go to the next step.

Click on Privileged Identity Management, to get back to the start page of PIM.

Now click on Azure resources under Manage, and from the dropdown on the new page select the subscription you want to handle.

Click on Manage button, after you have selected a subscription.

Click on Assignments under Manage. Then browse the lines to the right to find which roles you can extend. Click on the Extend link.

Set a date at least a year in the future. Press Save. Repeat for all roles you need to extend.

If you do not find your role, check under the Expired assignments tab to see if it is there; you can extend it from there.

Repeat for all subscriptions you have in Azure.

 

More Information:

Renew Azure resource role assignments in PIM – Microsoft Entra ID Governance | Microsoft Learn

Extend or renew PIM for groups assignments – Microsoft Entra ID Governance | Microsoft Learn

What is Privileged Identity Management? – Microsoft Entra ID Governance | Microsoft Learn

 

Product:
Planning Analytics 2.0.9.19
Microsoft Windows 2019 server

Issue:
We changed the password for the system account that we use when calling TI processes inside TM1 instances, and now Tm1RunTi.exe does not work.

Solution:

Find the folder that contains the password and key files, and run the command below to create the new password files:

tm1crypt.exe  -keyfile  d:\arkiv\CognosUser_key.dat  -outfile  d:\arkiv\CognosUser_cipher.dat  -validate

(the above will create the files in the folder d:\arkiv\; then you have to copy them to the correct folder that you refer to in your code)

You have to enter the new password for your service account twice.

#======================
# Parameters
#======================

sQuote = Char(34);

sRunTIPath=sQuote |'D:\Program Files\ibm\cognos\tm1_64\bin64\TM1RunTI.exe'|sQuote ;
sPwdFile='CognosUser_cipher.dat';
sPwdKeyFile='CognosUser_key.dat';
sPasswordFile=sQuote |'D:\TM1 data\Tm1_Trigger\'|sPwdFile|sQuote ;
sPasswordKeyFile=sQuote |'D:\TM1 data\Tm1_Trigger\'|sPwdKeyFile|sQuote ;
sUserName='Cognos';
sUser=sQuote | sUserName | sQuote ;
sProcess = sQuote | pProcessName | sQuote ;
sInstance=sQuote|pInstance|sQuote;
sNameSpace=sQuote|'domain'|sQuote;
sAdminHost=sQuote|'servername'|sQuote;


#=====================
# Execute Command
#=====================

sCommand = sRunTIPath | ' -Process ' | sProcess | ' -Adminhost ' | sAdminHost | ' -Server ' | sInstance | ' -CAMNamespace ' | sNameSpace |
' -User ' | sUser | ' -passwordfile ' | sPasswordFile | ' -passwordkeyfile ' | sPasswordKeyFile ;
ExecuteCommand(sCommand , 0 );

 

The above, in a prolog tab, will execute TM1RunTI.exe with the parameters you provide: the TI process (pProcessName) to run and the TM1 instance (pInstance) to use.

You need to update the TI code above with the correct Windows server name and Active Directory domain name, point to the correct folder where you saved the .dat files, and use the Windows user account that you use for access to the TM1 models.
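The command assembly in the prolog above just wraps each argument in double quotes and concatenates the parts. The same logic in Python, as a sketch (all names and paths are example values):

```python
def quote(s: str) -> str:
    # wrap an argument in double quotes, like the sQuote logic in the TI code
    return '"' + s + '"'

def build_tm1runti_command(exe, process, adminhost, instance, namespace, user,
                           pwdfile, pwdkeyfile):
    parts = [
        quote(exe),
        "-Process", quote(process),
        "-Adminhost", quote(adminhost),
        "-Server", quote(instance),
        "-CAMNamespace", quote(namespace),
        "-User", quote(user),
        "-passwordfile", quote(pwdfile),
        "-passwordkeyfile", quote(pwdkeyfile),
    ]
    return " ".join(parts)
```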

 

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=encryption-run-tm1crypt-utility 

https://code.cubewise.com/blog/4-ways-to-speed-up-your-processes-in-ibm-tm1-and-planning-analytics/

 

Product:
Planning Analytics 2.0.9.19
Microsoft Windows 2019 server
Planning Analytics Workspace version 95

Problem:

Where are the files uploaded to?

When you try to create a new folder in PAW version 95 and get an error like below, then something with the installation went wrong.

The following administration agent error occurred – {“errorMessage”:{},”query”:”/pakate-agent/v0/filemanager/Planning%20Sample/files”,”httpStatusCode”:500}

 

Suggestion:
The file manager will only access a folder under your application's data folder called model_upload, like C:\Program Files\ibm\cognos\tm1_64\samples\tm1\PlanSamp\model_upload. (We think you can edit this folder name in the paa-agent bootstrap.properties file.) Try changing this value:

MODEL_UPLOAD=model_upload

To upload a file, go to PAW and WORKBENCH.

Right-click on the application you want to work with files in and select File manager.

Click on the upload icon, drag the file to the area, and press the upload button.

When the file is uploaded, close above dialog.

The file is put in a folder below the data folder for your TM1 instance.

 

 

If the paa-agent does not work as expected, then the files are created here, for all TM1 instances:

D:\Program Files\ibm\cognos\tm1_64\paa_agent\wlp\usr\servers\kate-agent\model_upload

Check the messages.log file in the folder above for more information on why it does not work.

Reinstall the PAA agent to try to resolve the issue.

The problem could be related to one or more old files in the paa_agent folder structure; test the steps below.
1. Stop the "IBM Planning Analytics Administration Agent" service
2. Remove the cache folders ".logs" and ".classCache" in ../tm1_64/paa_agent/wlp/usr/servers
3. Remove the folder "workarea" in ../tm1_64/paa_agent/wlp/usr/servers/kate-agent
4. Restart the "IBM Planning Analytics Administration Agent" service
Wait 5 minutes as the Agent starts, before you test again.
Do not remove the D:\Program Files\ibm\cognos\tm1_64\paa_agent\paaAgentCache folder, as it looks like that corrupts the PAA agent installation.

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=model-using-file-manager 

https://www.ibm.com/support/pages/planning-analytics-administration-agent-didnt-work-error-exception-javalangnullpointerexception 

Working with files and folders in File manager

You can use the toolbar to create folders and sub-folders, upload files, refresh files, and search for a folder or file. Use the Actions button, or select a file or folder, to access options such as Download, Delete, Cut, Copy, and Compress.

Deleting a file or folder
To delete an item, select the file or folder and click Delete. You can also click the Action button for the item and click Delete.

Warning: When you delete a folder or sub-folder, all files within that folder are also deleted.
Cut, copy, and paste files
You can move or copy files or folders to a different folder within the File manager. The Cut and Copy options are available from the Actions button or display in the toolbar when you select one or more items.
To cut or copy, select one or more items and click Cut or Copy. Your selections display in alphabetical order as a horizontal list for a quick review before you cut or copy the items.
Selected files for cutting or copying display in the File manager
After you review the list of items that you want to cut or copy, you can paste the items to a different folder. To paste a cut or copied item, go to the destination folder and click Paste.
Compress and expand files
Select one or more files or folders and click Compress to compress the items into a .zip file. You can compress a file and move it to a new location without using Cut. To do this, select a file and click Compress. Then, go to the destination folder and click Compress again.
To expand a compressed file, select the file and click Expand. As with compressing a file to a new location, you can expand a file to a new folder.

 

Product:
Planning Analytics 2.0.9.19
Microsoft Windows 2019 server
Microsoft® Excel® for Microsoft 365 (Version 2308 Build 16.0.16731.20542) 64-bit

Issue:

When using the drill function in TM1 Perspectives you get an error; if you do the same task in a different version of TM1 Perspectives it works.

Excel crashes and then restarts, recovering your xls file. The drill down does not work.

Solution:

Install and use a different version of TM1 Perspectives.

The issue started with version 2.0.9.7 and was first solved in version 2.0.9.16 of Planning Analytics TM1 Perspectives.

So if you use TM1 Perspectives 2.0.9.19 with 64-bit Excel, you do not have an issue.

Or if you use TM1 Perspectives 2.0.9.6 with 32-bit Excel, you should also be fine.

Planning Analytics 2.0.9.11 is the last release to include 32-bit versions of the TM1 clients (Architect and TM1 Perspectives).

More Information:

Determine the version of IBM Planning Analytics (cubewise.com)

Using the Drill option in 64-bit IBM Cognos TM1 Perspectives causes an “Excel Stack Exceeded” error.

IBM Planning Analytics 2.0 Fix Lists

IBM Planning Analytics – Setting up Basic DRILL THROUGH Functionality – QueBIT

Drill through cells to see detailed data – IBM Documentation

Product:
Planning Analytics 2.0.9.19
Microsoft Windows 2016 server

Issue:
The security scan program reports that port 12345 is not secure, because of the ssl-cve-2016-2183-sweet32 issue.

Possible solution:

Try to implement this limit on the ciphers that TM1 can use internally for communication.

Login to the PAL (planning analytics) server as admin.

Stop TM1 admin service and all the other TM1 instance services.

Open Cognos Configuration for TM1.

Add below to Supported ciphersuites:

TLS_RSA_WITH_AES_128_CBC_SHA256,TLS_RSA_WITH_AES_256_CBC_SHA256, TLS_RSA_WITH_AES_128_GCM_SHA256,TLS_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256,TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384

 

Then open the TM1S.CFG file for each instance, and add the below at the end of the file:

tlsCipherList=TLS_RSA_WITH_AES_128_CBC_SHA256, TLS_RSA_WITH_AES_256_CBC_SHA256,TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384

 

Save the file.

Start the TM1 Admin service first, and then the other TM1 instances one by one.

Check that you can login.

Wait and see if the security scan reports one problem less.
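SWEET32 concerns 64-bit block ciphers (DES and 3DES suites), which is why none of them appear in the cipher lists above. A quick sanity check of a cipher list string can be sketched in Python:

```python
def has_sweet32_ciphers(cipher_list: str) -> bool:
    # DES/3DES suites are the 64-bit block ciphers flagged by SWEET32;
    # their suite names contain "3DES" or "_DES_"
    suites = [c.strip() for c in cipher_list.split(",")]
    return any("3DES" in c or "_DES_" in c for c in suites)

tls_cipher_list = ("TLS_RSA_WITH_AES_128_CBC_SHA256,"
                   "TLS_RSA_WITH_AES_256_GCM_SHA384,"
                   "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256")
```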

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=pa2f82-disable-des-3des-ciphers-in-planning-analytics-mitigate-false-positive-security-scans 

https://www.rapid7.com/db/vulnerabilities/ssl-cve-2016-2183-sweet32/ 

https://en.wikipedia.org/wiki/List_of_TCP_and_UDP_port_numbers 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=pitf-tlscipherlist