Product:

Azure Data Factory

Issue:

How do you connect to an Azure SQL private endpoint from Azure Data Factory (ADF) using a managed identity?

Solution:

In your Azure subscription, ensure that both ADF and SQL are on the same subnet. Then no firewalls need to be opened between them.

On your Azure SQL server, set Networking – Public access to Disabled. Ensure you have created private endpoints for your Azure SQL resource.
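If you prefer the CLI, below is a minimal sketch of the same two settings (the resource group, server, vnet, and subnet names are placeholders, and the flags assume a current az version):

# disable public network access on the logical SQL server
az sql server update --name sql-name --resource-group my-rg --enable-public-network false

# create a private endpoint for the SQL server in your subnet
az network private-endpoint create --name pe-sql --resource-group my-rg --vnet-name my-vnet --subnet my-subnet --group-id sqlServer --connection-name pe-sql-conn --private-connection-resource-id $(az sql server show --name sql-name --resource-group my-rg --query id -o tsv)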

Set up a managed identity for your Azure Data Factory. This creates an identity that the Data Factory can use to get access to other Azure resources.

    • In the Azure portal, go to your Azure Data Factory resource.
    • Under the “Settings” section, select “Managed identity”.
    • Enable the system-assigned managed identity for your Data Factory.

Grant the managed identity access to the SQL Azure database (a CLI sketch follows the list):

    • Go to your SQL Azure Database resource.
    • Under the “Settings” section, select “Access control (IAM)”.
    • Click on “+ Add” and add a role assignment.
    • Select the appropriate role (e.g., “Contributor” or “SQL Server Contributor”) and search for the name of your Azure Data Factory.
    • Select the Data Factory name from the search results and click “Save”.
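The same role assignment can be scripted. A sketch below, assuming the az datafactory CLI extension is installed and using placeholder names:

# look up the Data Factory's system-assigned identity (principal id)
az datafactory show --name adf-name --resource-group my-rg --query identity.principalId -o tsv

# assign a role on the SQL server scope to that principal id
az role assignment create --assignee <principalId> --role "Contributor" --scope $(az sql server show --name sql-name --resource-group my-rg --query id -o tsv)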

You can also grant ADF access inside the SQL server by running these commands in SSMS:

-- run in the master database
CREATE LOGIN [adf-name] FROM EXTERNAL PROVIDER
CREATE USER [adf-name] FROM LOGIN [adf-name] WITH DEFAULT_SCHEMA=[dbo]

-- run in the user SQL database
CREATE USER [adf-name] FROM LOGIN [adf-name]
ALTER ROLE [db_owner] ADD MEMBER [adf-name]

 

Configure the Linked Service in Azure Data Factory:

    • Open your Azure Data Factory resource in the Azure portal.
    • Click on “Launch Studio”.
    • Go to the “Manage” section.
    • Click on the “Linked services” tab and select “New”.
    • Choose the appropriate SQL Server connector (e.g., “Azure SQL Database”).
    • Provide the required connection details (a CLI sketch follows this list):
      Connect via integration runtime, e.g. integrationRuntime2 (Managed Virtual Network).
      Connection string, with account selection method “Enter manually”.
      Enter the SQL server name (fully qualified domain name), like sql-name.database.windows.net.
      Enter the database name.
      For the authentication type, under “Managed private endpoint”, select System Assigned Managed Identity – then all values should be filled in automatically.
    • Click on “Test Connection” to validate the connection.
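The linked service can also be created from the CLI. A rough sketch below, assuming the az datafactory extension and that the JSON matches what ADF Studio generates (all names are placeholders; with no credential in the connection string, ADF should fall back to the managed identity):

az datafactory linked-service create --resource-group my-rg --factory-name adf-name --linked-service-name ls-sqldb --properties '{"type": "AzureSqlDatabase", "typeProperties": {"connectionString": "Server=tcp:sql-name.database.windows.net,1433;Database=db-name;"}, "connectVia": {"referenceName": "integrationRuntime2", "type": "IntegrationRuntimeReference"}}'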

 

Use the Linked Service in Azure Data Factory:

      • Now you can use the configured linked service to connect to the Azure SQL database private endpoint in your datasets, which are used by the integration pipelines within Azure Data Factory.

By following these steps, you will be able to establish a connection from Azure Data Factory to an Azure SQL database private endpoint using a managed identity.

More information:

https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql 

https://techcommunity.microsoft.com/t5/azure-sql-blog/private-endpoints-for-azure-sql-managed-instance/ba-p/3782015 

https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

Product:

Microsoft Azure Storage Account

Issue:

When doing the Microsoft Learn section:

https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/9-initialize-the-storage-account-model?pivots=csharp

and try to list the content in a blob, you get this message:

There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.

You also can add --auth-mode login in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.

In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.

 

Solution:

The blob container photos has been created, which you can check directly in your subscription's storage account. But verifying it with the command below is not obvious:

az storage container list \
--account-name <name>

The <name> should be replaced with your unique storage account name.

The result returned is verbose, but if you find the lines below, the command was a success. Keep in mind that you need to enter the CLI command on one line.

 },
"immutableStorageWithVersioningEnabled": false,
"metadata": null,
"name": "photos",
"properties": {

As this message is a warning and not an error, you can add --only-show-errors to suppress warnings, like this:

az storage container list --only-show-errors --account-key  <your key> --account-name <your account>

The --auth-mode key option will be deprecated in the future, so try to use another method, for example the one shown below.
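For example, if your account has the required RBAC roles, you can authenticate with Azure AD instead of the account key:

az storage container list --auth-mode login --account-name <name>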

As this warning message is new, it may break your scripts, as they do not expect the message.

You can get more information by adding --debug to the command, like:

az storage container list --debug --account-name  <name>

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-data-operations-cli 

 

When you don’t specify the authentication type, the CLI will try to get the access key of the storage account. This requires the Microsoft.Storage/storageAccounts/listKeys/action permission. If you have the Contributor role on the storage account, you have the required permission.

--auth-mode login means it will use Azure AD authentication to connect to the storage. You can use one of the built-in roles to access the storage (see documentation):

  • Storage Table Data Contributor
  • Storage Table Data Reader

When using AAD auth, you could also disable access key authentication, as sketched below.
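A sketch of disabling shared-key (access key) authentication on the account; the account and resource group names are placeholders:

az storage account update --name <account> --resource-group <rg> --allow-shared-key-access false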

There is a good article related to RBAC management and the data plane model: Assign an Azure role for access to blob data.

Product:
Microsoft Azure File share

Issue:

How do you use POSTMAN to upload a file to AZURE file storage with the REST API?

Solution:

Download the POSTMAN program from https://www.postman.com/downloads/

Go into your AZURE subscription and to your storage account to get the shared access signature (SAS); that is a URI that grants restricted access rights to Azure Storage resources.

As this is a file share you should select allowed service to be FILE, and allowed resource types to be OBJECT.

Set the end date for expiry to a year from now.

Leave the Allowed IP addresses field blank to allow any computer to access the account. (Keep the DNS and firewall setup so that only computers from your company can reach the Azure area.)

Allowed protocols should be HTTPS only.

Click on Generate SAS and connection string. You must copy these strings and save them in Notepad. You cannot show them again after you have left this Azure page.

The connection string contains all the info you need. Inside Notepad, split it up to keep only the information you need in one string for the file share. Copy the text after FileEndpoint=.

You should get something like this:

https://xyz.file.core.windows.net/?sv=2022-11-02&ss=f&srt=o&sp=rwdlc&se=2025-07-31T22:45:45Z&st=2023-05-30T14:45:45Z&spr=https&sig=xyzsecretkeyxyz

sv= is the version of the REST API; you may need to add this value as a header, like x-ms-version: 2022-11-02

se= is the end date for the connection key to work, like 2025-07-31T22:45:45Z

st= is the start date for the connection key to work, like 2023-05-30T14:45:45Z

sig= is the key value that gives you full access to the area. Do not share it with others.

sp= is what kind of rights you have given, e.g. read write delete list create.

In your storage account file share, you may have created some subfolders, like testfiles. Click on File shares to find the name and any subfolders underneath it. Click on the file share and click Browse to find the folder name where you have Authentication method: Access key. This is the folder you can access.

Update your URL to contain the path and the filename of the file you want to create, like https://xyz.file.core.windows.net/testfiles/test.txt?sv=2022-11-02……

Start POSTMAN. Do not log in – skip that. Create a new HTTP request.

Select PUT and paste in your URL. POSTMAN will then interpret your values and list them as parameters.

With a file share, you must do the REST API call in two steps: first create the file with the correct size, then make a second call to fill the file with data. This is different from BLOB storage, where you can do it in one REST API call. (A curl sketch of both calls follows the steps below.)

In POSTMAN go to the Headers tab and add two keys:

x-ms-type = file

x-ms-content-length = 1

Here we set the length of the file to 1 character (1 byte). (This will work as long as you only use a–z characters and UTF-8 encoding.)

Click on the SEND button, and if all is correct you should get: 201 Created.

Browse to your AZURE file storage and check that the file was created, with a size of 1.

To write to the file, add these two keys in Headers:

x-ms-write = update

x-ms-range = bytes=0-0

The x-ms-range value should always start at 0 and end at one less than the total number of characters in your file. If the file is 42 characters, the value should be bytes=0-41.

Important: in the Params tab you must add a key as below (this activates the range function – otherwise x-ms-range is not used):

comp = range

Then we need to add some data in POSTMAN to write to the file: go to the Body tab, select raw – Text, and enter a letter.

The text should be the same size as the file you have created. The file size and the text you put into the file must match exactly to the byte.

Click on SEND, and you should get Status: 201 Created if all is fine.
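For reference, here is a minimal sketch of the same two calls done with curl instead of POSTMAN (the URL and SAS token are placeholders from the steps above; note that the SAS query string must be kept inside the quotes):

# step 1: create an empty 1-byte file (Create File operation)
curl -X PUT "https://xyz.file.core.windows.net/testfiles/test.txt?<SAS>" -H "x-ms-version: 2022-11-02" -H "x-ms-type: file" -H "x-ms-content-length: 1" -H "Content-Length: 0"

# step 2: write 1 byte of data into the file (Put Range operation, note comp=range)
curl -X PUT "https://xyz.file.core.windows.net/testfiles/test.txt?comp=range&<SAS>" -H "x-ms-version: 2022-11-02" -H "x-ms-write: update" -H "x-ms-range: bytes=0-0" -H "Content-Length: 1" --data-binary "A"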

Common errors you can see in POSTMAN are:

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Authentication scheme Bearer for files is not supported in this version.

This is solved by adding the correct version, like: x-ms-version: 2022-11-02

You should also try to have headers like:

Authorization: Bearer
x-ms-type: file

<Error><Code>UnsupportedHttpVerb</Code><Message>The resource doesn't support specified Http Verb.

This is solved by using PUT instead of POST.

<Error><Code>ShareNotFound</Code><Message>The specified share does not exist.

This is solved by entering the correct path and filename in the URL.

HTTP Error 400.  The request host name is invalid.

Solved by entering the correct host name.

HTTP Error 411. The request must be chunked or have a content length

Solved by entering the Content-Length header.

HTTP Error 404. The specified resource does not exist.

HTTP Error 400. Value for one of the query parameters specified in the request URI is invalid.

HTTP Error 404. The specified parent path does not exist.

Solved by entering the correct path to the files in AZURE.

One of the HTTP headers specified in the request is not supported.
<HeaderName>x-ms-range</HeaderName>

Solved by adding the key comp in the Params tab.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-content-length</HeaderName>

Solved by adding the key x-ms-content-length.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-write</HeaderName>

Solved by adding the key x-ms-write.

The value for one of the HTTP headers is not in the correct format.
<HeaderName>Content-Length</HeaderName>
<HeaderValue>1</HeaderValue>

This is solved by entering the correct value in the x-ms-range key, or by adding comp = range in the Params tab.

 

More Information:

https://www.mikaelsand.se/2019/11/simple-how-to-upload-a-file-to-azure-storage-using-rest-api/
https://www.petecodes.co.uk/uploading-files-to-azure-blob-storage-using-the-rest-api-and-postman/
https://www.serverless360.com/blog/azure-blob-storage-vs-file-storage
https://raaviblog.com/how-to-use-azure-blob-storage-service-rest-api-operations-using-postman/
http://www.mikaelsand.se/2020/06/oauth-with-azure-the-just-make-it-work-edition/

https://azureops.org/articles/connect-azure-sql-from-data-factory-using-managed-identity/ 
https://www.datahai.co.uk/power-bi/connecting-power-bi-to-azure-sql-database-using-private-endpoints/ 

https://en.wikipedia.org/wiki/Spy_vs._Spy

 

Product:
Cognos Controller 10.4.2
Microsoft Windows Server 2022

Issue:

Suddenly, users cannot log in to Cognos Controller.

They get an error like this:

System.Web.Services.Protocols.SoapException: Server was unable to process request. ---> System.ArgumentNullException: Value cannot be null.
Parameter name: uriString
at System.Uri..ctor(String uriString)
at System.Web.Services.Protocols.WebClientProtocol.set_Url(String value)
at Cognos.Controller.Common.CRNBridge.CRNBridge.set_EndPointURL(String sURL)
at Cognos.Controller.Proxy.CCRWS.GetUserInfo(String sGuid, String sUser, String passportId)
--- End of inner exception stack trace ---
at Cognos.Controller.Forms.Common.Main.DoLoginCognos8(Form& frm, Boolean runtimCheck)
at CCR.AppContext.DoLogin()
at CCR.AppContext.Login()

Solution:

Restart the IIS service on the Cognos Controller server.
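A quick way to do this from an elevated command prompt on the Controller server:

iisreset /restart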

 

Steps to check the issue:

Log in to the Cognos Controller server (via Remote Desktop).

Check that all IBM Cognos services are running.

Start IE and browse to CA11 – does it work?

http://controllerserver.domain.com/ibmcognos

Start IIS manager and check that all application pools are running.

Go to Component Services from Control Panel – Administrative Tools. Expand the COM+ components.

Ensure that “IBM Cognos Controller Consolidation” is running. If not, restart IIS from inside the IIS Manager program.

Check the Windows event log for any error messages that can explain why any of the above processes have stopped.

More Information:

https://blog.ittoby.com/2014/07/why-schannel-eventid-36888-36874-occurs.html 

https://allthingscognos.wordpress.com/category/cognos-controller/

Product:

Planning Analytics 2.0.9.16

Microsoft Windows 2019 server

Issue:

How do you change TM1 Application Web (pmpsvc) to use CAM SSO security when it has previously been set up to use native TM1 security?

Solution:

This applies if you only have one TM1 application connected to TM1 App Web (common in new installations – like Planning Sample in our example).

Stop the “IBM Cognos TM1” service (pmpsvc web server).

Open the file fpmsvc_config.xml in the D:\Program Files\ibm\cognos\tm1_64\webapps\pmpsvc\WEB-INF\configuration folder.

Remove the Planning Sample line from the <servers> section, so it looks like this:

<servers>
</servers>
</admin_host>
</tm1>

Save the file.

Change your Planning Sample tm1s.cfg file to have the correct values, like below:

IntegratedSecurityMode=5

ServerCAMURI=http://cognosserver.domain.com:9300/p2pd/servlet/dispatch
ClientCAMURI=http://cognosserver.domain.com:80/ibmcognos/bi/v1/disp

Save tm1s.cfg file and restart the planning sample service.

Test logging in to Planning Sample in TM1 Architect; it should work with CAM SSO if all is correct.

Then browse to your tm1 app web on:

http://planninganalyticsserver.domain.com:9510/pmpsvc

If all works well, you should get to the configuration page, where you can select a TM1 instance. Select a TM1 instance that is up and uses CAM security. All TM1 applications that are used inside TM1 Application Web (Contributor) must have the same security settings; the most common is IntegratedSecurityMode=5.

Save the settings and you should get into the IBM Cognos TM1 Applications portal. If you need to edit the configuration later, click on the tools icon.

On the Cognos TM1 Applications configuration web page, click on the Edit link below the selected server names.

Then fill out the fields for any change needed.

Admin Host = should be the server name of the Windows server where the TM1 Admin service is running (normally the TM1 server itself).

Server Name = should be the selected TM1 instance that users will first be authenticated against. It should be using CAM SSO, as described above.

Cognos BI Gateway URI = should point to the gateway, like http://cognosserver.domain.com:80/ibmcognos/bi/v1/disp

Cognos BI Dispatchers URI = should point to the CA11 server on port 9300, like http://cognosserver.domain.com:9300/p2pd/servlet/dispatch

Click OK to save; there should not be any errors. If there are errors, check in IE on the server whether you can browse to the above URLs for Cognos BI.

 

If you turn off the Planning Sample application and change it to IntegratedSecurityMode=5 without changing the fpmsvc_config.xml file, then you get a TM1 login dialog in TM1 App Web when you connect, and you cannot log in. This is because pmpsvc is set up for native security. You need to change Planning Sample back to IntegratedSecurityMode=1 to be able to log in to TM1 App Web again.

To clear a TM1 application from the TM1 App Web connections, so you can work with the TM1 instance in TM1 Web, you need to run this TI process:

}tp_admin_delete_all

 

More Information:

https://www.ibm.com/docs/sr/planning-analytics/2.0.0?topic=web-configuring-tm1-application 

https://allthingscognos.wordpress.com/2014/08/26/configuring-performance-modeller-and-tm1-web-with-cam-security-for-tm1-10-2-n/ 

You can also edit the pmpsvc_path\WEB-INF\configuration\log4j.properties file to change the log level to INFO or DEBUG and get a lot more info in WEB-INF\logs\pmpsvc.log, or check the folder D:\Program Files\ibm\cognos\tm1_64\wlp\usr\servers\tm1\logs for messages.log files.

https://www.ibm.com/support/pages/how-manually-reset-deployed-tm1-applications 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=mctaip-resetting-application-in-portal 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=security-standard-cognos-tm1-authentication

Product:

Microsoft SQL Azure database

Issue:

A newly created native SQL user [Kalle] cannot see the tables in the database, but he can log in from SSMS.

Solution:

Please do not use the roles db_datareader or db_datawriter, or their deny equivalents. They are for backwards compatibility only.

Removing the user from the role with the command below did not help:

EXEC sp_droprolemember 'db_datareader', 'Kalle'

You have to drop the user and create him again;

-- run in the user database
DROP USER Kalle

-- run in the master database
DROP LOGIN Kalle

USE Master

CREATE LOGIN Kalle WITH PASSWORD = 'advancedpasswordhere'

-- to be able to log in from SSMS you need to have the user in the master database
CREATE USER Kalle FOR LOGIN Kalle

-- run in your user database to give access there
CREATE USER Kalle FOR LOGIN Kalle

-- give the user Kalle access to see all tables in the DM schema
GRANT SELECT ON SCHEMA::DM TO Kalle

This should give the user read-only access to all tables and views that are part of the DM schema in the database.

To list members of built-in roles, use:

SELECT DP1.name AS DatabaseRoleName,
       ISNULL(DP2.name, 'No members') AS DatabaseUserName
FROM sys.database_role_members AS DRM
RIGHT OUTER JOIN sys.database_principals AS DP1
  ON DRM.role_principal_id = DP1.principal_id
LEFT OUTER JOIN sys.database_principals AS DP2
  ON DRM.member_principal_id = DP2.principal_id
WHERE DP1.type = 'R'
ORDER BY DP1.name;

To list whether any user has DENY rights, use:

SELECT l.name as grantee_name, p.state_desc, p.permission_name, o.name
FROM sys.database_permissions AS p JOIN sys.database_principals AS l 
ON p.grantee_principal_id = l.principal_id
JOIN sys.sysobjects O 
ON p.major_id = O.id 
WHERE p.state_desc ='DENY'

 

More information:

https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-droprolemember-transact-sql?view=sql-server-ver16 

https://blog.sqlauthority.com/2017/03/02/sql-server-unable-see-tables-objects-ssms/ 

https://www.mssqltips.com/sqlservertip/2385/unable-to-see-a-sql-server-table/

Product:
Cognos Controller 10.4.2

Issue:

After a reboot of the TM1 server where the FAP service is running, FAP does not connect to the TM1 services, with errors in the D:\Program Files\ibm\cognos\ccr_64\Server\FAP\error.log file:

DEBUG [fap.service.schedule.ConnectionsPoller] [pool-5-thread-2], Starting connection poller for datamart 'FAP' with a 3600000 ms timeout.

ERROR [fap.service.schedule.Scheduler] [pool-5-thread-2], Could not find the TM1 server [tm1:FAP]

Suggested solution:

Change the FAP service to “Automatic (Delayed Start)” in Windows Services.

The TM1 instance must be started before the FAP service is started.
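A sketch of setting the delayed start from an elevated command prompt (the exact service name is an assumption – check the name in services.msc first; note the space after start=):

sc.exe config "IBM Cognos FAP Service" start= delayed-auto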

More Information:

https://www.ibm.com/docs/en/cognos-controller/11.0.0?topic=administration-fap-web-configuration

https://www.ibm.com/support/pages/fap-connection-tm1-server-renewed

Product:

Microsoft AZURE file storage

Issue:

How do you upload a file to AZURE file storage?

Suggestion:

Download AZURE STORAGE EXPLORER and install it: https://learn.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer?tabs=windows

Connect to Azure with your account from inside Azure Storage Explorer.

Expand the tree on the left to your file share.

Click on the Upload icon on the right.

Find an example file and upload it from the correct folder you want to upload files from.

Click upload and watch the program work.

When finished, click on the link ‘Copy AzCopy Command to Clipboard’ in the lower right corner, next to the log message.

Paste this into NOTEPAD.

Edit the string to be the way you want it.

Download azcopy.exe to a folder like d:\script from https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10 

Open a PowerShell command window.

Go to the d:\script folder.

Paste your azcopy command from Notepad into the PowerShell session, and it will copy the files you defined.

 

Maybe you can programmatically change the PowerShell script for azcopy to use it from a scheduling program. A sketch of such a command is shown below.
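A sketch of what such an azcopy command can look like (the account, share, folder, and SAS token are placeholders; the SAS is generated as described in the file share article above):

azcopy copy "D:\script\upload\*" "https://<account>.file.core.windows.net/<share>/<folder>?<SAS>" --recursive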

 

More Information:

https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal 

https://www.51sec.org/2022/08/12/using-azcopy-to-migrate-local-files-to-azure-blob-storage/

https://learn.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string 

https://www.sqlshack.com/getting-started-with-azure-storage-explorer/ 

https://learn.microsoft.com/en-us/azure/storage/blobs/quickstart-storage-explorer

https://youtu.be/owXHtmQLQNY

Product:

SQL AZURE server

Issue:

Cannot connect to the AZURE SQL server in SSMS.

Error like:

The server was not found – msg 11001

Solution:

You need a new DNS record that points to the server name in Azure (db.database.windows.net).

There is more than one DNS zone where you need to update DNS records for your resource (privatelink.database.windows.net).
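You can check the name resolution from the client; with a working private endpoint and DNS, the server name should resolve through the privatelink zone to a private IP address:

nslookup db.database.windows.net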

 

If you get a different error – the server is reached but refuses to allow your user – then try some other login method, like:

Azure Active Directory – integrated

Azure Active Directory with MFA

SQL Server authentication (native login)

The allowed login methods change depending on where you try to connect from and how the firewall/DNS is set up.

 

More information:

https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns 

https://learn.microsoft.com/en-us/sql/relational-databases/errors-events/mssqlserver-11001-database-engine-error?view=sql-server-ver16

https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-overview?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-configure?view=azuresql&tabs=azure-powershell 

Product:

Cognos Analytics 11.1.7  kit_version=11.1.7-2304260612

Issue:

How do you apply a fix pack for CA 11.1.7?

Solution:

Download the fix pack from IBM. https://www.ibm.com/support/pages/node/6985631 

 

Make a backup of the content store by exporting it from inside Cognos Connection:

On the Cognos Analytics ‘Welcome’ dashboard, click the ‘Manage’ tab and select ‘Administration Console’.

Select the ‘Configuration’ tab and click ‘Content Administration’ on the left-hand side.

In the top right-hand side, click the icon ‘New Export’.

Specify a name for your new export (full backup contentstore) and click ‘Next’.

Click the ‘Select the entire Content Store’ radio button and select ‘Next’.

Choose the location where you want to store the deployment archive. Click ‘Next’.

Assign a password to your archive. The password must contain at least 8 characters. Re-enter your password for confirmation and then click ‘OK’.

Verify the details you have input before clicking ‘Next’.

Select ‘Save and run once’.

Specify a time when you want to run the export and click ‘Run’.

Click ‘OK’ to complete the export.

 

To save time, move the content of folder D:\Program Files\ibm\cognos\analytics\deployment to d:\temp before the upgrade. Copy the few files you need back after the upgrade.

Back up the configuration in Cognos Configuration:

  1. Open Cognos Configuration.
  2. Click File > Export As.
  3. Select a location and enter a file name for the XML file.
  4. Click Save.

Make a backup of the following files to a d:\temp folder (a copy-command sketch follows the list):
D:\Program Files\ibm\cognos\analytics\webcontent\planning.html
D:\Program Files\ibm\cognos\analytics\webcontent\pmhub.html
D:\Program Files\ibm\cognos\analytics\webcontent\web.config
D:\Program Files\ibm\cognos\analytics\webcontent\tm1\web\tm1web.html

D:\Program Files\ibm\cognos\analytics\webcontent\bi\planning.html
D:\Program Files\ibm\cognos\analytics\webcontent\bi\pmhub.html
D:\Program Files\ibm\cognos\analytics\webcontent\bi\web.config
D:\Program Files\ibm\cognos\analytics\webcontent\bi\tm1\web\tm1web.html

D:\Program Files\ibm\cognos\analytics\templates\ps\portal\variables_CCRWeb.xml
D:\Program Files\ibm\cognos\analytics\templates\ps\portal\variables_plan.xml
D:\Program Files\ibm\cognos\analytics\templates\ps\portal\variables_TM1.xml

D:\Program Files\ibm\cognos\analytics\configuration\cclWinSEHConfig.xml
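A simple command-prompt sketch for copying these files aside (paths taken from the list above – repeat the copy line for each file, and adjust to your installation):

mkdir d:\temp\ca-backup
copy "D:\Program Files\ibm\cognos\analytics\webcontent\planning.html" d:\temp\ca-backup\
rem ... repeat for each file listed above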

Restore only the missing files after the installation.
Files to be preserved during an upgrade are listed in the D:\Program Files\ibm\cognos\analytics\configuration\preserve\.ca_base_preserve.txt file. Do not edit this file. Instead, edit the D:\Program Files\ibm\cognos\analytics\configuration\preserve\preserve.txt file if you want to remove or preserve certain files or directories when upgrading.

################################################################
#
# IBM Confidential
#
# IBM Cognos Products: Preserve Files by the Install
#
# (C) Copyright IBM Corp. 2017
#
# Edit this file (preserve.txt) to remove or preserve files or directories when upgrading. 
#
# 
# Instructions:
#
# - Edit preserve.txt before running an upgrade on an existing install.
# - Use '#' at the beginning of a line to insert a comment.
# - The keyword "exclude:" can be used to remove files inside a preserved directory (see examples below).
# - List directories or files relative to the installation root directory (see examples below).
#
#
# e.g.: To remove this file: <installdir>/media/samples.doc, add this line:
# exclude:media/samples.doc 
#
# e.g.: To preserve the file <installdir>/msgsdk/cm_ldkspec.xml, add this line:
# msgsdk/cm_ldkspec.xml
#
# e.g.: To preserve the contents of the folder: <installdir>/cps/sap/webapps, add this line
# cps/sap/webapps
#
# Note on order of precedence: Files to be excluded should be specified first (before the directories which contain them).
#
################################################################

# Specify files to exclude first


# Specify files or folders to preserve

If you have changed security or use certificates, then you also need to back up all the certificate store files.

 

Stop the Cognos Analytics service and close down Cognos Configuration. Stop the Apache or IIS web server services.

Launch the downloaded installation file (analytics-installer-2.2.27-win.exe) and follow the wizard.

Choose your Language and click Next.

Choose what you want to install – for an upgrade this will be IBM Cognos Analytics – and click Next.

Choose to Accept the license and click Next.

Choose the location. This must be the location of your Cognos Analytics instance that you would like to upgrade and also the shortcut folder name. Click Next.

Click Yes to confirm you are Installing in the same location and are overwriting a previous installation.

Click Install at the summary screen.

When complete click Done to complete the upgrade.

Open Cognos Configuration – you will be prompted that older versions of configuration files were found and that they have been upgraded to the latest version. Click OK and save your configuration.

Repeat the steps for all servers in your distributed environment; then start the Cognos Analytics Content Manager services first, followed by the rest.

Check the file D:\Program Files\ibm\cognos\analytics\cmplst.txt to see what version is installed.

 

More Information:

https://www.ibm.com/support/pages/ibm-cognos-analytics-11x-fix-lists 

https://www.ibm.com/support/pages/how-export-entire-content-store-cognos-analytics-11 

https://pmsquare.com/analytics-blog/2022/6/8/how-to-find-your-cognos-version-build-and-common-name

https://www.ibm.com/docs/en/cognos-analytics/11.1.0?topic=servers-copying-cognos-analytics-certificate-another-server

https://www.ibm.com/support/pages/how-add-3rd-party-ca-allow-ssl-between-components-ibm-cognos-analytics-11