Product:
Microsoft SQL Server 2016

Issue:

How do I create a new database from a BACPAC file in SSMS?

Solution:

Start SSMS and connect to the target SQL server.

Right-click the Databases node and select “Import Data-Tier Application”.

Click Next.

Select the BACPAC file to import and click Next.

Enter the name of the new database and click Next.

Click Finish.

Wait while the import runs.

When the operation completes, click Close.

You now have the new database on the target server, including its data.

 

More Information:

https://4sysops.com/archives/dacpac-and-bacpac-in-sql-server/ 

https://blogs.msmvps.com/deborahk/deploying-a-dacpac-with-sql-server-management-studio/ 

https://www.sqlshack.com/importing-a-bacpac-file-for-a-sql-database-using-ssms/ 

A DAC is a logical database management entity that defines all of the SQL Server objects associated with a user’s database. A BACPAC includes the database schema as well as the data stored in the database.
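As an alternative to the SSMS wizard, the same import can be done from the command line with the SqlPackage tool. The file path, server name, and database name below are placeholders, and authentication options will vary with your environment:

```cmd
REM Import a BACPAC into a new database (names are example values)
SqlPackage /Action:Import ^
  /SourceFile:"C:\temp\mydatabase.bacpac" ^
  /TargetServerName:"myserver" ^
  /TargetDatabaseName:"MyNewDatabase"
```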

Product:
Microsoft SQL Server 2016 Standard
Microsoft Windows Server 2016

Problem:
How do I load data from a CSV file into SQL Server when the table contains more columns than the text file?

If you run BULK INSERT directly against the table, the unmatched target columns are filled with data from the next row of the file, and you get an inconsistent table.

Solution:

Create a view of the table that exposes fewer columns, and then bulk insert into the view.

From inside SSMS, script out a SELECT of the table, add CREATE VIEW … AS to the first line, and remove the columns you do not want:

CREATE VIEW StorageView AS
SELECT [Organisation]
      ,[Version]
      ,[Belopp]
FROM StorageTable;

Then use the view in your BULK INSERT instead, so the number of columns in the CSV file matches the number of columns in the (view) target:

BULK INSERT [StorageView]
FROM 'C:\temp\storagefile.csv'
WITH (
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\r',
FIRSTROW = 2
);

 

FIELDTERMINATOR sets the column separator used in the CSV file.
ROWTERMINATOR sets the character that marks the end of a row/record.
FIRSTROW = 2 starts the import at the second line, so the header line in the CSV file is not read.

If you get an error like this:

Msg 4832, Level 16, State 1, Line 16
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 16
The OLE DB provider “BULK” for linked server “(null)” reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 16
Cannot fetch a row from OLE DB provider “BULK” for linked server “(null)”.

Then the CSV file probably has a blank line as its last line; edit the file so it does not contain any empty lines.
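A quick way to strip the empty lines, sketched in PowerShell using the example path from above (note this removes all blank lines, not only the trailing one):

```powershell
# Read the whole file first (parentheses force it), drop empty lines, write it back
(Get-Content 'C:\temp\storagefile.csv') |
    Where-Object { $_.Trim() -ne '' } |
    Set-Content 'C:\temp\storagefile.csv'
```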

 

More Information:

https://www.sqlservertutorial.net/sql-server-administration/sql-server-bulk-insert/ 

https://www.w3schools.com/sql/sql_view.asp

https://www.w3schools.com/sql/sql_update.asp

https://www.mssqltips.com/sqlservertip/6109/bulk-insert-data-into-sql-server/

Product:
Planning Analytics 2.0.9.18
Microsoft Windows 2019 Server

Issue:

When running a TI process that has worked before, we get random errors like this:

Error in process completion: Unable to save subset “computed subsetname here” in dimension “Version”

Possible Solution:

A newly installed anti-virus program may be blocking the TM1 TI process from deleting old subset files in the data folder.

If you use Microsoft Defender, ask for the data folder and its subfolders to be excluded from scanning, e.g. d:\tm1\budget\data.

The Defender anti-virus software runs from a folder like C:\ProgramData\Microsoft\Windows Defender\Platform\4.18.23080.2006-0
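If you administer the server yourself, the exclusion can be added from an elevated PowerShell prompt with the standard Defender cmdlet (the path is the example folder from above):

```powershell
# Exclude the TM1 data folder (and its subfolders) from Defender scanning
Add-MpPreference -ExclusionPath 'D:\tm1\budget\data'
```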

More Information:

https://answers.microsoft.com/en-us/windows/forum/all/windows-defender-real-time-protection-service/fda3f73e-cc0a-4946-9b9d-3c05057ef90c

https://www.softeng.es/en/blog/microsoft-defender-for-endpoint-the-solution-to-protect-detect-and-respond-to-the-most-advanced-attacks/ 

https://code.cubewise.com/bedrock 

Product:

Microsoft SQL Azure server

Issue:

Need to change a column from nvarchar(200) to nvarchar(20). How do I change the nvarchar length of a column in a table?

Solution:

Check the data length in the column (nvarchar is Unicode, which uses 2 bytes per character):

SELECT DATALENGTH(Column_Name) AS FIELDSIZE, Column_Name
FROM Table_Name
ORDER BY FIELDSIZE DESC;

Trim the column values, then change the column type:

UPDATE Table_Name SET Column_Name = LEFT(COALESCE(Column_Name, ''), 20);
ALTER TABLE Table_Name ALTER COLUMN Column_Name nvarchar(20) NOT NULL;

If you have constraints or indexes on the column, you need to drop them first, e.g. DROP INDEX index_name ON table_name;

https://www.w3schools.com/sql/sql_ref_drop_constraint.asp
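For a named constraint (rather than an index), the general form is as below; the constraint and table names are placeholders:

```sql
ALTER TABLE Table_Name DROP CONSTRAINT constraint_name;
```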

Otherwise, rename the table:

sp_rename 'old_table_name', 'new_table_name'

Create a new table with the old name. In SSMS, select the table and from the menu choose “Script table as” – CREATE To – New Query Editor Window. Adjust the code to use the new nvarchar length, for example:

CREATE TABLE [dbo].[table2](
[index] [int] NOT NULL,
[Name] [nvarchar](20) NULL,
[Country] [nvarchar](50) NULL,
[Employess] [int] NULL
) ON [PRIMARY]

 

Copy the data over to the new table:

INSERT INTO table2 (column1, column2, column3, ...)
SELECT column1, column2, column3, ...
FROM table1
WHERE condition; 

For the table above the example is as below; the LEFT(COALESCE([Name], ''), 20) ensures that only the first 20 characters are copied.

INSERT INTO table2 ([index], [Name], [Country], [Employess])
SELECT [index], LEFT(COALESCE([Name], ''), 20), [Country], [Employess]
FROM table1;

 

If you do not use the LEFT function, you may get an error like “String or binary data would be truncated”.

More Information:

https://javarevisited.blogspot.com/2016/03/how-to-increase-length-of-existing-VARCHAR-column-in-SQL-Server.html 

https://www.sqlservertutorial.net/sql-server-basics/sql-server-nvarchar/

https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-rename-transact-sql?view=sql-server-ver16 

How to rename tables in SQL Server with the sp_rename command

https://www.w3schools.com/sql/sql_insert_into_select.asp

https://www.tutorialrepublic.com/sql-tutorial/sql-cloning-tables.php

https://www.w3schools.com/sql/sql_ref_insert_into_select.asp

https://www.w3schools.com/sql/func_sqlserver_left.asp

Because index is a reserved keyword in SQL, you need to put it inside [ ] for SQL Server to understand that it is a column name.

Product:
Microsoft SQL Azure

Issue:
System is slow, and it looks like TEMPDB is working hard. What can we do?

Solution:

Check what is going on in SQL TEMPDB with this query:

SELECT [Source] = 'database_transactions',
[session_id] = ST.session_id,
[transaction_id] = ST.transaction_id,
[login_name] = S.login_name,
[database_id] = S.database_id,
[program_name] = S.program_name,
[host_name] = S.host_name,
[database_id] = DT.database_id,
[database_name] = CASE
WHEN D.name IS NULL AND DT.database_id = 2 THEN 'TEMPDB'
ELSE D.name
END,

[log_reuse_wait_desc] = D.log_reuse_wait_desc,
[database_transaction_log_used_Kb] = CONVERT(numeric(18,2), DT.database_transaction_log_bytes_used / 1024.0 ),
[database_transaction_begin_time] = DT.database_transaction_begin_time,
[transaction_type_desc] = CASE DT.database_transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END,

[transaction_state_desc] = CASE DT.database_transaction_state
WHEN 1 THEN 'The transaction has not been initialized.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has been initialized but has not generated any log records.'
WHEN 4 THEN 'The transaction has generated log records.'
WHEN 5 THEN 'The transaction has been prepared.'
WHEN 10 THEN 'The transaction has been committed.'
WHEN 11 THEN 'The transaction has been rolled back.'
WHEN 12 THEN 'The transaction is being committed. (The log record is being generated, but has not been materialized or persisted.)'
END,

[active_transaction_type_desc] = CASE AT.transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END,

[active_transaction_state_desc] = CASE AT.transaction_state
WHEN 0 THEN 'The transaction has not been completely initialized yet.'
WHEN 1 THEN 'The transaction has been initialized but has not started.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has ended. This is used for read-only transactions.'
WHEN 4 THEN 'The commit process has been initiated on the distributed transaction.'
WHEN 5 THEN 'The transaction is in a prepared state and waiting resolution.'
WHEN 6 THEN 'The transaction has been committed.'
WHEN 7 THEN 'The transaction is being rolled back.'
WHEN 8 THEN 'The transaction has been rolled back.'
END

FROM sys.dm_tran_database_transactions DT
INNER JOIN sys.dm_tran_session_transactions ST ON DT.transaction_id = ST.transaction_id
INNER JOIN sys.dm_tran_active_transactions AT ON DT.transaction_id = AT.transaction_id
INNER JOIN sys.dm_exec_sessions S ON ST.session_id = S.session_id
LEFT JOIN sys.databases D ON DT.database_id = D.database_id
WHERE DT.database_id = 2 -- tempdb
ORDER BY ST.session_id, DT.database_id;

 

 

More Information:

https://techcommunity.microsoft.com/t5/azure-database-support-blog/azure-sql-db-and-tempdb-usage-tracking/ba-p/1573220 

https://techcommunity.microsoft.com/t5/azure-database-support-blog/resolve-tempdb-related-errors-in-azure-sql-database/ba-p/3597944 

 

Product:
Microsoft Azure Database SQL

Issue:
When running a stored procedure (SP) you get an error after a long time:

Sql error number: 40544. Error Message: The database ‘tempdb’ has reached its size quota.

Suggested solution:

If possible, increase your DTU tier; this resets tempdb and frees space so you can start over.

In SSMS, check which SQL Azure tier you are on:

SELECT Edition = DATABASEPROPERTYEX('databasename', 'EDITION'),
       ServiceObjective = DATABASEPROPERTYEX('databasename', 'ServiceObjective'),
       MaxSizeInBytes = DATABASEPROPERTYEX('databasename', 'MaxSizeInBytes');

 

In SSMS, run this to see the current file size of tempdb:

SELECT [Source] = 'database_files',
[TEMPDB_max_size_MB] = SUM(max_size) * 8 / 1024.0,
[TEMPDB_current_size_MB] = SUM(size) * 8 / 1024.0,
[FileCount] = COUNT(FILE_ID)
FROM tempdb.sys.database_files
WHERE type = 0 --ROWS

In our case we had used up the 13.9 GB tempdb file size limit that applies to the first tiers. The query below shows how much of that space is actually in use:

SELECT 
(SUM(unallocated_extent_page_count)*1.0/128) AS [Free space(MB)]
,(SUM(version_store_reserved_page_count)*1.0/128) AS [Used Space by VersionStore(MB)]
,(SUM(internal_object_reserved_page_count)*1.0/128) AS [Used Space by InternalObjects(MB)]
,(SUM(user_object_reserved_page_count)*1.0/128) AS [Used Space by UserObjects(MB)]
FROM tempdb.sys.dm_db_file_space_usage;

 

Service-level objective   Max tempdb data file size (GB)   Number of tempdb data files   Max tempdb data size (GB)
Basic                     13.9                             1                             13.9
S0                        13.9                             1                             13.9
S1                        13.9                             1                             13.9
S2                        13.9                             1                             13.9
S3                        32                               1                             32

 

The recommended solution is to find what is causing the large tempdb usage, by checking your query plans in SSMS.

Then tighten your table column definitions to what you actually need; use nvarchar(50) instead of nvarchar(max), etc.

Review the queries in your stored procedures, and add an index on the columns that you think will narrow the selection fastest.

A DTU in Azure is a combination of CPU usage per second and read/write I/O per second to disk. When you have used up your quota, there is a limit on how many bytes you can write to disk per second, so your process will still succeed, but it will take much longer, as only a small amount of data is processed each second.

A database transaction unit (DTU) represents a blended measure of CPU, memory, reads, and writes. Service tiers in the DTU-based purchasing model are differentiated by a range of compute sizes with a fixed amount of included storage, fixed retention period for backups, and fixed price.

https://blog.atwork.at/post/Azure-Subscription-and-Service-Limits 

https://www.spotlightcloud.io/blog/what-is-dtu-in-azure-sql-database-and-how-much-do-we-need

More information:

https://knowledge-base.havit.eu/2018/02/19/azure-sql-the-database-tempdb-has-reached-its-size-quota-partition-or-delete-data-drop-indexes-or-consult-the-documentation-for-possible-resolutions-microsoft-sql-server-error-40544/ 

https://techcommunity.microsoft.com/t5/azure-database-support-blog/resolve-tempdb-related-errors-in-azure-sql-database/ba-p/3597944 

https://sqlcoffee.com/Azure_0013.htm 

https://www.brentozar.com/archive/2018/02/memory-grants-sql-servers-public-toilet/

Top five considerations for SQL Server index design

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql#tempdb-sizes 

https://www.brentozar.com/archive/2023/09/oops-i-lost-my-indexes-in-azure-sql-db/ 

https://www.sqlshack.com/sql-index-overview-and-strategy/ 

https://learn.microsoft.com/en-us/sql/relational-databases/sql-server-index-design-guide?view=sql-server-ver16  

https://datasimantics.com/2018/08/24/sql-servers-nvarcharmax-and-how-to-wield-it/ 

https://www.ibm.com/support/pages/only-first-1024-characters-nvarcharmax-column-are-presented-report-based-dqm-package

https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tiers-dtu?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql 

To see sessions with open transactions in tempdb, run the “database_transactions” query listed in the previous section.



 

Product:
Microsoft SQL Azure

Issue:
How many rows are there in a database table?

Solution:

Enter in SSMS:

sp_spaceused 'dbo.tablename';

This gives you both the number of rows and the space used by the table.
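An alternative that reads the row count from catalog metadata instead of scanning the table; the table name is a placeholder:

```sql
-- Row count from partition statistics (index_id 0 = heap, 1 = clustered index)
SELECT SUM(p.row_count) AS TableRows
FROM sys.dm_db_partition_stats AS p
WHERE p.object_id = OBJECT_ID('dbo.tablename')
  AND p.index_id IN (0, 1);
```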

 

More Information:

https://www.brentozar.com/archive/2014/02/count-number-rows-table-sql-server/

https://www.sqlshack.com/capture-databases-usage-stats-using-sp_spaceused-powershell/ 

Product:

Planning Analytics 2.0.9.18

Microsoft Windows 2019 server

Issue:

A TM1 TI process that has worked before now crashes with the error:

Error: Data procedure line (0): Exception Occurred during process execution: TM1MemoryException: Temporary pool exceeded

Possible Solution:

Change the Tm1s.cfg file for your application to use a bigger value for MaximumViewSize, e.g. MaximumViewSize=1000.

Restart the TM1 service, and run your process again.

MaximumViewSize is a per thread limitation.

More information:

https://www.ibm.com/support/pages/warning-noted-tm1-server-log-memorytemppoolexceeded 

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=pitf-applymaximumviewsizetoentiretransaction#concept_ur1_bbp_rq 

By default MaximumViewSize checks individual view processing. For example, if 10 views are processed in a single transaction, the threshold is crossed only if the processing of any single view crosses the threshold. See MaximumViewSize.

With ApplyMaximumViewSizeToEntireTransaction=T parameter set to True, the cumulative memory usage of all views processed in a single transaction is compared against the threshold value. This allows the memory size threshold to catch more transactions that consume large amounts of memory.
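A minimal Tm1s.cfg fragment combining the two parameters discussed above (the values are examples, not recommendations):

```ini
[TM1S]
MaximumViewSize=1000
ApplyMaximumViewSizeToEntireTransaction=T
```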

Product:
Planning Analytics Local 2.0.9.18
TM1_version=TM1-AW64-ML-RTM-11.0.918.3-0
Microsoft Windows 2019 server

Problem:

After an upgrade of the Planning Analytics Local installation (PAL), TM1 App Web is not showing the applications.

Suggested Solution:

Stop the service IBM Cognos TM1

Go to folder D:\Program Files\ibm\cognos\tm1_64\webapps\pmpsvc\WEB-INF\configuration

Rename fpmsvc_config.xml to fpmsvc_config.xml.old.txt

Rename fpmsvc_config.xml.new to fpmsvc_config.xml

Copy the lines between the tm1 markers, which contain the names of the applications:

<tm1>
 </tm1>

from the file fpmsvc_config.xml.old.txt into the new fpmsvc_config.xml.

Now the applications and the gateway URI should be listed in the new file.

Start the service IBM Cognos TM1

More information:

https://www.ibm.com/support/pages/tm1-application-does-not-appear-tm1-applications-list-after-deployment

Product:

Planning Analytics Workspace 88 (the file version.ps1 in folder paw\config contains the version number)
$env:PAW_BUILD="121"
$env:PAW_RELEASE="2.0.88"
Microsoft Windows 2019 server

Issue:

What containers should be running in a working PAW installation?

Solution:

Start PowerShell as administrator and enter the command:

docker ps

This lists all running containers – in a working installation they should be these:

CONTAINER ID IMAGE                                                               PORTS                        NAMES
b6874749d0a5 127.0.0.1:5000/planninganalytics/prism-platform:3.0.2365.2-ltsc2019 9080/tcp                     prism-platform
5f104714f851 127.0.0.1:5000/planninganalytics/bss:1.0.1397-ltsc2019              8082/tcp                     bss
a946de8db063 127.0.0.1:5000/planninganalytics/pa-gateway:1.0.1098-ltsc2019       0.0.0.0:80->80/tcp, 0.0.0.0:443->443/tcp pa-gateway
83e22e0be1f8 127.0.0.1:5000/planninganalytics/neo-provision:1.0.392-ltsc2019     8083/tcp                     neo-provision
6d11be2a5fd7 127.0.0.1:5000/planninganalytics/neo-idviz:9.0.982-ltsc2019         9060/tcp                     neo-idviz
e8c3bd52ca54 127.0.0.1:5000/planninganalytics/monitor:2.0.88.3035-ltsc2019       9082/tcp                     monitor
bfa3ef090459 127.0.0.1:5000/planninganalytics/wa-proxy:1.0.1020-ltsc2019         1338/tcp                     wa-proxy
fb7bd169d5b3 127.0.0.1:5000/planninganalytics/async-service:1.0.387-ltsc2019     9666/tcp                     async-service
fef9418814e5 127.0.0.1:5000/planninganalytics/share-platform:1.0.425-ltsc2019    9110/tcp                     share-platform
afaeb58c897c 127.0.0.1:5000/planninganalytics/pa-plan-service:1.0.2023051901-ltsc2019 9080/tcp                plan-service
b84bc681967b 127.0.0.1:5000/planninganalytics/paw-ui-api:1.0.260-ltsc2019        3000/tcp                     paw-ui-api
31ac3574ea06 127.0.0.1:5000/planninganalytics/user-admin:1.0.737-ltsc2019        3333/tcp                     user-admin
5e55f009d40f 127.0.0.1:5000/planninganalytics/pa-glass:3.0.9788-ltsc2019         9080/tcp                     glass
8dc5e074265d 127.0.0.1:5000/planninganalytics/atlas-service:125-ltsc2019         9076/tcp                     atlas
5d482449ac97 127.0.0.1:5000/planninganalytics/prism-proxy:3.0.11099-ltsc2019     9090/tcp, 9100/tcp           prism-proxy
59b615950e2b 127.0.0.1:5000/planninganalytics/couchdb:234-ltsc2019               5984/tcp                     couchdb
1c0305d7d945 127.0.0.1:5000/planninganalytics/share-app:1.0.605-ltsc2019         9700/tcp                     share-app
010ac1fac8db 127.0.0.1:5000/planninganalytics/mongo:1.0.205-ltsc2019             27017/tcp                    mongo
481b8cb26b29 127.0.0.1:5000/planninganalytics/prism-app:3.0.5536-ltsc2019        9600/tcp                     prism-app
78709cb2d0d1 127.0.0.1:5000/planninganalytics/palm-service:1.0.340-ltsc2019      9085/tcp                     palm-service
545cf6eb986f 127.0.0.1:5000/planninganalytics/pa-content-service:1.0.310-ltsc2019 9191/tcp                    pa-content
61093630f145 127.0.0.1:5000/planninganalytics/redis:1.0.206-ltsc2019             6379/tcp                     redis
4e149a0ae6ea 127.0.0.1:5000/planninganalytics/pa-cdn:3.0.9788-ltsc2019           8080/tcp                     cdn
927748e22277 127.0.0.1:5000/planninganalytics/share-proxy:1.0.605-ltsc2019       9070/tcp                     share-proxy
8004ed0c651d 127.0.0.1:5000/planninganalytics/pa-predict-svc:1.0.1054-ltsc2019   9610/tcp                     pa-predict
0b084764c655 127.0.0.1:5000/planninganalytics/tm1proxy:1.0.411-ltsc2019          1339/tcp                     tm1proxy
394e473a354d 127.0.0.1:5000/planninganalytics/admintool:1.0.340-ltsc2019         8888/tcp                     admintool

The command docker images lists all installed images; normally these should be:

REPOSITORY                                       TAG                  IMAGE ID     CREATED      SIZE
127.0.0.1:5000/planninganalytics/monitor         2.0.88.3035-ltsc2019 cdd1118734aa 7 weeks ago  5.35GB
127.0.0.1:5000/planninganalytics/pa-glass        3.0.9788-ltsc2019    eda31db65292 2 months ago 5.27GB
127.0.0.1:5000/planninganalytics/pa-cdn          3.0.9788-ltsc2019    e30ce3e584eb 2 months ago 5.34GB
127.0.0.1:5000/planninganalytics/prism-proxy     3.0.11099-ltsc2019   ad0700244d5f 2 months ago 4.77GB
127.0.0.1:5000/planninganalytics/share-proxy     1.0.605-ltsc2019     7c0f57f0564c 2 months ago 4.71GB
127.0.0.1:5000/planninganalytics/share-app       1.0.605-ltsc2019     30028e59100d 2 months ago 4.72GB
127.0.0.1:5000/planninganalytics/pa-gateway      1.0.1098-ltsc2019    bc188fbdba7e 2 months ago 4.71GB
127.0.0.1:5000/planninganalytics/prism-platform  3.0.2365.2-ltsc2019  1af1c76c5ebe 2 months ago 5.66GB
127.0.0.1:5000/planninganalytics/prism-app       3.0.5536-ltsc2019    149080a8fc2d 2 months ago 4.73GB
127.0.0.1:5000/planninganalytics/palm-service    1.0.340-ltsc2019     5870fb15710e 2 months ago 4.76GB
127.0.0.1:5000/planninganalytics/tm1proxy        1.0.411-ltsc2019     301bf315ca8b 2 months ago 4.63GB
127.0.0.1:5000/planninganalytics/bss             1.0.1397-ltsc2019    16a9f3403a03 2 months ago 5.32GB
127.0.0.1:5000/planninganalytics/wa-proxy        1.0.1020-ltsc2019    d365a00fbcb0 2 months ago 4.73GB
127.0.0.1:5000/planninganalytics/paw-ui-api      1.0.260-ltsc2019     9e3f8ca98062 2 months ago 4.82GB
127.0.0.1:5000/planninganalytics/async-service   1.0.387-ltsc2019     f7e111569a61 2 months ago 4.82GB
127.0.0.1:5000/planninganalytics/pa-predict-svc  1.0.1054-ltsc2019    6dabd0bfa755 2 months ago 5.34GB
127.0.0.1:5000/planninganalytics/user-admin      1.0.737-ltsc2019     e4ea0b9f71cf 2 months ago 4.82GB
127.0.0.1:5000/planninganalytics/neo-provision   1.0.392-ltsc2019     7c04066c0fed 2 months ago 5.35GB
127.0.0.1:5000/planninganalytics/couchdb         234-ltsc2019         7832c5ecc13c 2 months ago 4.93GB
127.0.0.1:5000/planninganalytics/neo-idviz       9.0.982-ltsc2019     6847d7d6725d 2 months ago 5.55GB
127.0.0.1:5000/planninganalytics/pa-plan-service 1.0.2023051901-ltsc2019 04c8517f2f3a 2 months ago 5.28GB
127.0.0.1:5000/planninganalytics/admintool       1.0.340-ltsc2019     428093815025 2 months ago 4.72GB
127.0.0.1:5000/planninganalytics/pa-content-service 1.0.310-ltsc2019  bfa8b7dcb3f4 2 months ago 4.73GB
127.0.0.1:5000/planninganalytics/atlas-service   125-ltsc2019         f012ea094ed4 2 months ago 5.31GB
127.0.0.1:5000/planninganalytics/share-platform  1.0.425-ltsc2019     9ff87c1b417a 2 months ago 5.29GB
127.0.0.1:5000/planninganalytics/bss-init        1.0.342-ltsc2019     caf9f295cdcb 2 months ago 4.71GB
127.0.0.1:5000/planninganalytics/redis           1.0.206-ltsc2019     c5f67bbdafc5 2 months ago 4.64GB
127.0.0.1:5000/planninganalytics/mongo           1.0.205-ltsc2019     852817ecb6fe 2 months ago 4.85GB
127.0.0.1:5000/planninganalytics/ibm-java8       225-ltsc2019         7d2f0cc5bd1f 2 months ago 5.11GB
127.0.0.1:5000/planninganalytics/couchdb-init    1.0.835-ltsc2019     85873eaa9a40 2 months ago 4.62GB

 

If you are missing an image, the installation has failed. If not all containers are running, you may have issues with your anti-virus software.

Uninstall Trellix/McAfee, reboot the server, and try to install PAW again.
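To see containers that exist but are not currently running (for example ones that exited during start), the standard Docker CLI flags can help:

```powershell
# List all containers, including stopped ones
docker ps -a

# Only containers that have exited
docker ps -a --filter "status=exited"
```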

 

More Information:

https://www.ibm.com/support/pages/download-ibm-planning-analytics-local-v20-planning-analytics-workspace-release-88-fix-central

https://exploringtm1.com/how-to-install-planning-analytics-workspace-to-windows-server-2019/

https://www.ibm.com/support/pages/after-each-reboot-some-planning-analytics-workspace-containers-are-failing-start

https://circleci.com/blog/docker-and-cicd-tutorial-a-deep-dive-into-containers/

https://gist.github.com/danijeljw/a7a2553bd06742648172363ce3983a9a