Product:

Microsoft SQL Azure server

Issue:

Need to change a column from nvarchar(200) to nvarchar(20). How do you change the nvarchar length of a column in a table?

Solution:

Check the data length in the column (nvarchar is Unicode, which uses 2 bytes per character):

SELECT DATALENGTH(Column_Name) AS FIELDSIZE, Column_Name
FROM Table_Name
ORDER BY FIELDSIZE DESC;
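
Before truncating anything, you may want to list the rows that would actually be cut; a minimal sketch, using the same placeholder table and column names as above:

SELECT Column_Name, LEN(Column_Name) AS CharLength
FROM Table_Name
WHERE LEN(Column_Name) > 20
ORDER BY CharLength DESC;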

Truncate the existing values and change the column with:

UPDATE Table_Name SET Column_Name = LEFT(COALESCE(Column_Name, ''), 20);
ALTER TABLE Table_Name ALTER COLUMN Column_Name nvarchar(20) NOT NULL;

If you have constraints or indexes on the column, you need to drop them first: DROP INDEX index_name ON table_name;

https://www.w3schools.com/sql/sql_ref_drop_constraint.asp
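
A minimal sketch for finding the indexes that reference the column; the DROP statements at the end use hypothetical object names, so look up the real names first:

SELECT i.name AS index_name
FROM sys.indexes i
INNER JOIN sys.index_columns ic
    ON i.object_id = ic.object_id AND i.index_id = ic.index_id
INNER JOIN sys.columns c
    ON ic.object_id = c.object_id AND ic.column_id = c.column_id
WHERE i.object_id = OBJECT_ID('dbo.Table_Name')
  AND c.name = 'Column_Name';

-- hypothetical names: drop before the ALTER, recreate afterwards
DROP INDEX IX_Table_Name_Column_Name ON dbo.Table_Name;
ALTER TABLE dbo.Table_Name DROP CONSTRAINT DF_Table_Name_Column_Name;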

Alternatively, you can rename the table:

EXEC sp_rename 'old_table_name', 'new_table_name';

Create a new table with the old name. In SSMS, right-click the table and from the menu select “Script Table as” – CREATE To – New Query Editor Window. Adjust the code to use the new nvarchar length, example below:

CREATE TABLE [dbo].[table2](
[index] [int] NOT NULL,
[Name] [nvarchar](20) NULL,
[Country] [nvarchar](50) NULL,
[Employees] [int] NULL
) ON [PRIMARY]

 

Copy the data over to the new table:

INSERT INTO table2 (column1, column2, column3, ...)
SELECT column1, column2, column3, ...
FROM table1
WHERE condition; 

For the table above the example is as below; the LEFT(COALESCE([Name], ''), 20) ensures that we only copy the first 20 characters.

INSERT INTO table2 ([index], [Name], [Country], [Employees])
SELECT [index], LEFT(COALESCE([Name], ''), 20), [Country], [Employees]
FROM table1;

 

If you do not use the LEFT function, you may get an error like “String or binary data would be truncated”.

More Information:

https://javarevisited.blogspot.com/2016/03/how-to-increase-length-of-existing-VARCHAR-column-in-SQL-Server.html 

https://www.sqlservertutorial.net/sql-server-basics/sql-server-nvarchar/

https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-rename-transact-sql?view=sql-server-ver16 

How to rename tables in SQL Server with the sp_rename command

https://www.w3schools.com/sql/sql_insert_into_select.asp

https://www.tutorialrepublic.com/sql-tutorial/sql-cloning-tables.php

https://www.w3schools.com/sql/sql_ref_insert_into_select.asp

https://www.w3schools.com/sql/func_sqlserver_left.asp

Because index is a reserved keyword in SQL, you need to put it inside [ ] so SQL Server understands it is a column name.
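
A quick example with the table2 created above:

-- [index] must be inside brackets because index is a reserved keyword
SELECT [index], [Name] FROM [dbo].[table2];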

Product:
Microsoft SQL Azure

Issue:
System is slow, and it looks like TEMPDB is working hard. What can we do?

Solution:

Check what is going on in SQL TEMPDB with this query:

SELECT [Source] = 'database_transactions',
[session_id] = ST.session_id,
[transaction_id] = ST.transaction_id,
[login_name] = S.login_name,
[session_database_id] = S.database_id,
[program_name] = S.program_name,
[host_name] = S.host_name,
[transaction_database_id] = DT.database_id,
[database_name] = CASE
WHEN D.name IS NULL AND DT.database_id = 2 THEN 'TEMPDB'
ELSE D.name
END,

[log_reuse_wait_desc] = D.log_reuse_wait_desc,
[database_transaction_log_used_Kb] = CONVERT(numeric(18,2), DT.database_transaction_log_bytes_used / 1024.0 ),
[database_transaction_begin_time] = DT.database_transaction_begin_time,
[transaction_type_desc] = CASE DT.database_transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END,

[transaction_state_desc] = CASE DT.database_transaction_state
WHEN 1 THEN 'The transaction has not been initialized.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has been initialized but has not generated any log records.'
WHEN 4 THEN 'The transaction has generated log records.'
WHEN 5 THEN 'The transaction has been prepared.'
WHEN 10 THEN 'The transaction has been committed.'
WHEN 11 THEN 'The transaction has been rolled back.'
WHEN 12 THEN 'The transaction is being committed. (The log record is being generated, but has not been materialized or persisted.)'
END,

[active_transaction_type_desc] = CASE AT.transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END,

[active_transaction_state_desc] = CASE AT.transaction_state
WHEN 0 THEN 'The transaction has not been completely initialized yet.'
WHEN 1 THEN 'The transaction has been initialized but has not started.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has ended. This is used for read-only transactions.'
WHEN 4 THEN 'The commit process has been initiated on the distributed transaction.'
WHEN 5 THEN 'The transaction is in a prepared state and waiting resolution.'
WHEN 6 THEN 'The transaction has been committed.'
WHEN 7 THEN 'The transaction is being rolled back.'
WHEN 8 THEN 'The transaction has been rolled back.'
END

FROM sys.dm_tran_database_transactions DT
INNER JOIN sys.dm_tran_session_transactions ST ON DT.transaction_id = ST.transaction_id
INNER JOIN sys.dm_tran_active_transactions AT ON DT.transaction_id = AT.transaction_id
INNER JOIN sys.dm_exec_sessions S ON ST.session_id = S.session_id
LEFT JOIN sys.databases D ON DT.database_id = D.database_id
WHERE DT.database_id = 2 -- tempdb
ORDER BY ST.session_id, DT.database_id;
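
If the query shows a runaway session holding an old open transaction, you can as a last resort end that session; a minimal sketch, assuming the query returned session_id 51:

-- replace 51 with the session_id found by the query above
KILL 51;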

 

 

More Information:

https://techcommunity.microsoft.com/t5/azure-database-support-blog/azure-sql-db-and-tempdb-usage-tracking/ba-p/1573220 

https://techcommunity.microsoft.com/t5/azure-database-support-blog/resolve-tempdb-related-errors-in-azure-sql-database/ba-p/3597944 

 

Product:
Microsoft Azure SQL Database

Issue:
When running a stored procedure (SP), you get an error after a long time:

SQL error number: 40544. Error Message: The database 'tempdb' has reached its size quota.

Suggested solution:

If possible, increase your DTU tier; this will reset tempdb and free space so you can start over, as in the sketch below.
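
The tier can be changed in the Azure portal, or with T-SQL while connected to the master database; a minimal sketch, where 'mydb' and 'S3' are example values:

-- example values: replace with your database name and target service objective
ALTER DATABASE [mydb] MODIFY (SERVICE_OBJECTIVE = 'S3');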

In SSMS, check what SQL Azure tier you are on with:

SELECT Edition = DATABASEPROPERTYEX('databasename', 'EDITION'),
       ServiceObjective = DATABASEPROPERTYEX('databasename', 'ServiceObjective'),
       MaxSizeInBytes = DATABASEPROPERTYEX('databasename', 'MaxSizeInBytes');

 

In SSMS, enter the following to see the current file size of TEMPDB:

SELECT [Source] = 'database_files',
[TEMPDB_max_size_MB] = SUM(max_size) * 8 / 1024.0,
[TEMPDB_current_size_MB] = SUM(size) * 8 / 1024.0,
[FileCount] = COUNT(FILE_ID)
FROM tempdb.sys.database_files
WHERE type = 0; -- ROWS

Above, we have used up the 13.9 GB TEMPDB file size limit that applies to the lower tiers. With the SQL query below, you can see how much of the space is used inside:

SELECT 
(SUM(unallocated_extent_page_count)*1.0/128) AS [Free space(MB)]
,(SUM(version_store_reserved_page_count)*1.0/128) AS [Used Space by VersionStore(MB)]
,(SUM(internal_object_reserved_page_count)*1.0/128) AS [Used Space by InternalObjects(MB)]
,(SUM(user_object_reserved_page_count)*1.0/128) AS [Used Space by UserObjects(MB)]
FROM tempdb.sys.dm_db_file_space_usage;

 

Service-level objective | Maximum tempdb data file size (GB) | Number of tempdb data files | Maximum tempdb data size (GB)
Basic | 13.9 | 1 | 13.9
S0 | 13.9 | 1 | 13.9
S1 | 13.9 | 1 | 13.9
S2 | 13.9 | 1 | 13.9
S3 | 32 | 1 | 32

 

The recommended solution is to find what is causing the large use of TEMPDB space, by checking your query plans in SSMS.

Then tighten your table column definitions to only what you need; use nvarchar(50) instead of nvarchar(max), etc.

Check your stored procedures' queries, and add an index on the columns that you think will narrow the selection the fastest, as in the sketch below.
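
A minimal sketch, with hypothetical table and column names:

-- hypothetical names: index the column you filter on, include columns you select
CREATE NONCLUSTERED INDEX IX_SalesOrder_CustomerId
    ON dbo.SalesOrder (CustomerId)
    INCLUDE (OrderDate, Amount);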

DTU in Azure is a combination of CPU usage per second and read/write I/O per second to disk. When you have used up your quota, there is a limit on how many bytes you can write to disk per second, so your process will still succeed, but it will take much longer, as only a small amount of data is processed each second.

A database transaction unit (DTU) represents a blended measure of CPU, memory, reads, and writes. Service tiers in the DTU-based purchasing model are differentiated by a range of compute sizes with a fixed amount of included storage, fixed retention period for backups, and fixed price.

https://blog.atwork.at/post/Azure-Subscription-and-Service-Limits 

https://www.spotlightcloud.io/blog/what-is-dtu-in-azure-sql-database-and-how-much-do-we-need

More information:

https://knowledge-base.havit.eu/2018/02/19/azure-sql-the-database-tempdb-has-reached-its-size-quota-partition-or-delete-data-drop-indexes-or-consult-the-documentation-for-possible-resolutions-microsoft-sql-server-error-40544/ 

https://techcommunity.microsoft.com/t5/azure-database-support-blog/resolve-tempdb-related-errors-in-azure-sql-database/ba-p/3597944 

https://sqlcoffee.com/Azure_0013.htm 

https://www.brentozar.com/archive/2018/02/memory-grants-sql-servers-public-toilet/

Top five considerations for SQL Server index design

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql#tempdb-sizes 

https://www.brentozar.com/archive/2023/09/oops-i-lost-my-indexes-in-azure-sql-db/ 

https://www.sqlshack.com/sql-index-overview-and-strategy/ 

https://learn.microsoft.com/en-us/sql/relational-databases/sql-server-index-design-guide?view=sql-server-ver16  

https://datasimantics.com/2018/08/24/sql-servers-nvarcharmax-and-how-to-wield-it/ 

https://www.ibm.com/support/pages/only-first-1024-characters-nvarcharmax-column-are-presented-report-based-dqm-package

https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tiers-dtu?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql 

To see the sessions that use TEMPDB, run the same 'database_transactions' query shown in the solution above.


Product:
Microsoft SQL Azure

Issue:
How many rows are there in a database table?

Solution:

Enter in SSMS:

EXEC sp_spaceused 'dbo.tablename';

This will give you both the number of rows and the space used by the table.
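
If you only need the row count, a lighter alternative is to sum the partition statistics DMV, which avoids scanning the table; a minimal sketch, using the same placeholder table name:

SELECT SUM(p.row_count) AS [RowCount]
FROM sys.dm_db_partition_stats p
WHERE p.object_id = OBJECT_ID('dbo.tablename')
  AND p.index_id IN (0, 1); -- heap or clustered index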

 

More Information:

https://www.brentozar.com/archive/2014/02/count-number-rows-table-sql-server/

https://www.sqlshack.com/capture-databases-usage-stats-using-sp_spaceused-powershell/ 

Product:

Microsoft Azure Blob storage

Problem:

Put a file on blob storage with Postman. https://www.postman.com/product/rest-client/

You get the error: 404 The specified blob does not exist.

 

Solution:

Change the method from GET to PUT in Postman, to place a file on the blob storage.

Because the file you are asking for does not exist (you have not put it there yet), you get a 404 error.

To put a file on the blob storage you need to add headers like:

x-ms-blob-type  = BlockBlob

x-ms-date = 2022-11-02

 

• Error 400 "Value for one of the query parameters specified in the request URI is invalid" can mean that you are missing the container name and only have the base URL.

• Error 400 "The requested URI does not represent any resource on the server" can mean that the URL only names the container and not the file (this applies both when putting and when getting a file).

• Error 403 "This request is not authorized to perform this operation" can mean that you are not logged into the VPN that gives you access to the Azure storage.

• Error 400 "An HTTP header that's mandatory for this request is not specified" can mean that you are missing the x-ms-blob-type header.

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-access-azure-active-directory

https://learn.microsoft.com/en-us/azure/data-explorer/kusto/api/connection-strings/storage-authentication-methods

Product:

Microsoft PowerShell
Planning Analytics 2.0.9.17
Microsoft Windows 2019 server

Issue:
I have backup zip files in a folder that I want to move to a different drive, to save space on the first drive.

 

Solution:

Create a d:\script folder, and in Notepad++ create a copyoldfile.ps1 file with this content:

# powershell script to copy old files
# older than a month from today's date

# set common start values
$abortFlag = 0

# Get an input parameter https://ss64.com/ps/syntax-args.html
# get the from and to folders as parameters from the call to the script
[String]$FromFolder = $Args[0]
[String]$ToFolder = $Args[1]

# debug lines 
Write-Host [$FromFolder] 
Write-Host [$ToFolder] 

# check if args are empty and then stop the script
if ($FromFolder -eq '') {
    Write-Host "FromFolder is missing" -ForegroundColor Red
    $abortFlag = 1
}

if ($ToFolder -eq '') {
    Write-Host "ToFolder is missing" -ForegroundColor Red
    $abortFlag = 1
}

if ($abortFlag -eq 1) {
    Write-Host "The abort flag has been tripped. Exit script." -ForegroundColor Red
    exit 1
}

# for each file in the folder, check the date and move files older than 30 days
Get-ChildItem -File -Path $FromFolder -Filter *.zip |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Move-Item -Destination $ToFolder

The above script takes two values: the from folder and the target folder. It then checks the last write time of each file in the folder and moves the files that are older than 30 days. It only moves files that end with .zip.

When testing a PowerShell script, you should add -Verbose -WhatIf at the end of the execution line.

The script checks that the values entered are not empty (null), and stops the script if they are. The output is not shown if you call the ps1 script from another program. To call this script from a TM1 process, enter in the prolog something like this:
DatasourceASCIIQuoteCharacter= Char(39);

SC_QUOTE_COMMAND = Char(34);
sExec = 'Powershell -ExecutionPolicy ByPass -file ' | SC_QUOTE_COMMAND | 'd:\script\copyoldfile.ps1' | SC_QUOTE_COMMAND | ' ' ;
sSourceDir = SC_QUOTE_COMMAND | 'D:\TM1 Data\Back Up' | SC_QUOTE_COMMAND | ' ' ;
sTargetDir = SC_QUOTE_COMMAND | 'd:\arkiv' | SC_QUOTE_COMMAND | ' ' ;

sCommand = sExec | ' ' | sSourceDir | ' ' | sTargetDir ;

## remove to debug the string sent to cmd
## ASCIIOUTPUT ( 'd:\temp\result.txt' , sCommand);

EXECUTECOMMAND(sCommand,0);

Because PowerShell only accepts " (double quotes) around its file paths, we have to change the TM1 QuoteCharacter to ' (single quote). Change the folders in the TI process to match your backup folders. Do your development and testing on your TM1 development servers first.

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=tv-turbointegrator-local-variables

 

More Information:

https://www.techtarget.com/searchwindowsserver/tutorial/PowerShell-Move-Item-examples-for-file-folder-management 

https://linuxhint.com/loop-through-files-in-a-directory-using-powershell/ 

https://theitbros.com/powershell-script-for-loop-through-files-and-folders/ 

https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-null?view=powershell-7.3 

https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-if?view=powershell-7.3 

https://adamtheautomator.com/powershell-if-statement/ 

https://ss64.com/ps/syntax-args.html 

https://blog.ironmansoftware.com/daily-powershell/assign-powershell-variables/ 

https://linuxhint.com/get-current-date-powershell/

https://www.red-gate.com/simple-talk/sysadmin/powershell/how-to-use-parameters-in-powershell/ 

 

Product:

Microsoft PowerShell

Microsoft Windows 2019 Server

Issue:

How do you get the month value from the datetime function in PowerShell?

Solution:

Enter a ps1 script like this:

# get today's day number: Get-Date -UFormat "%d"
# set month value in a variable: [String]$Manaden = (Get-Date -UFormat "%m")

[String]$Dagen = (Get-Date -UFormat "%d")
[String]$Manaden = (Get-Date -UFormat "%m")

# debug lines 
Write-Host [$Manaden]

 

UFormat specifiers format the DateTime result of Get-Date in PowerShell. They return a formatted string, not a date object.

 

More information:

Get-Date – How to get and use Dates in PowerShell

https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/set-variable?view=powershell-7.3 

https://devblogs.microsoft.com/scripting/use-powershell-to-work-with-any-ini-file/

https://ss64.com/ps/get-date.html

Product:

Microsoft SQL server 2016 database

Issue:

How do you let the user select which rows should be shown in a view?

Solution:

This is not possible directly in a SQL view.

https://www.mssqltips.com/sqlservertip/5147/limitations-when-working-with-sql-server-views/

A workaround is to have a table that the user updates with the value they want to select on, and then join on that table in the view and filter on that particular value.

If you want the ID column to match a PERIOD value, create a parameter table first:

CREATE TABLE [admin].[Parameter](
[key_Parameter] [int] IDENTITY(1,1) NOT NULL,
[Period] [nvarchar](200) NULL
) ON [PRIMARY]

Add the selection value into the table.

Then create a view – you have to update the view code below to reflect your table columns:

CREATE VIEW [DM].[Konto-Period]
AS
SELECT
a.Kontonummer,
a.ID
FROM DM.kontotabell a
JOIN [admin].[Parameter] b ON a.[ID] = b.[Period]
-- WHERE a.[ID] = b.[Period]

When run, this will check against the value in the parameter table.

Then you have to figure out a way for the end user to update the parameter table; one option is a small stored procedure like the sketch below.
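
A minimal sketch of such an update step, assuming the parameter table holds a single row (the procedure name SetPeriod is hypothetical):

CREATE PROCEDURE [admin].[SetPeriod]
    @Period nvarchar(200)
AS
BEGIN
    -- assumes the table holds one row; overwrite it with the new value
    UPDATE [admin].[Parameter] SET [Period] = @Period;
END;

The end user can then run EXEC [admin].[SetPeriod] '202312'; before querying the view.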

More Information:

https://www.sqltutorial.org/sql-date-functions/sql-convert-string-to-date-functions/ 

https://learn.microsoft.com/en-us/sql/relational-databases/stored-procedures/specify-parameters?view=sql-server-ver16 

https://learn.microsoft.com/en-us/sql/relational-databases/tables/use-table-valued-parameters-database-engine?view=sql-server-ver16

Product:

Azure Data Factory

Issue:

How do you connect to a SQL private endpoint with a managed identity?

Solution:

In your Azure subscription, ensure that both ADF and SQL are in the same subnet; then there should not be any firewalls that need to be opened between them.

On your Azure SQL server, set Networking – Public access to Disabled. Ensure you have created private endpoints for your SQL Azure resource.

Set up a managed identity for your Azure Data Factory. This will create an identity that can be given access to other Azure resources.

    • In the Azure portal, go to your Azure Data Factory resource.
    • Under the “Settings” section, select “Managed identity”.
    • Enable the system-assigned managed identity for your Data Factory.

Grant the Managed Identity access to the SQL Azure Database:

    • Go to your SQL Azure Database resource.
    • Under the “Settings” section, select “Access control (IAM)”.
    • Click on “+ Add” and add a role assignment.
    • Select the appropriate role (e.g., “Contributor” or “SQL Server Contributor”) and search for the name of your Azure Data Factory.
    • Select the Data Factory name from the search results and click “Save”.

You can also grant the ADF access inside the SQL server with these commands in SSMS:

-- run in master database

CREATE LOGIN [adf-name] FROM EXTERNAL PROVIDER

CREATE USER [adf-name] FROM LOGIN [adf-name] WITH DEFAULT_SCHEMA=[dbo]

-- run in sql database

CREATE USER [adf-name] FROM LOGIN [adf-name]

ALTER ROLE [db_owner] ADD MEMBER [adf-name]

 

Configure the Linked Service in Azure Data Factory:

    • Open your Azure Data Factory resource in the Azure portal.
    • Click on launch Studio
    • Go to the “manage” section.
    • Click on the “Linked service” tab and select “New”.
    • Choose the appropriate SQL Server connector (e.g., “Azure SQL Database”).
    • Provide the required connection details such as server name, database name, authentication type like:
      integrationRuntime2 (Managed Virtual Network)
      connection string
      Account selection method – Enter manually.
      Enter SQL server name (Fully qualified domain name) like: sql-name.database.windows.net
      Enter database name
      For authentication type, under “Managed private endpoint”, select System Assigned Managed Identity; then all values should come up automatically.
    • Click on “Test Connection” to validate the connection.

 

Use the Linked Service in Azure Data Factory:

      • Now you can use the configured Linked Service to connect to the SQL Azure Database private endpoint in your datasets, which are used by the integration pipelines within Azure Data Factory.

By following these steps, you’ll be able to establish a connection to a SQL Azure Database private endpoint from Azure Data Factory using a managed identity.

More information:

https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql 

https://techcommunity.microsoft.com/t5/azure-sql-blog/private-endpoints-for-azure-sql-managed-instance/ba-p/3782015 

https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

Product:

Microsoft Azure Storage Account

Issue:

When doing the Microsoft learn section:

https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/9-initialize-the-storage-account-model?pivots=csharp

Try to list content in a blob, you get a message:

There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.

You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.

In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.

 

Solution:

The photos blob container has been created, as you can check directly in the storage account in your subscription. You can now verify it with the command:

az storage container list \
--account-name <name>

Replace <name> with your unique storage account name.

The result returned is verbose, but if you find the lines below in it, the command succeeded. Keep in mind that you need to enter the CLI command on one line.

 },
"immutableStorageWithVersioningEnabled": false,
"metadata": null,
"name": "photos",
"properties": {

As this message is a warning and not an error, you can add --only-show-errors to suppress warnings, like this:

az storage container list --only-show-errors --account-key  <your key> --account-name <your account>

The --auth-mode key option will be deprecated in the future; try to use another method.

As this warning message is new, it may break your scripts if they do not expect the message.

You can get more information by adding --debug to the command, like:

az storage container list --debug --account-name  <name>

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-data-operations-cli 

 

When you don't specify the authentication type, the CLI will try to get the access key of the storage account. This requires the Microsoft.Storage/storageAccounts/listkeys/action permission. If you have the Contributor role on the storage account, you have the required permission.

--auth-mode login means it will use AAD auth to connect to the storage. You can use one of the built-in roles to access the storage (see the documentation):

  • Storage Table Data Contributor
  • Storage Table Data Reader

When using AAD Auth, you could also disable access key authentication.

There is a good article related to RBAC management and the data plane model: Assign an Azure role for access to blob data.