Product:
Microsoft Azure SQL Database

Issue:
When running a stored procedure (SP), you get an error after a long time:

Sql error number: 40544. Error Message: The database 'tempdb' has reached its size quota.

Suggested solution:

If possible, increase your DTU level. This will reset tempdb and free up space so you can start over.

In SSMS, check which Azure SQL service tier you are on:

SELECT Edition = DATABASEPROPERTYEX('databasename', 'EDITION'),
       ServiceObjective = DATABASEPROPERTYEX('databasename', 'ServiceObjective'),
       MaxSizeInBytes = DATABASEPROPERTYEX('databasename', 'MaxSizeInBytes');

 

In SSMS, run the following to see the current file size of tempdb:

SELECT [Source] = 'database_files', 
[TEMPDB_max_size_MB] = SUM(max_size) * 8 / 1024.0, 
[TEMPDB_current_size_MB] = SUM(size) * 8 / 1024.0, 
[FileCount] = COUNT(FILE_ID)
FROM tempdb.sys.database_files
WHERE type = 0 --ROWS

In this case we had used up the 13.9 GB tempdb file size limit that exists in the lower tiers. With the SQL query below you can see how much of that space is used inside tempdb:

SELECT 
(SUM(unallocated_extent_page_count)*1.0/128) AS [Free space(MB)]
,(SUM(version_store_reserved_page_count)*1.0/128) AS [Used Space by VersionStore(MB)]
,(SUM(internal_object_reserved_page_count)*1.0/128) AS [Used Space by InternalObjects(MB)]
,(SUM(user_object_reserved_page_count)*1.0/128) AS [Used Space by UserObjects(MB)]
FROM tempdb.sys.dm_db_file_space_usage;
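
If tempdb is filling up, it also helps to see which sessions are allocating the space. A minimal sketch, using the documented DMV sys.dm_db_session_space_usage (128 pages of 8 KB = 1 MB):

-- Which sessions hold the most tempdb space right now
SELECT session_id,
       user_objects_alloc_page_count / 128.0 AS user_objects_MB,
       internal_objects_alloc_page_count / 128.0 AS internal_objects_MB
FROM tempdb.sys.dm_db_session_space_usage
ORDER BY internal_objects_alloc_page_count DESC;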

 

Service-level objective | Maximum tempdb data file size (GB) | Number of tempdb data files | Maximum tempdb data size (GB)
Basic                   | 13.9                               | 1                           | 13.9
S0                      | 13.9                               | 1                           | 13.9
S1                      | 13.9                               | 1                           | 13.9
S2                      | 13.9                               | 1                           | 13.9
S3                      | 32                                 | 1                           | 32

 

The recommended solution is to find out what is causing the heavy use of tempdb space, by checking your query plans in SSMS.

Then tighten your table column definitions to only what you need, e.g. use nvarchar(50) instead of nvarchar(max).

Check the queries in your stored procedures, and add an index on the columns that you think will narrow the selection the fastest, as in the sketch below.
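
A minimal sketch, assuming a hypothetical dbo.Orders table that a stored procedure filters on CustomerId (table and column names are placeholders):

-- Hypothetical index to support a frequent WHERE CustomerId = ... filter
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
INCLUDE (OrderDate, Amount);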

A DTU in Azure is a combination of CPU usage per second and read/write I/O per second to disk. When you have used up your quota, there is a limit on how many bytes you can write to disk per second, so your process will still succeed, but it will take much longer since only a small amount of data is processed each second.

A database transaction unit (DTU) represents a blended measure of CPU, memory, reads, and writes. Service tiers in the DTU-based purchasing model are differentiated by a range of compute sizes with a fixed amount of included storage, fixed retention period for backups, and fixed price.

https://blog.atwork.at/post/Azure-Subscription-and-Service-Limits 

https://www.spotlightcloud.io/blog/what-is-dtu-in-azure-sql-database-and-how-much-do-we-need

More information:

https://knowledge-base.havit.eu/2018/02/19/azure-sql-the-database-tempdb-has-reached-its-size-quota-partition-or-delete-data-drop-indexes-or-consult-the-documentation-for-possible-resolutions-microsoft-sql-server-error-40544/ 

https://techcommunity.microsoft.com/t5/azure-database-support-blog/resolve-tempdb-related-errors-in-azure-sql-database/ba-p/3597944 

https://sqlcoffee.com/Azure_0013.htm 

https://www.brentozar.com/archive/2018/02/memory-grants-sql-servers-public-toilet/

Top five considerations for SQL Server index design

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql#tempdb-sizes 

https://www.brentozar.com/archive/2023/09/oops-i-lost-my-indexes-in-azure-sql-db/ 

https://www.sqlshack.com/sql-index-overview-and-strategy/ 

https://learn.microsoft.com/en-us/sql/relational-databases/sql-server-index-design-guide?view=sql-server-ver16  

https://datasimantics.com/2018/08/24/sql-servers-nvarcharmax-and-how-to-wield-it/ 

https://www.ibm.com/support/pages/only-first-1024-characters-nvarcharmax-column-are-presented-report-based-dqm-package

https://learn.microsoft.com/en-us/azure/azure-sql/database/service-tiers-dtu?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/resource-limits-dtu-single-databases?view=azuresql 

To see sessions that use TEMPDB:

-- Sessions with open transactions in tempdb
SELECT [Source] = 'database_transactions', 
[session_id] = ST.session_id, 
[transaction_id] = ST.transaction_id, 
[login_name] = S.login_name, 
[session_database_id] = S.database_id, 
[program_name] = S.program_name, 
[host_name] = S.host_name, 
[database_id] = DT.database_id, 
[database_name] = CASE
WHEN D.name IS NULL AND DT.database_id = 2 THEN 'TEMPDB'
ELSE D.name
END, 
[log_reuse_wait_desc] = D.log_reuse_wait_desc, 
[database_transaction_log_used_Kb] = CONVERT(numeric(18,2), DT.database_transaction_log_bytes_used / 1024.0 ), 
[database_transaction_begin_time] = DT.database_transaction_begin_time, 
[transaction_type_desc] = CASE DT.database_transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END, 
[transaction_state_desc] = CASE DT.database_transaction_state
WHEN 1 THEN 'The transaction has not been initialized.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has been initialized but has not generated any log records.'
WHEN 4 THEN 'The transaction has generated log records.'
WHEN 5 THEN 'The transaction has been prepared.'
WHEN 10 THEN 'The transaction has been committed.'
WHEN 11 THEN 'The transaction has been rolled back.'
WHEN 12 THEN 'The transaction is being committed. (The log record is being generated, but has not been materialized or persisted.)'
END,
[active_transaction_type_desc] = CASE AT.transaction_type
WHEN 1 THEN 'Read/write transaction'
WHEN 2 THEN 'Read-only transaction'
WHEN 3 THEN 'System transaction'
WHEN 4 THEN 'Distributed transaction'
END, 
[active_transaction_state_desc] = CASE AT.transaction_state
WHEN 0 THEN 'The transaction has not been completely initialized yet.'
WHEN 1 THEN 'The transaction has been initialized but has not started.'
WHEN 2 THEN 'The transaction is active'
WHEN 3 THEN 'The transaction has ended. This is used for read-only transactions.'
WHEN 4 THEN 'The commit process has been initiated on the distributed transaction.'
WHEN 5 THEN 'The transaction is in a prepared state and waiting resolution.'
WHEN 6 THEN 'The transaction has been committed.'
WHEN 7 THEN 'The transaction is being rolled back.'
WHEN 8 THEN 'The transaction has been rolled back.'
END
FROM sys.dm_tran_database_transactions DT
INNER JOIN sys.dm_tran_session_transactions ST ON DT.transaction_id = ST.transaction_id
INNER JOIN sys.dm_tran_active_transactions AT ON DT.transaction_id = AT.transaction_id
INNER JOIN sys.dm_exec_sessions S ON ST.session_id = S.session_id
LEFT JOIN sys.databases D ON DT.database_id = D.database_id
WHERE DT.database_id = 2 -- tempdb
ORDER BY ST.session_id, DT.database_id;



 

Product:
Microsoft SQL Azure

Issue:
How many rows are there in a database table?

Solution:

Enter in SSMS:

EXEC sp_spaceused 'dbo.tablename';

This will give you both the number of rows and the space used by the table.
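
If you want the row count for every table at once, a small sketch using the standard catalog views (no table scan needed):

-- Approximate row counts for all tables, from partition statistics
SELECT s.name AS schema_name, t.name AS table_name, SUM(p.row_count) AS row_count
FROM sys.dm_db_partition_stats AS p
JOIN sys.tables AS t ON p.object_id = t.object_id
JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE p.index_id IN (0, 1) -- heap or clustered index only, to avoid double counting
GROUP BY s.name, t.name
ORDER BY row_count DESC;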

 

More Information:

https://www.brentozar.com/archive/2014/02/count-number-rows-table-sql-server/

https://www.sqlshack.com/capture-databases-usage-stats-using-sp_spaceused-powershell/ 

Product:

Microsoft Azure Blob storage

Problem:

Put a file on blob storage with POSTMAN.  https://www.postman.com/product/rest-client/

You get the error: 404 The specified blob does not exist.

 

Solution:

Change the method from GET to PUT in the Postman program to place a file on the blob storage.

Because the file you are asking for does not exist (you have not put it there yet), you get a 404 error.

To put a file on the blob storage you need to add headers like:

x-ms-blob-type  = BlockBlob

x-ms-date = 2022-11-02
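
The same PUT can also be scripted. A minimal PowerShell sketch, assuming a hypothetical storage account, container, and SAS token in the URL:

# Upload a local file as a block blob via REST (hypothetical URL and SAS token)
$sasUrl = "https://mystorage.blob.core.windows.net/mycontainer/test.txt?sv=2022-11-02&sig=..."
Invoke-RestMethod -Method Put -Uri $sasUrl -InFile "C:\temp\test.txt" -Headers @{
    "x-ms-blob-type" = "BlockBlob"
}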

 

Error like 400 Value for one of the query parameters specified in the request URI is invalid – can mean that you are missing the container name and only have the base URL.

Error like 400 The requested URI does not represent any resource on the server – can mean that the URL only contains the container name and not the file name (this applies both when putting and when getting a file).

Error like 403 This request is not authorized to perform this operation – can mean that you are not logged in to the VPN that gives you access to the Azure storage.

Error like 400 An HTTP header that’s mandatory for this request is not specified – can mean that you are missing the x-ms-blob-type header.

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-access-azure-active-directory

https://learn.microsoft.com/en-us/azure/data-explorer/kusto/api/connection-strings/storage-authentication-methods

Product:

Microsoft PowerShell
Planning Analytics 2.0.9.17
Microsoft Windows Server 2019

Issue:
I have backup zip files in a folder that I want to move to a different drive, to save space on the first drive.

 

Solution:

Create a d:\script folder, and in Notepad++ create a copyoldfile.ps1 file with this content:

# PowerShell script to move files
# older than a month from today's date

# set common start values
$abortFlag = 0

# Get input parameters https://ss64.com/ps/syntax-args.html
# the from folder and the to folder are passed as arguments in the call to the script
[String]$FromFolder = $Args[0]
[String]$ToFolder = $Args[1]

# debug lines
Write-Host [$FromFolder]
Write-Host [$ToFolder]

# check if the args are empty and in that case stop the script
if ( $FromFolder -eq '' ) {
    Write-Host "FromFolder is missing" -ForegroundColor Red
    $abortFlag = 1
}

if ( $ToFolder -eq '' ) {
    Write-Host "ToFolder is missing" -ForegroundColor Red
    $abortFlag = 1
}

if ($abortFlag -eq 1) {
    Write-Host "The abort flag has been tripped. Exit script." -ForegroundColor Red
    break
}

# for each .zip file in the folder, check the date and move files older than 30 days
Get-ChildItem -File -Path $FromFolder -Filter *.zip |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Move-Item -Destination $ToFolder
The above script takes two values: the source folder and the target folder. It then checks the last write time of each file in the folder and moves the files that are older than 30 days. It will only move files that end with .zip.
When testing a PowerShell script, you should add -Verbose -WhatIf at the end of the execution line.
The script checks that the values entered are not empty (null), and stops the script if they are. The output is not shown if you call the ps1 script from another program. To call this script from a TM1 process, enter this in the prolog:
DatasourceASCIIQuoteCharacter= Char(39);

SC_QUOTE_COMMAND = Char(34);
sExec = 'Powershell -ExecutionPolicy ByPass -file ' | SC_QUOTE_COMMAND | 'd:\script\copyoldfile.ps1' | SC_QUOTE_COMMAND | ' ' ;
sSourceDir = SC_QUOTE_COMMAND | 'D:\TM1 Data\Back Up' | SC_QUOTE_COMMAND | ' ' ;
sTargetDir = SC_QUOTE_COMMAND | 'd:\arkiv' | SC_QUOTE_COMMAND | ' ' ;

sCommand = sExec | ' ' | sSourceDir | ' ' | sTargetDir ;

## remove to debug the string sent to cmd
## ASCIIOUTPUT ( 'd:\temp\result.txt' , sCommand);

EXECUTECOMMAND(sCommand,0);

Because PowerShell only accepts " around its file paths, we have to change the QuoteCharacter to be '. Change the folders in the TI process to match your backup folders. Do your development and testing on your TM1 development servers first.
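For reference, the string the TI code above builds and sends to EXECUTECOMMAND looks like this (one line):

Powershell -ExecutionPolicy ByPass -file "d:\script\copyoldfile.ps1" "D:\TM1 Data\Back Up" "d:\arkiv"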

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=tv-turbointegrator-local-variables

 

More Information:

https://www.techtarget.com/searchwindowsserver/tutorial/PowerShell-Move-Item-examples-for-file-folder-management 

https://linuxhint.com/loop-through-files-in-a-directory-using-powershell/ 

https://theitbros.com/powershell-script-for-loop-through-files-and-folders/ 

https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-null?view=powershell-7.3 

https://learn.microsoft.com/en-us/powershell/scripting/learn/deep-dives/everything-about-if?view=powershell-7.3 

https://adamtheautomator.com/powershell-if-statement/ 

https://ss64.com/ps/syntax-args.html 

https://blog.ironmansoftware.com/daily-powershell/assign-powershell-variables/ 

https://linuxhint.com/get-current-date-powershell/

https://www.red-gate.com/simple-talk/sysadmin/powershell/how-to-use-parameters-in-powershell/ 

 

Product:

Microsoft PowerShell

Microsoft Windows Server 2019

Issue:

How do you get the month value from the datetime function in PowerShell?

Solution:

Enter a ps1 script like this:

# get today's day number: Get-Date -UFormat "%d"
# set month value in a variable: [String]$Manaden = (Get-Date -UFormat "%m")

[String]$Dagen = (Get-Date -UFormat "%d")
[String]$Manaden = (Get-Date -UFormat "%m")

# debug lines 
Write-Host [$Manaden]

 

UFormat specifiers format the DateTime result of Get-Date in PowerShell. They return a formatted string, not a date object.
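
If you want the month as a number rather than a string, a small sketch of the alternatives:

# Get-Date returns a DateTime object, so the month is also available as a property
$month = (Get-Date).Month             # integer, e.g. 11
# -Format also returns a string, like -UFormat but with .NET format specifiers
$monthString = Get-Date -Format "MM"  # zero-padded string, e.g. "11"
Write-Host $month $monthString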

 

More information:

Get-Date – How to get and use Dates in PowerShell

https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/set-variable?view=powershell-7.3 

https://devblogs.microsoft.com/scripting/use-powershell-to-work-with-any-ini-file/

https://ss64.com/ps/get-date.html

Product:

Microsoft SQL Server 2016 database

Issue:

How do you let the user select which rows should be shown in a view?

Solution:

This is not possible directly in a SQL view.

https://www.mssqltips.com/sqlservertip/5147/limitations-when-working-with-sql-server-views/

A workaround is to have a table that the user updates with the value they want to select on, and then join on that table in the view and filter on that particular value.

If you want the ID column to match a PERIOD value, create a parameter table first:

CREATE TABLE [admin].[Parameter](
[key_Parameter] [int] IDENTITY(1,1) NOT NULL,
[Period] [nvarchar](200) NULL
) ON [PRIMARY]

Add the selection value into the table, as shown below.
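
A minimal sketch of seeding the table, assuming the period values are strings like '202401':

-- Seed the parameter table with the value the view should filter on
INSERT INTO [admin].[Parameter] ([Period]) VALUES ('202401');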

Then create a view; you have to update the view code below to reflect your table columns:

CREATE VIEW [DM].[Konto-Period]
AS
SELECT 
Kontonummer, 
ID
FROM DM.kontotabell a
JOIN [admin].[Parameter] b on a.[ID] = b.[Period]
-- WHERE a.[ID] = b.[Period]

When the view is run, it will filter against the value in the parameter table.

Then you have to figure out a way for the end user to update the parameter table; the simplest form is a plain UPDATE, as below.
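
A sketch of the update the end user (or a small app or stored procedure on their behalf) would run:

-- Change the period the view filters on
UPDATE [admin].[Parameter] SET [Period] = '202402' WHERE [key_Parameter] = 1;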

More Information:

https://www.sqltutorial.org/sql-date-functions/sql-convert-string-to-date-functions/ 

https://learn.microsoft.com/en-us/sql/relational-databases/stored-procedures/specify-parameters?view=sql-server-ver16 

https://learn.microsoft.com/en-us/sql/relational-databases/tables/use-table-valued-parameters-database-engine?view=sql-server-ver16

Product:

Azure Data Factory

Issue:

How do you connect to a SQL private endpoint with a managed identity?

Solution:

In your Azure subscription, ensure that both ADF and SQL are in the same subnet. Then there should not be any firewalls that need to be opened between them.

On your Azure SQL server, set Networking – public access to Disable. Ensure you have created private endpoints for your Azure SQL resource.

Set up a managed identity for your Azure Data Factory. This creates an identity that can be used to give the Data Factory access to other Azure resources.

    • In the Azure portal, go to your Azure Data Factory resource.
    • Under the “Settings” section, select “Managed identity”.
    • Enable the system-assigned managed identity for your Data Factory.

Grant the Managed Identity access to the SQL Azure Database:

    • Go to your SQL Azure Database resource.
    • Under the “Settings” section, select “Access control (IAM)”.
    • Click on “+ Add” and add a role assignment.
    • Select the appropriate role (e.g., “Contributor” or “SQL Server Contributor”) and search for the name of your Azure Data Factory.
    • Select the Data Factory name from the search results and click “Save”.

You can also give the ADF access inside the SQL server with these commands in SSMS:

-- run in the master database

CREATE LOGIN [adf-name] FROM EXTERNAL PROVIDER;

CREATE USER [adf-name] FROM LOGIN [adf-name] WITH DEFAULT_SCHEMA=[dbo];

-- run in the sql database

CREATE USER [adf-name] FROM LOGIN [adf-name];

ALTER ROLE [db_owner] ADD MEMBER [adf-name];

 

Configure the Linked Service in Azure Data Factory:

    • Open your Azure Data Factory resource in the Azure portal.
    • Click on launch Studio
    • Go to the “manage” section.
    • Click on the “Linked service” tab and select “New”.
    • Choose the appropriate SQL Server connector (e.g., “Azure SQL Database”).
    • Provide the required connection details such as server name, database name, authentication type like:
      integrationRuntime2 (Managed Virtual Network)
      connection string
      Account selection method – Enter manually.
      Enter SQL server name (Fully qualified domain name) like: sql-name.database.windows.net
      Enter database name
      For authentication type, under “Managed private endpoint”, select System Assigned Managed Identity – then all values should come up automatically.
    • Click on “Test Connection” to validate the connection.

 

Use the Linked Service in Azure Data Factory:

      • Now you can use the configured linked service to connect to the Azure SQL Database private endpoint in your datasets, which are used by the integration pipelines within Azure Data Factory.

By following these steps, you’ll be able to establish a connection to a SQL Azure Database private endpoint from Azure Data Factory using a managed identity.

More information:

https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql 

https://techcommunity.microsoft.com/t5/azure-sql-blog/private-endpoints-for-azure-sql-managed-instance/ba-p/3782015 

https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

Product:

Microsoft Azure Storage Account

Issue:

When doing the Microsoft learn section:

https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/9-initialize-the-storage-account-model?pivots=csharp

and try to list the content of a blob, you get this message:

There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.

You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.

In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.

 

Solution:

The blob container named photos has been created, which you can check directly in your subscription's storage account. But you cannot verify it with the command:

az storage container list \
--account-name <name>

The <name> should be replaced with your unique storage account name.

The result returned is cryptic, but if you find the lines below, the command succeeded. Keep in mind that you need to enter the CLI command on one line.

 },
"immutableStorageWithVersioningEnabled": false,
"metadata": null,
"name": "photos",
"properties": {

As this message is a warning and not an error, you can add --only-show-errors to suppress warnings, like this:

az storage container list --only-show-errors --account-key  <your key> --account-name <your account>

The --auth-mode key option will be deprecated in the future; try to use another method.
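
For example, Azure AD authentication can be used instead, assuming your login account has an appropriate data-plane RBAC role on the storage account:

az storage container list --auth-mode login --account-name <name>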

As this warning message is new, it may break your scripts if they do not expect it.

You can get more information by adding --debug to the command, like:

az storage container list --debug --account-name  <name>

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-data-operations-cli 

 

When you don't specify the authentication type, it will try to get the access key of the storage account. This requires the Microsoft.Storage/storageAccounts/listkeys/action permission. If you have the Contributor role on the storage account, you have the required permission.

--auth-mode login means it will use Azure AD auth to connect to the storage. You can use one of the built-in roles to access the storage (see documentation):

  • Storage Table Data Contributor
  • Storage Table Data Reader

When using AAD Auth, you could also disable access key authentication.

There is a good article related to RBAC management and the data plane model: Assign an Azure role for access to blob data.

Product:
Microsoft Azure File share

Issue:

How do you use Postman to upload a file to Azure file storage with the REST API?

Solution:

Download the Postman program from https://www.postman.com/downloads/

Go into your Azure subscription and to your storage account to get the shared access signature (SAS), which is a URI that grants restricted access rights to Azure Storage resources.

As this is a file share, you should set allowed services to File, and allowed resource types to Object.

Set the end date for expiry to a year from now.

Leave the Allowed IP addresses blank to allow any computer to access the account. (Keep the DNS and firewall setup so that only computers from your company can reach the Azure area.)

Allowed protocols should be HTTPS only.

Click on Generate SAS and connection string. You must copy these strings and save them in Notepad; you cannot view them again once you have left this Azure page.

The connection string contains all the info you need. In Notepad, split it up so you only have the information you need in one string for the file share: copy the text after FileEndpoint=.

You should get something like this:

https://xyz.file.core.windows.net/?sv=2022-11-02&ss=f&srt=o&sp=rwdlc&se=2025-07-31T22:45:45Z&st=2023-05-30T14:45:45Z&spr=https&sig=xyzsecretkeyxyz

sv= is the version of the REST API; you may need to send this value in a header as x-ms-version: 2022-11-02

se= is the end date for the connection key to work, like 2025-07-31T22:45:45Z

st= is the start date for the connection key to work, like 2023-05-30T14:45:45Z

sig= is the key value that gives you full access to the area. Do not share it with others.

sp= is what kind of rights you have granted, e.g. read, write, delete, list, create.

In your storage account file share, you may have created subfolders like testfiles. Click on File shares to find the share name and any subfolders underneath it. Click on the file share and then on Browse to find the folder where the authentication method is Access key; this is the folder you can access.

Update your URL to contain the path and the file name of the file you want to create, like https://xyz.file.core.windows.net/testfiles/test.txt?sv=2022-11-02……

Start Postman. Do not log in; skip that. Create a new HTTP request.

Select PUT and paste in your URL. Postman will then interpret your values and list them as parameters.

With a file share, you must use the REST API in two steps: first create the file with the correct size, and then make a second call to fill the file with data. This is different from blob storage, where you can do it in one REST API call.

In Postman, go to the Headers tab and add two keys:

x-ms-type = file

x-ms-content-length = 1

Here we set the length of the file to 1 character (1 byte). (This works as long as you only use a-z characters with UTF-8 encoding, where each character is one byte.)

Click on the SEND button, and if all is correct you should get: 201 Created.

Browse to your Azure file storage and check that the file was created, with a size of 1.

To write to the file, add these two keys in the Headers tab:

x-ms-write = update

x-ms-range = bytes=0-0

The x-ms-range should always start with 0 and end at one less than the total number of characters in your file. If the file is 42 characters, the value should be bytes=0-41.

Important: in the Params tab you must add the key below (this activates the range function; otherwise the x-ms-range header is not used):

comp = range

Then we need to add some data in Postman to write to the file: go to the Body tab, select raw – Text, and enter a letter.

The text should be the same size as the file you have created; the file size and the text you put into the file must match exactly to the byte.

Click on SEND, and you should get Status: 201 Created if all is fine. A scripted version of both steps is sketched below.
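
A minimal PowerShell sketch of the same two-step upload, assuming the hypothetical share, path, and SAS token from above (replace them with your own values):

# Step 0: common values (hypothetical URL and SAS token)
$base = "https://xyz.file.core.windows.net/testfiles/test.txt"
$sas  = "sv=2022-11-02&ss=f&srt=o&sp=rwdlc&se=2025-07-31T22:45:45Z&st=2023-05-30T14:45:45Z&spr=https&sig=xyzsecretkeyxyz"
$data = [System.Text.Encoding]::UTF8.GetBytes("A")   # 1 byte of content

# Step 1: create the file with the correct size
Invoke-RestMethod -Method Put -Uri "${base}?${sas}" -Headers @{
    "x-ms-type"           = "file"
    "x-ms-content-length" = "$($data.Length)"
    "x-ms-version"        = "2022-11-02"
}

# Step 2: write the bytes into the file, note the comp=range parameter
Invoke-RestMethod -Method Put -Uri "${base}?comp=range&${sas}" -Headers @{
    "x-ms-write"   = "update"
    "x-ms-range"   = "bytes=0-$($data.Length - 1)"
    "x-ms-version" = "2022-11-02"
} -Body $data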

Common errors you can see in POSTMAN are:

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Authentication scheme Bearer for files is not supported in this version.

This is solved by adding the correct version, like: x-ms-version: 2022-11-02

You should also try to have headers like:

Authorization: Bearer
x-ms-type:file
<Error><Code>UnsupportedHttpVerb</Code><Message>The resource doesn't support specified Http Verb.

This is solved by using PUT instead of POST.

<Error><Code>ShareNotFound</Code><Message>The specified share does not exist.

This is solved by entering the correct path and file name in the URL.

HTTP Error 400.  The request host name is invalid.

Solved by entering the correct host name.

HTTP Error 411. The request must be chunked or have a content length

Solved by entering the Content-Length header.

HTTP Error 404 The specified resource does not exist.

HTTP Error 400 Value for one of the query parameters specified in the request URI is invalid

HTTP Error 404 The specified parent path does not exist
Solved by entering the correct path to the files in Azure.
One of the HTTP headers specified in the request is not supported.
<HeaderName>x-ms-range</HeaderName>

Solved by adding the comp key in the Params tab.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-content-length</HeaderName>

Solved by adding the key x-ms-content-length.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-write</HeaderName>

Solved by adding the key x-ms-write.

The value for one of the HTTP headers is not in the correct format.
<HeaderName>Content-Length</HeaderName>
<HeaderValue>1</HeaderValue>
This is solved by entering the correct value in the x-ms-range key, or by adding comp = range in the Params tab.

 

More Information:

https://www.mikaelsand.se/2019/11/simple-how-to-upload-a-file-to-azure-storage-using-rest-api/
https://www.petecodes.co.uk/uploading-files-to-azure-blob-storage-using-the-rest-api-and-postman/
https://www.serverless360.com/blog/azure-blob-storage-vs-file-storage
https://raaviblog.com/how-to-use-azure-blob-storage-service-rest-api-operations-using-postman/
http://www.mikaelsand.se/2020/06/oauth-with-azure-the-just-make-it-work-edition/

https://azureops.org/articles/connect-azure-sql-from-data-factory-using-managed-identity/ 
https://www.datahai.co.uk/power-bi/connecting-power-bi-to-azure-sql-database-using-private-endpoints/ 

https://en.wikipedia.org/wiki/Spy_vs._Spy

 

Product:

Microsoft SQL Azure database

Issue:

A newly created SQL native user [Kalle] cannot see the tables in the database, but he can log in with SSMS.

Solution:

Please do not use the role db_datareader or db_datawriter or their deny equivalents. They are for backwards compatibility only.

Removing the user from the role with the command below did not help:

EXEC sp_droprolemember 'db_datareader', 'Kalle'

You have to drop the user and create him again:

DROP USER Kalle;

DROP LOGIN Kalle;

USE Master;

CREATE LOGIN Kalle WITH PASSWORD = 'advancedpasswordhere';

-- to be able to log in from SSMS you need to have the user in the master database --

CREATE USER Kalle FOR LOGIN Kalle;

-- then switch to your own database and create the user there --

CREATE USER Kalle FOR LOGIN Kalle;

-- gives the user Kalle access to see all tables in the DM schema --

GRANT SELECT ON SCHEMA::DM TO Kalle;

This should give the user read-only access to all tables and views that are part of the DM schema in the database. You can verify it as sketched below.
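
A small sketch to verify the result, using the built-in fn_my_permissions function while impersonating the user:

-- Check what Kalle can actually do on the DM schema
EXECUTE AS USER = 'Kalle';
SELECT * FROM fn_my_permissions('DM', 'SCHEMA');
REVERT;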

To list members of built-in roles, use:

 SELECT DP1.name AS DatabaseRoleName, 
isnull (DP2.name, 'No members') AS DatabaseUserName 
FROM sys.database_role_members AS DRM 
RIGHT OUTER JOIN sys.database_principals AS DP1 
ON DRM.role_principal_id = DP1.principal_id 
LEFT OUTER JOIN sys.database_principals AS DP2 
ON DRM.member_principal_id = DP2.principal_id 
WHERE DP1.type = 'R'
ORDER BY DP1.name;

To list whether any user has DENY rights, use:

SELECT l.name as grantee_name, p.state_desc, p.permission_name, o.name
FROM sys.database_permissions AS p JOIN sys.database_principals AS l 
ON p.grantee_principal_id = l.principal_id
JOIN sys.sysobjects O 
ON p.major_id = O.id 
WHERE p.state_desc ='DENY'

 

More information:

https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-droprolemember-transact-sql?view=sql-server-ver16 

https://blog.sqlauthority.com/2017/03/02/sql-server-unable-see-tables-objects-ssms/ 

https://www.mssqltips.com/sqlservertip/2385/unable-to-see-a-sql-server-table/