Product:

Azure Data Factory

Issue:

How do I connect to a SQL private endpoint with a managed identity?

Solution:

In your Azure subscription, ensure that both the ADF and SQL resources are in the same subnet; then no firewalls need to be opened between them.

On your Azure SQL server, set Networking – Public access to Disabled. Ensure you have created private endpoints for your Azure SQL resource.

Set up a managed identity for your Azure Data Factory. This creates an identity that other Azure resources can grant access to.

    • In the Azure portal, go to your Azure Data Factory resource.
    • Under the “Settings” section, select “Managed identity”.
    • Enable the system-assigned managed identity for your Data Factory.

Grant the Managed Identity access to the SQL Azure Database:

    • Go to your SQL Azure Database resource.
    • Under the “Settings” section, select “Access control (IAM)”.
    • Click on “+ Add” and add a role assignment.
    • Select the appropriate role (e.g., “Contributor” or “SQL Server Contributor”) and search for the name of your Azure Data Factory.
    • Select the Data Factory name from the search results and click “Save”.

You can also grant the ADF access inside the SQL server by running these commands in SSMS:

-- run in master database

CREATE LOGIN [adf-name] FROM EXTERNAL PROVIDER

CREATE USER [adf-name] FROM LOGIN [adf-name] WITH DEFAULT_SCHEMA=[dbo]

-- run in sql database

CREATE USER [adf-name] FROM LOGIN [adf-name]

ALTER ROLE [db_owner] ADD MEMBER [adf-name]
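For reference, other clients can authenticate to the same database with a managed identity through an ODBC connection string. This is a sketch only, not part of the ADF setup; the server and database names are placeholders, and ODBC Driver 17+ is assumed:

```python
# Sketch: build an ODBC connection string that uses the system-assigned
# managed identity (Authentication=ActiveDirectoryMsi).
# The server and database names are placeholders, not from this article.

def build_msi_connection_string(server: str, database: str) -> str:
    """Return an ODBC connection string for managed identity auth."""
    return (
        "Driver={ODBC Driver 17 for SQL Server};"
        f"Server=tcp:{server},1433;"
        f"Database={database};"
        "Authentication=ActiveDirectoryMsi;"
        "Encrypt=yes;"
    )

conn_str = build_msi_connection_string("sql-name.database.windows.net", "mydb")
print(conn_str)
```

No password or key appears in the string; the token is fetched from the resource's identity at connect time.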

 

Configure the Linked Service in Azure Data Factory:

    • Open your Azure Data Factory resource in the Azure portal.
    • Click on “Launch Studio”.
    • Go to the “Manage” section.
    • Click on the “Linked services” tab and select “New”.
    • Choose the appropriate SQL Server connector (e.g., “Azure SQL Database”).
    • Provide the required connection details, such as integration runtime, server name, database name, and authentication type. For example:
      Integration runtime: integrationRuntime2 (Managed Virtual Network)
      Connection: connection string
      Account selection method: Enter manually
      SQL server name (fully qualified domain name), like: sql-name.database.windows.net
      Database name
      For authentication type, under “Managed private endpoint”, select System Assigned Managed Identity – then the remaining values should be filled in automatically.
    • Click on “Test Connection” to validate the connection.
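The portal steps above roughly correspond to a linked service definition like the following sketch (the JSON is shown as a Python dict; the linked service name, integration runtime name, and connection string values are examples, not required values):

```python
# Sketch of an Azure SQL linked service that authenticates with the
# factory's system-assigned managed identity. All names and values here
# are examples; your factory generates its own definition.
import json

linked_service = {
    "name": "AzureSqlDatabaseLinkedService",
    "properties": {
        "type": "AzureSqlDatabase",
        "typeProperties": {
            # No credential entry: the system-assigned managed identity
            # is used for authentication.
            "connectionString": (
                "Data Source=sql-name.database.windows.net;"
                "Initial Catalog=mydb;"
            ),
        },
        "connectVia": {
            "referenceName": "integrationRuntime2",
            "type": "IntegrationRuntimeReference",
        },
    },
}

print(json.dumps(linked_service, indent=2))
```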

 

Use the Linked Service in Azure Data Factory:

      • Now you can use the configured linked service to connect to the Azure SQL Database private endpoint in your datasets, which are used by the integration pipelines within Azure Data Factory.

By following these steps, you will be able to connect to an Azure SQL Database private endpoint from Azure Data Factory using a managed identity.

More information:

https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-sql 

https://techcommunity.microsoft.com/t5/azure-sql-blog/private-endpoints-for-azure-sql-managed-instance/ba-p/3782015 

https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-portal-private

Product:

Microsoft Azure Storage Account

Issue:

When doing the Microsoft learn section:

https://learn.microsoft.com/en-us/training/modules/connect-an-app-to-azure-storage/9-initialize-the-storage-account-model?pivots=csharp

Try to list content in a blob, you get a message:

There are no credentials provided in your command and environment, we will query for account key for your storage account.
It is recommended to provide --connection-string, --account-key or --sas-token in your command as credentials.

You also can add `--auth-mode login` in your command to use Azure Active Directory (Azure AD) for authorization if your login account is assigned required RBAC roles.
For more information about RBAC roles in storage, visit https://docs.microsoft.com/azure/storage/common/storage-auth-aad-rbac-cli.

In addition, setting the corresponding environment variables can avoid inputting credentials in your command. Please use --help to get more information about environment variable usage.

 

Solution:

The photos blob container has been created, as you can see if you check directly in the storage account in your subscription. Verifying it with the command below, however, gives a confusing result:

az storage container list \
--account-name <name>

The <name> should be replaced with your unique storage account name.

The result returned is cryptic, but if you find the lines below in it, the command succeeded. Keep in mind that you need to enter the CLI command on one line.

 },
"immutableStorageWithVersioningEnabled": false,
"metadata": null,
"name": "photos",
"properties": {

As this message is a warning and not an error, you can add --only-show-errors to suppress warnings, like this:

az storage container list --only-show-errors --account-key  <your key> --account-name <your account>

The --auth-mode key option will be deprecated in the future, so try to use another method.

As this warning message is new, it may break your scripts if they do not expect it.

You can get more information by adding --debug to the command, like:

az storage container list --debug --account-name  <name>
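Rather than scanning the raw text for the lines quoted above, you can parse the JSON that the command returns. A small sketch, using a sample output abbreviated to the fields shown earlier:

```python
# Sketch: parse the JSON output of `az storage container list` to pick
# out container names. The sample below is abbreviated from the output
# quoted in the article.
import json

cli_output = """
[
  {
    "immutableStorageWithVersioningEnabled": false,
    "metadata": null,
    "name": "photos",
    "properties": {}
  }
]
"""

containers = [c["name"] for c in json.loads(cli_output)]
print(containers)  # ['photos']
```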

 

More Information:

https://learn.microsoft.com/en-us/azure/storage/blobs/authorize-data-operations-cli 

 

When you don’t specify the authentication type, it will try to get the access key of the storage account. This requires the Microsoft.Storage/storageAccounts/listkeys/action permission. If you have the Contributor role on the storage account, you have the required permission.

--auth-mode login means it will use AAD auth to connect to the storage. You can use one of the built-in roles to access the storage (see documentation):

  • Storage Blob Data Contributor
  • Storage Blob Data Reader

When using AAD Auth, you could also disable access key authentication.

There is a good article on RBAC management and the data plane model: Assign an Azure role for access to blob data.

Product:
Microsoft Azure File share

Issue:

How do you use Postman to upload a file to an Azure file share with the REST API?

Solution:

Download the Postman program from https://www.postman.com/downloads/

Go into your Azure subscription and to your storage account to get a shared access signature (SAS): a URI that grants restricted access rights to Azure Storage resources.

As this is a file share, you should set the allowed service to File and the allowed resource types to Object.

Set the end date for expiry to a year from now.

Leave the Allowed IP addresses field blank to allow any computer to access the account (keep the DNS and firewall setup so that only computers from your company can reach the Azure area).

Allowed protocols should be HTTPS only.

Click on Generate SAS and connection string. You must copy these strings and save them in Notepad; you cannot view them again once you have left this Azure page.

The connection string contains all the info you need. Inside Notepad, split it up so you have only the information you need for the file share in one string. Copy the text after FileEndpoint=.

You should get something like this:

https://xyz.file.core.windows.net/?sv=2022-11-02&ss=f&srt=o&sp=rwdlc&se=2025-07-31T22:45:45Z&st=2023-05-30T14:45:45Z&spr=https&sig=xyzsecretkeyxyz

sv= is the version of the REST API; you may need to add this value as a header: x-ms-version: 2022-11-02

se= is the end date for the connection key to work, like 2025-07-31T22:45:45Z

st= is the start date for the connection key to work, like 2023-05-30T14:45:45Z

sig= is the key value that gives you full access to the area. Do not share it with others.

sp= is the kind of rights you have granted, e.g. read, write, delete, list, create.
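The parameters above can also be picked out of the SAS URL with a short script. A sketch using the example URL from the text (the sig value is a dummy):

```python
# Sketch: split a SAS URL into its query parameters with the standard
# library, so each value (sv, se, st, sp, sig, ...) can be inspected.
from urllib.parse import urlsplit, parse_qs

sas_url = (
    "https://xyz.file.core.windows.net/?sv=2022-11-02&ss=f&srt=o"
    "&sp=rwdlc&se=2025-07-31T22:45:45Z&st=2023-05-30T14:45:45Z"
    "&spr=https&sig=xyzsecretkeyxyz"
)

params = {k: v[0] for k, v in parse_qs(urlsplit(sas_url).query).items()}
print(params["sv"])  # REST API version -> goes in the x-ms-version header
print(params["sp"])  # granted rights: read write delete list create
```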

In your storage account file share, you may have created subfolders like testfiles. Click on File shares to find the share name and any subfolders underneath it. Click on the file share, then click Browse to find the folder that shows Authentication method: Access key. This is the folder you can access.

Update your URL to contain the path and filename of the file you want to create, like https://xyz.file.core.windows.net/testfiles/test.txt?sv=2022-11-02……

Start Postman. Do not log in – skip that. Create a new HTTP request.

Select PUT and paste in your URL. Postman will then interpret your values and list them as parameters.

With a file share, you must make the REST API call in two steps: first create the file with the correct size, then make a second call to fill the file with data. This is different from blob storage, where you can do it in one REST API call.

In Postman, go to the Headers tab and add two keys:

x-ms-type = file

x-ms-content-length = 1

Here we set the length of the file to 1 character (1 byte). (This works as long as you only use a–z characters and UTF-8 encoding.)

Click the SEND button, and if all is correct you should get: 201 Created.

Browse to your Azure file storage and check that the file was created, with a size of 1.

To write to the file, add these two keys in Headers:

x-ms-write = update

x-ms-range = bytes=0-0

The x-ms-range value should always start at 0 and end at one less than the total number of characters in your file. If the file is 42 characters, the value should be bytes=0-41.

Important: in the Params tab you must add a key as below (this activates the range function; otherwise the x-ms-range header is not used):

comp = range

Then we need to add some data in Postman to write to the file. Go to the Body tab, select raw – Text, and enter a letter.

The text should be the same size as the file you created; the file size and the text you put into the file must match exactly to the byte.
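The header rules above can be sketched as a small helper that derives both sets of headers from the file content, so the sizes always match (this only builds the headers; it does not send the requests):

```python
# Sketch: compute the headers for the two-step Azure Files upload.
# Step 1 creates the file with the right size; step 2 writes the byte
# range. Pure header math here, no HTTP.

def create_file_headers(content: bytes) -> dict:
    # Step 1: create an empty file of the final size.
    return {"x-ms-type": "file", "x-ms-content-length": str(len(content))}

def write_range_headers(content: bytes) -> dict:
    # Step 2: write the whole range. Remember to also add comp = range
    # as a query parameter in the Params tab.
    return {
        "x-ms-write": "update",
        "x-ms-range": f"bytes=0-{len(content) - 1}",
        "Content-Length": str(len(content)),
    }

body = b"x" * 42
print(create_file_headers(body)["x-ms-content-length"])  # 42
print(write_range_headers(body)["x-ms-range"])           # bytes=0-41
```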

Click on SEND, and you should get Status: 201 Created if all is fine.

Common errors you can see in Postman are:

Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Authentication scheme Bearer for files is not supported in this version.

This is solved by adding the correct version, like: x-ms-version: 2022-11-02

You should also try to have headers like:

Authorization: Bearer
x-ms-type: file

<Error><Code>UnsupportedHttpVerb</Code><Message>The resource doesn't support specified Http Verb.

This is solved by using PUT instead of POST.

<Error><Code>ShareNotFound</Code><Message>The specified share does not exist.

This is solved by entering the correct path and filename in the URL.

HTTP Error 400.  The request host name is invalid.

Solved by entering the correct host name.

HTTP Error 411. The request must be chunked or have a content length

Solved by entering the Content-Length header.

HTTP Error 404 The specified resource does not exist.

HTTP Error 400 Value for one of the query parameters specified in the request URI is invalid

HTTP Error 404 The specified parent path does not exist

Solved by entering the correct path to the files in Azure.

One of the HTTP headers specified in the request is not supported.
<HeaderName>x-ms-range</HeaderName>

Solved by adding the key comp in the Params tab.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-content-length</HeaderName>

Solved by adding the key x-ms-content-length.

An HTTP header that's mandatory for this request is not specified.
<HeaderName>x-ms-write</HeaderName>

Solved by adding the key x-ms-write.

The value for one of the HTTP headers is not in the correct format.
<HeaderName>Content-Length</HeaderName>
<HeaderValue>1</HeaderValue>
This is solved by entering the correct value in the x-ms-range key, or by adding comp = range in the Params tab.

 

More Information:

https://www.mikaelsand.se/2019/11/simple-how-to-upload-a-file-to-azure-storage-using-rest-api/
https://www.petecodes.co.uk/uploading-files-to-azure-blob-storage-using-the-rest-api-and-postman/
https://www.serverless360.com/blog/azure-blob-storage-vs-file-storage
https://raaviblog.com/how-to-use-azure-blob-storage-service-rest-api-operations-using-postman/
http://www.mikaelsand.se/2020/06/oauth-with-azure-the-just-make-it-work-edition/

https://azureops.org/articles/connect-azure-sql-from-data-factory-using-managed-identity/ 
https://www.datahai.co.uk/power-bi/connecting-power-bi-to-azure-sql-database-using-private-endpoints/ 


 

Product:

Microsoft SQL Azure database

Issue:

A newly created SQL native user [Kalle] cannot see the tables in the database, but he can log in from SSMS.

Solution:

Please do not use the roles db_datareader or db_datawriter or their deny equivalents. They are for backwards compatibility only.

Removing the user from the role with the command below did not help:

EXEC sp_droprolemember 'db_datareader', 'Kalle'

You have to drop the user and create him again:

DROP USER Kalle

DROP LOGIN Kalle

USE master

CREATE LOGIN Kalle WITH PASSWORD = 'advancedpasswordhere'

-- to be able to log in from SSMS you need the user in the master database --

CREATE USER Kalle FOR LOGIN Kalle

-- run in your SQL database --

CREATE USER Kalle FOR LOGIN Kalle

-- gives the user Kalle access to see all tables in the DM schema --

GRANT SELECT ON SCHEMA::DM TO Kalle

This gives the user read-only access to all tables and views that are part of the DM schema in the database.

To list members of built-in roles, use:

 SELECT DP1.name AS DatabaseRoleName, 
isnull (DP2.name, 'No members') AS DatabaseUserName 
FROM sys.database_role_members AS DRM 
RIGHT OUTER JOIN sys.database_principals AS DP1 
ON DRM.role_principal_id = DP1.principal_id 
LEFT OUTER JOIN sys.database_principals AS DP2 
ON DRM.member_principal_id = DP2.principal_id 
WHERE DP1.type = 'R'
ORDER BY DP1.name;

To list whether any user has DENY rights, use:

SELECT l.name as grantee_name, p.state_desc, p.permission_name, o.name
FROM sys.database_permissions AS p JOIN sys.database_principals AS l 
ON p.grantee_principal_id = l.principal_id
JOIN sys.sysobjects O 
ON p.major_id = O.id 
WHERE p.state_desc ='DENY'

 

More information:

https://learn.microsoft.com/en-us/sql/relational-databases/system-stored-procedures/sp-droprolemember-transact-sql?view=sql-server-ver16 

https://blog.sqlauthority.com/2017/03/02/sql-server-unable-see-tables-objects-ssms/ 

https://www.mssqltips.com/sqlservertip/2385/unable-to-see-a-sql-server-table/

Product:

Microsoft AZURE file storage

Issue:

How do you upload a file to Azure file storage?

Suggestion:

Download Azure Storage Explorer and install it: https://learn.microsoft.com/en-us/azure/vs-azure-tools-storage-manage-with-storage-explorer?tabs=windows 

Connect to Azure with your account from inside Azure Storage Explorer.

Expand the tree on the left to your file share.

Click on the Upload icon on the right.

Find an example file and upload it from the folder you want to upload files from.

Click upload and watch the program work.

When it finishes, click the link ‘Copy AzCopy Command to Clipboard’ in the lower right corner, next to the log message.

Paste this into Notepad.

Edit the string to be as you want it.

Download azcopy.exe to a folder like d:\script from https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10 

Open a PowerShell command window.

Go to the d:\script folder.

Paste your AzCopy command from Notepad into the PowerShell session, and it will copy the files you defined.

 

Maybe you can programmatically change the PowerShell script for AzCopy so you can run it from a scheduling program.
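As a sketch of that idea, a script can assemble the AzCopy command before handing it to a scheduler; the local path, share URL, and SAS token below are placeholders:

```python
# Sketch: assemble an AzCopy command for a scheduled script. The local
# path, share URL and SAS token are placeholders; in a real script you
# would pass the list to subprocess.run(cmd).

def build_azcopy_command(local_dir: str, share_url: str, sas: str) -> list:
    return [
        "azcopy", "copy",
        local_dir,
        f"{share_url}?{sas}",
        "--recursive",
    ]

cmd = build_azcopy_command(
    r"d:\script\upload",
    "https://xyz.file.core.windows.net/testfiles",
    "sv=2022-11-02&sig=...",
)
print(" ".join(cmd))
```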

 

More Information:

https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal 

https://www.51sec.org/2022/08/12/using-azcopy-to-migrate-local-files-to-azure-blob-storage/

https://learn.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string 

https://www.sqlshack.com/getting-started-with-azure-storage-explorer/ 

https://learn.microsoft.com/en-us/azure/storage/blobs/quickstart-storage-explorer

https://youtu.be/owXHtmQLQNY

Product:

SQL AZURE server

Issue:

Can not connect to AZURE SQL server in SSMS.

Error like:

The server was not found – msg 11001

Solution:

You need a DNS record that points to the server name in Azure (db.database.windows.net).

There is more than one DNS zone where you need to update DNS records for your resource (privatelink.database.windows.net).

 

If you get a different error, where the server is reached but your user is refused, then try another login method such as:

Azure Active Directory – integrated

Azure Active Directory with MFA

SQL Server authentication (native login)

The allowed login methods change depending on where you connect from and how the firewall/DNS is set up.

 

More information:

https://learn.microsoft.com/en-us/azure/private-link/private-endpoint-dns 

https://learn.microsoft.com/en-us/sql/relational-databases/errors-events/mssqlserver-11001-database-engine-error?view=sql-server-ver16

https://learn.microsoft.com/en-us/azure/azure-sql/database/troubleshoot-common-errors-issues?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-overview?view=azuresql

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-configure?view=azuresql&tabs=azure-powershell 

Product:

Microsoft SQL server

Issue:

How do you import a text file to a table inside SQL Server?

Solution:

Download the file you want to import to your laptop or computer where SSMS is installed.

Right-click on your database and select Tasks – Import Flat File.

Select the csv file and enter the name of the new table.

 

 

Set a primary key, and change any date formatted columns to string or correct date format.

 

Then you can check the new table inside SSMS.

To work with a small subset of the data in a new table, first create the new table:

CREATE TABLE [dbo].[Company](
[index] [int] NOT NULL,
[Name] [nvarchar](50) NULL,
[Country] [nvarchar](50) NULL,
[Employees] [int] NULL
) ON [PRIMARY]
GO

Then to copy the data over to that new table use:

INSERT INTO [AdventureWorksDW2019].[dbo].[Company] ([index], [name], [country], [employees] )
SELECT [index], [name] , [country] , [Number_of_employees]
FROM [AdventureWorksDW2019].[dbo].[organizations] 
-- [WHERE condition];

 

Use [ ] around column names to ensure they are not misinterpreted as reserved words.

https://www.geeksforgeeks.org/how-to-use-reserved-words-as-column-names-in-sql/

 

More Information:

https://www.sqlshack.com/import-flat-file-sql-server-database-using-import-flat-file-wizard/

https://www.sqlservergeeks.com/sql-server-import-flat-file-using-ssms/

You can get sample data from this sites:

https://www.stats.govt.nz/large-datasets/csv-files-for-download/

https://people.sc.fsu.edu/~jburkardt/data/csv/csv.html

https://www.datablist.com/learn/csv/download-sample-csv-files

https://www.tutorialspoint.com/sql/sql-insert-query.htm

Product:

Microsoft SQL server

Issue:

Which user is connected with which SQL login?

Solution:

Run these queries:

SELECT A.name as userName, B.name as login 
FROM sys.sysusers A 
FULL OUTER JOIN sys.sql_logins B 
ON A.sid = B.sid

 

https://learn.microsoft.com/en-us/sql/relational-databases/system-compatibility-views/sys-sysusers-transact-sql?view=sql-server-ver16

select sp.name as login,
sp.type_desc as login_type,
sl.password_hash,
sp.create_date,
sp.modify_date,
case when sp.is_disabled = 1 then 'Disabled'
else 'Enabled' end as status
from sys.server_principals sp
left join sys.sql_logins sl
on sp.principal_id = sl.principal_id
where sp.type not in ('G', 'R')
order by sp.name;

https://dataedo.com/kb/query/sql-server/list-logins-on-server

SELECT *
FROM master.sys.sql_logins;

More SQL information:

https://www.w3schools.com/sql/sql_quickref.asp 

Product:

Microsoft Azure Data Factory

Issue:

How do I start a pipeline run now, instead of waiting for the scheduled run?

Solution:

Go to https://portal.azure.com/ and select your Azure Data Factory resource.

Click on the “Launch Studio” button.

Click on the pencil icon (Author) on the left.

Expand Pipeline, so you see a list of your pipelines.

Double click on the one you want to start.

Then, at the top, click on Trigger to run that job.

Select “Trigger Now”

Click OK on Pipeline Run dialog.

Click on the monitor icon and check whether it has started under Pipeline runs.

 

 

More Information:

https://learn.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers 

https://www.mssqltips.com/sqlservertutorial/9398/building-an-azure-data-factory-pipeline-manually/

Product:

Microsoft Azure SQL

Issue:

Error when trying to create a user in Azure SQL from a Managed Identity object ID:

Principal ‘xyz’ could not be found or this principal type is not supported.

Cannot add the principal ‘xyz’, because it does not exist or you do not have permission.

Solution:

Use the managed identity's resource name instead of the object ID. When referring to the object, the managed identity can be identified by the resource name.

Background:

To make Azure Data Factory connect to an Azure SQL resource with a managed identity, you need to create a managed identity for the ADF resource.

Then you can add the resource to Azure SQL with the commands below to give it access to the database.

Log in to the Azure SQL server with SSMS, using the Azure SQL server name to connect.
Select the database and click New Query:

CREATE USER [adf_name] FROM EXTERNAL PROVIDER

ALTER ROLE [db_owner] ADD MEMBER [adf_name]

By adding a USER directly to the database, without creating a login on the server, the user must provide the database name when it connects to Azure SQL.

 

If you are the Owner of the Azure database, you do not need to be AAD Admin to perform the change in SSMS.

“Azure Active Directory authentication allows you to centrally manage identity and access to your Azure SQL Database.”

 

More Information:

Connect Azure SQL from Data Factory using Managed Identity

https://learn.microsoft.com/en-us/azure/private-link/tutorial-private-endpoint-sql-portal

https://learn.microsoft.com/en-us/azure/azure-sql/database/authentication-aad-service-principal-tutorial?view=azuresql

https://crmchap.co.uk/principal-could-not-be-found-or-this-principal-type-is-not-supported-error-azure-sql-server/

https://www.data4v.com/managed-identity-between-azure-data-factory-and-azure-storage/