Product:
Microsoft Azure

Issue:
You receive an email saying that some of your subscription role assignments will expire in a few days, for example: “Your Storage File Data Privileged Contributor role in the prod subscription will expire in 1 day(s)”

Solution:

The exact steps depend on how your Azure accounts are set up and handled by your organisation, but it should be similar to this:

(You need to be an owner of the subscription to be able to extend the role for yourself.)

Go to the Azure portal (Home – Microsoft Azure).

Search for PIM and click on Privileged Identity Management.

Expand Tasks and click on My roles.

Go to Azure resources, and for each subscription you need to extend, go to the Owner line and activate yourself as owner.

Click on Activate. After someone has approved the request, go to the next step.

Click on Privileged Identity Management to get back to the start page of PIM.

Now click on Azure resources under Manage, and from the drop-down on the new page select the subscription you want to handle.

Click on the Manage button after you have selected a subscription.

Click on Assignments under Manage. Then browse the lines to the right to find which roles you can extend, and click on the Extend link.

Set a date at least a year in the future and press Save. Repeat for all roles you need to extend.

If you do not find your role, check under the Expired assignments tab to see if it is there; you can extend it from there as well.

Repeat for all subscriptions you have in Azure.
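
If you prefer to check your current role assignments from the command line before or after extending them, below is a minimal sketch using the Az PowerShell module (the subscription ID is a placeholder; extending the PIM assignments themselves is done in the portal as described above):

# List your active role assignments on a subscription with Az PowerShell
# The subscription ID below is a placeholder - replace it with your own
Connect-AzAccount
Set-AzContext -Subscription "00000000-0000-0000-0000-000000000000"
Get-AzRoleAssignment -Scope "/subscriptions/00000000-0000-0000-0000-000000000000" |
    Select-Object DisplayName, RoleDefinitionName, Scope |
    Format-Table -AutoSize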

 

More Information:

Renew Azure resource role assignments in PIM – Microsoft Entra ID Governance | Microsoft Learn

Extend or renew PIM for groups assignments – Microsoft Entra ID Governance | Microsoft Learn

What is Privileged Identity Management? – Microsoft Entra ID Governance | Microsoft Learn

 

Product:
Microsoft Windows Server 2019

Issue:

Your security scanning software reports that you have a cipher issue on the Windows server:

TLS/SSL Birthday attacks on 64-bit block ciphers (SWEET32)
Negotiated with the following insecure cipher suites:
* TLS 1.2 ciphers:
* TLS_RSA_WITH_3DES_EDE_CBC_SHA

Configure the server to disable support for 3DES suite.

Solution:

First check that the application software you use on the server does not need this cipher.

Log in to the Windows server as a local administrator.

Then run this PowerShell command to remove support for the cipher:

Disable-TlsCipherSuite -Name 'TLS_RSA_WITH_3DES_EDE_CBC_SHA'

 

Reboot the Windows server for the change to take effect.

Get a list of the cipher suites that use 3DES:

Get-TlsCipherSuite -Name "3DES"

 

Get a list of all cipher suites on the server:

Get-TlsCipherSuite | Format-Table Name

Get a list of the ECC curves:

certutil.exe -DisplayEccCurve
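
If you want to handle every 3DES suite in one go, below is a minimal sketch (run in an elevated PowerShell session) that disables each enabled cipher suite whose name contains 3DES and then checks that nothing is left:

# Disable every enabled cipher suite whose name contains 3DES
Get-TlsCipherSuite |
    Where-Object { $_.Name -like '*3DES*' } |
    ForEach-Object {
        Write-Host "Disabling $($_.Name)"
        Disable-TlsCipherSuite -Name $_.Name
    }

# Verify: this should return no rows after the change (and a reboot)
Get-TlsCipherSuite -Name "3DES" | Format-Table Name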

 

More Information:

A cipher suite is a set of cryptographic algorithms. The Schannel SSP implementation of the TLS/SSL protocols uses algorithms from a cipher suite to create keys and encrypt information. A cipher suite specifies one algorithm for each of the following tasks:

  • Key exchange
  • Bulk encryption
  • Message authentication

Key exchange algorithms protect information required to create shared keys. These algorithms are asymmetric (public key algorithms) and perform well for relatively small amounts of data.

Bulk encryption algorithms encrypt messages exchanged between clients and servers. These algorithms are symmetric and perform well for large amounts of data.

Message authentication algorithms generate message hashes and signatures that ensure the integrity of a message.

Developers specify these elements by using ALG_ID data types. For more information, see Specifying Schannel Ciphers and Cipher Strengths.

In earlier versions of Windows, TLS cipher suites and elliptic curves were configured by using a single string.


 

https://learn.microsoft.com/en-us/windows/win32/secauthn/cipher-suites-in-schannel 

https://learn.microsoft.com/en-us/windows-server/security/tls/manage-tls#configuring-tls-cipher-suite-order 

https://learn.microsoft.com/en-us/powershell/module/tls/?view=windowsserver2022-ps

https://learn.microsoft.com/en-us/powershell/module/tls/disable-tlsciphersuite?view=windowsserver2022-ps 

https://rdr-it.io/en/windows-server-disable-a-cipher-suite/ 

https://learn.microsoft.com/en-us/windows-server/security/tls/tls-registry-settings?tabs=diffie-hellman

Product:

Microsoft Power BI Desktop

Issue:

How do you add a step inside an existing list of applied steps in Power BI Desktop?

Solution:

Go to Transform data and select the query whose steps you want to change.

In the Applied steps pane on the right, select the step below which you want to insert a blank step.

Click the fx icon to add a step.

Now you can paste code from another step into the formula bar (which initially shows something like = #"Replaced Value"), or perform the transforms you need.
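
For example, the blank step that the fx icon inserts simply refers to the previous step, like this:

= #"Replaced Value"

You can then replace it with your own transformation, for instance a filter. A small sketch in Power Query M, where the step name #"Replaced Value" and the column name [Amount] are only placeholders for your own names:

= Table.SelectRows(#"Replaced Value", each [Amount] > 0)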

Save the report.

More Information:

https://learn.microsoft.com/en-us/power-query/applied-steps

https://learn.microsoft.com/en-us/power-bi/

https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-configure-consume 

https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-buttons?tabs=powerbi-desktop

https://learn.microsoft.com/en-us/power-bi/transform-model/desktop-common-query-tasks

Product:
Microsoft Power BI

Issue:

What do Enable load and Include in report refresh mean in the Power BI Transform data window for each query?

 

Solution:

“Include in report refresh” means the query is automatically refreshed when you press the Refresh button on the ribbon. “Enable load” means the query results are loaded into the model and available to the report builder. If it is unchecked you can still use the query in other queries (for example to merge data), but it is not shown in the report builder.

If you have a query that references another query where “Include in report refresh” is unchecked, both queries are still refreshed when you refresh the first query.

As of 2022 this setting only works in Power BI Desktop, not in the Power BI service, though that may have changed since. If you uncheck Include in report refresh, it has no effect in the Power BI service: refreshing the report there will still refresh all the queries, even though Include in report refresh is unchecked.

 

More Information:

https://www.purplefrogsystems.com/2021/04/power-bi-enable-load/ 

Keep The Existing Data In Your Power BI Dataset And Add New Data To It Using Incremental Refresh

 

Product:
Microsoft Power BI Desktop

Issue:

How do you select all data from a given year, when you have a date column in your table?

Solution:

Inside Power BI you can enter Power Query formulas like these:

To select last year:

= Table.SelectRows(#"Changed Type1", each Date.IsInPreviousYear([Datum]))

 

To select a specific year:

= Table.SelectRows(#"Changed Type1", each Date.Year([Datum])=2022)

 

To select all dates before 2022:


= Table.SelectRows(#"Changed Type1", each [Datum] < #date(2022, 1, 1))
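
Similarly, to select only dates in the current year (a sketch assuming the same step name #"Changed Type1" and date column [Datum] as above):

= Table.SelectRows(#"Changed Type1", each Date.IsInCurrentYear([Datum]))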

 

 

More Information:

https://learn.microsoft.com/en-us/powerquery-m/date-isinpreviousyear

How to Calculate Year to Date and Prior YTD in Power BI

https://thedatalabs.org/how-to-get-the-same-period-last-year-to-date-in-power-bi/

Product:

Microsoft Power BI service (in the cloud)

https://learn.microsoft.com/en-us/power-bi/connect-data/service-azure-and-power-bi 

Issue:

When I check my report, it shows old numbers. Why?

Solution:

It can be that you need to refresh your web browser. Press F5 to see if that helps.

Data is first updated at the source, which can be your SQL database server.

Then (if you use one) the Azure dataflow needs to be refreshed.

When that is finished, you can refresh your semantic model. That will give you an updated Power BI report.

But the web-based report can be cached, so also reload the page in your web browser to ensure you see the latest data.

(If you use DirectQuery in your Power BI reports, there are other implications that may cause issues.)
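
If you want to trigger the semantic model refresh yourself instead of waiting for a scheduled refresh, here is a minimal sketch using the MicrosoftPowerBIMgmt PowerShell module and the Power BI REST API (the workspace and dataset IDs below are placeholders):

# Requires: Install-Module MicrosoftPowerBIMgmt
Connect-PowerBIServiceAccount   # interactive sign-in

# Placeholder IDs - replace with your own workspace (group) and semantic model (dataset) IDs
$workspaceId = "00000000-0000-0000-0000-000000000000"
$datasetId   = "11111111-1111-1111-1111-111111111111"

# Start a refresh of the semantic model
Invoke-PowerBIRestMethod -Method Post -Url "groups/$workspaceId/datasets/$datasetId/refreshes"

# Check the status of the most recent refreshes
Invoke-PowerBIRestMethod -Method Get -Url "groups/$workspaceId/datasets/$datasetId/refreshes" |
    ConvertFrom-Json | Select-Object -ExpandProperty value |
    Select-Object -First 5 status, startTime, endTime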

 

More Information:

https://fabricdigital.co.nz/blog/how-to-hard-refresh-your-browser-and-clear-cache 

If the dataflow is standard, then the data is stored in Dataverse. Dataverse is like a database system; it has the concept of tables, views, and so on. Dataverse is a structured data storage option used by standard dataflows.

However, when the dataflow is analytical, the data is stored in Azure Data Lake Storage. A dataflow’s data and metadata is stored in a Common Data Model folder. Since a storage account might have multiple dataflows stored in it, a hierarchy of folders and subfolders has been introduced to help organize the data. Depending on the product the dataflow was created in, the folders and subfolders may represent workspaces (or environments), and then the dataflow’s Common Data Model folder. Inside the Common Data Model folder, both schema and data of the dataflow tables are stored. This structure follows the standards defined for Common Data Model.

https://learn.microsoft.com/en-us/power-query/dataflows/what-is-the-cdm-storage-structure-for-analytical-dataflows 

A dataflow stores the data for each table in a subfolder with the table’s name. Data for a table might be split into multiple data partitions, stored in CSV format.

https://learn.microsoft.com/en-us/power-query/dataflows/configuring-storage-and-compute-options-for-analytical-dataflows 

In Power BI, in addition to the standard dataflow engine, an enhanced compute engine is available for the dataflows created in Power BI Premium workspaces. You can configure this setting in the Power BI admin portal, under the Premium capacity settings. The enhanced compute engine is available in Premium P1 or A3 capacities and above. The enhanced compute engine reduces the refresh time required for long-running extract, transform, load (ETL) steps over computed tables, such as joins, distinct, filters, and group by. It also provides the ability to perform DirectQuery over tables from the Power BI semantic model. More information: The enhanced compute engine

https://10senses.com/blog/azure-synapse-vs-azure-data-factory-vs-power-bi-dataflows-what-are-the-similarities-and-differences/

Power Platform dataflows are data transformation services empowered by the Power Query engine and hosted in the cloud. These dataflows get data from different data sources and, after applying transformations, store it either in Dataverse or in Azure Data Lake Storage.

Dataflows are created using Power Query Online. Once you create them, the “M” scripts are available for review or for changes, but you do not need to write any line of code by yourself. It makes creating dataflows in Power BI a code-free solution, just like Azure Synapse and ADF.

With Power BI dataflows, you can develop ETL processes which can be used to connect with business data from multiple data sources. Data imported by Power BI dataflows is stored in Azure Data Lake (Gen2), which is known for having massive scalability.

https://debbiesmspowerbiazureblog.home.blog/2019/12/04/use-data-lake-storage-v2-as-data-flow-storage/ 

Power BI semantic models can store data in a highly compressed in-memory cache for optimized query performance, enabling fast user interactivity. With Premium capacities, large semantic models beyond the default limit can be enabled with the Large semantic model storage format setting. When enabled, semantic model size is limited by the Premium capacity size or the maximum size set by the administrator.

Large semantic models in the service don’t affect the Power BI Desktop model upload size, which is still limited to 10 GB.

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models 

While a semantic model can be built using Power BI Desktop (in a .pbix file), it does not need to contain any visuals. Think of a semantic model as the last stop in the data pipeline before reports and dashboards are built. Thereafter, once you share a semantic model with other members of the organization, they can build any number of reports and dashboards from just that one semantic model.

Semantic models hide the complex technical details behind reports so that both technical and non-technical users can concentrate on analyzing the data and answering business questions. Sharing and reusability are two stand-out features of semantic models.

 A Power BI Desktop model is effectively an Analysis Services tabular model.

https://www.datacamp.com/blog/what-are-power-bi-semantic-models 

https://learn.microsoft.com/en-us/power-bi/connect-data/service-datasets-understand 

https://kyligence.io/plp/what-is-a-semantic-layer-in-power-bi/ 

https://www.analyticscreator.com/blog/power-bi-dataset-content-type-renamed-to-semantic-model 

  • Larger model sizes may not be supported by your capacity. Shared capacity can host models up to 1 GB in size, while Premium capacities can host larger models depending on the SKU. For further information, read the Power BI Premium support for large semantic models article. (Semantic models were previously known as datasets.)
  • Smaller model sizes reduce contention for capacity resources, in particular memory. It allows more models to be concurrently loaded for longer periods of time, resulting in lower eviction rates.
  • Smaller models achieve faster data refresh, resulting in lower latency reporting, higher semantic model refresh throughput, and less pressure on source system and capacity resources.
  • Smaller table row counts can result in faster calculation evaluations, which can deliver better overall query performance.

https://azure.microsoft.com/en-us/blog/data-models-within-azure-analysis-services-and-power-bi/ 

https://learn.microsoft.com/en-us/power-bi/guidance/import-modeling-data-reduction 

Power BI’s Data Compression: Large Data Imports in Power BI

https://community.fabric.microsoft.com/t5/Service/Maximum-Data-that-be-Consumed-by-Power-BI-from-Azure-Data-Lake/m-p/2031983 

https://learn.microsoft.com/en-us/power-bi/connect-data/service-live-connect-dq-datasets

Product:

Microsoft SQL Server 2019

Issue:

How do I keep track of which SQL query I am working on, when I have many query tabs open?

Solution:

When you have a long list of query tabs that you work with in SQL Server Management Studio, click the little pin icon on each one and they will stay at the top, which makes it easier to find your way back to the one you want to work with.

 

Another tip is to use WHERE 1=1 in your statements, so you can easily comment lines in or out when you adjust your selection:

SELECT TOP (1000) [GeographyKey]
,[City]
,[StateProvinceCode]
,[StateProvinceName]
,[CountryRegionCode]
,[EnglishCountryRegionName]
,[SpanishCountryRegionName]
,[FrenchCountryRegionName]
,[PostalCode]
,[SalesTerritoryKey]
,[IpAddressLocator]
FROM [AdventureRestore].[dbo].[DimGeography]
WHERE 1=1
and CountryRegionCode = 'AU'
--and city = 'Alexandria'
and PostalCode = 2015

 

More Information:

https://www.red-gate.com/simple-talk/wp-content/uploads/imported/1307-keystrokes.pdf

https://learn.microsoft.com/en-us/sql/ssms/sql-server-management-studio-keyboard-shortcuts?view=sql-server-ver16

https://www.mssqltips.com/sqlservertip/2542/display-line-numbers-in-a-sql-server-management-studio-query-window/

https://sqlstudies.com/2022/07/21/ssms-put-pinned-tabs-in-their-own-row/

Product:
Microsoft Windows

Issue:

How do you rename many files in a folder by changing the first letter to another letter?

Solution:

Go to the command prompt.

Go to the folder where the files are.

Enter the command below to change the first letter of all file names to the letter b:

ren *.* b*.*
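
If you prefer PowerShell, here is a minimal sketch that does the same thing, replacing the first character of every file name in the current folder with the letter b:

# Replace the first character of every file name in the current folder with 'b'
# Note: if two files would end up with the same new name, Rename-Item reports an error for the second one
Get-ChildItem -File | Rename-Item -NewName { 'b' + $_.Name.Substring(1) }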

 

More Information:

https://www.alphr.com/how-to-batch-rename-files-in-windows-10/

https://www.partitionwizard.com/partitionmagic/batch-rename-files.html

 

Product:
Microsoft SQL Azure database

Issue:
How do you create a new database from a BACPAC file in Azure SQL?

The Azure portal only supports creating a single database in Azure SQL Database and only from a .bacpac file stored in Azure Blob storage.

Depending on how your firewall and network endpoints are set up for Azure SQL, you can run into different problems.

https://learn.microsoft.com/en-us/azure/azure-sql/database/database-import?view=azuresql&tabs=azure-powershell 

Error: “The ImportExport operation failed because of invalid storage credentials” can mean that you need to enable “use private link” for the subscription.

In the portal you need to select the backup BACPAC file from your blob storage.
You should select the pricing tier (edition and size) you want, so the Azure SQL database is created with enough capacity.
Enter a name that is easy to tell apart from the other databases.

The ImportExport operation with Request Id 'xxxxx-xxxxx' failed due to 'The server principal "donald" is not able to access the database "sqldatabasename-new" under the current security context.
Cannot open database "sqldatabasename-new" requested by the login. The login failed.

An error like the one above can mean that you do not have access to the database, because the user account is not an admin on the Azure SQL server.

Another common issue is that the blob storage does not allow access from the Azure SQL database.
When doing the import above, you need to monitor both the Private Endpoint Connections on the storage account and the Azure SQL private access, to approve the access requests that the import process creates.

You may also need to add the SQL server as a resource type. But it may still fail if you have not given the SQL server network access to the blob storage.

Solution:

If you know that you have access from your computer to the Azure SQL server, you can instead import the BACPAC from your C:\temp folder with SqlPackage from a PowerShell prompt.

.\SqlPackage.exe /Action:Import /SourceFile:"C:\temp\sqldatabasfilesavedname.bacpac" /TargetConnectionString:"Server=tcp:sqlservername.database.windows.net,1433;Initial Catalog=databasename_test_restore;Authentication=Active Directory Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;User ID=donald.duck@company.com;Password=xxxxxxx" /p:DatabaseEdition=Standard /p:DatabaseServiceObjective=S4

First you need to install the DacFramework.msi from https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-download?view=sql-server-ver16

Then start PowerShell as administrator, go to the folder C:\Program Files\Microsoft SQL Server\160\DAC\bin, and run the command above after you change the names to match yours.

Parameters used in the command:

.\SqlPackage.exe /Action:Import
= starts the process and tells which action to run – in this case import
/SourceFile:"C:\temp\sqldatabasfilesavedname.bacpac"
= the location and file name of the file to restore from
/TargetConnectionString:"Server=tcp:sqlservername.database.windows.net,1433;
= the protocol, name, and port of the SQL server to connect to
Initial Catalog=databasename_test_restore;
= the name of the new database in Azure SQL
Authentication=Active Directory Password;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
= parameters specifying that you will log in with Active Directory
User ID=donald.duck@company.com;
= the name of an administrator account – this account should not exist as a user inside the database
Password=xxxxxxx"
= the password for the account
/p:DatabaseEdition=Standard
= the edition of Azure SQL database you want in the subscription
/p:DatabaseServiceObjective=S4
= the number of DTUs your new database will start with

When the import starts, it prints output like the following – check the server name line to ensure it is correct:

Importing to database 'databasename_test_restore' on server 'tcp:sqlservername.database.windows.net,1433'.
Creating deployment plan
Initializing deployment
Verifying deployment plan
Analyzing deployment plan
Importing package schema and data into database
Updating database
Importing data
Processing Import.
Disabling indexes.

….

When done it should say something like this:

Successfully imported database.
Changes to connection setting default values were incorporated in a recent release. More information is available at https://aka.ms/dacfx-connection
Time elapsed 0:13:32.42

 

If you get the error below, it can be that you are on VPN and need to be at the office instead, to be allowed to connect to the Azure SQL server.

The ImportExport operation with Request Id ‘xxxxx-xxxx’ failed due to ‘An error occurred while communicating with the SQL Server using AdPassword-login: AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access ‘xxxxx-xxxxxx’

If you get the error below, the account you specified already exists inside the BACPAC file. You need to remove that user from the source database and export a new BACPAC.

Error SQL72014: Framework Microsoft SqlClient Data Provider: Msg 15063, Level 16, State 1, Line 1 The login already has an account under a different user name.
Error SQL72045: Script execution error.  The executed script:
CREATE USER xxxxxxx
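
If you need to clean up such a user before exporting a new BACPAC, here is a small T-SQL sketch (the user name is a placeholder; make sure the user does not own any schemas or objects first):

-- Run on the source database (not master) before doing a new export
DROP USER [donald.duck@company.com];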

You may also need to remove logins for the user account used in the import that exist at the server level (in the master database).

To see the server accounts, run the query below on the master database:

SELECT A.name as userName, B.name as login, B.Type_desc, default_database_name, B.* 
FROM sys.sysusers A 
    FULL OUTER JOIN sys.sql_logins B 
       ON A.sid = B.sid 
WHERE islogin = 1 and A.sid is not null

More Information:

https://learn.microsoft.com/sv-se/azure/azure-sql/database/database-import?view=azuresql&tabs=azure-powershell 

SqlPackage /a:import /tcs:"Data Source=<serverName>.database.windows.net;Initial Catalog=<migratedDatabase>;User Id=<userId>;Password=<password>" /sf:AdventureWorks2008R2.bacpac /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P6

SqlPackage /a:Import /sf:testExport.bacpac /tdn:NewDacFX /tsn:apptestserver.database.windows.net /ua:True /tid:"apptest.onmicrosoft.com"

https://support.ptc.com/help/windchill/r13.0.0.0/en/index.html#page/Windchill_Help_Center/WCUpgradeGuide/WCUpgrade_ImportingAzureSQL.html 

https://learn.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-import?view=sql-server-ver16 

 

Product:

Microsoft Azure SQL database

Issue:

How do you create a BACPAC, a file that contains both the data and the table structure (metadata) of a database, which you can import into another database later?

If you are going to import the BACPAC into Azure SQL later, the user account doing the import cannot exist inside the database. Remove that user from the database before you export.

Solution:

From inside SSMS (SQL Server Management Studio), right-click the database and select Tasks – Export Data-tier Application.

You can enter a file name and save the file to your computer. (Downloading a database from Azure can take some time.)

It is faster to save the file to Azure blob storage (if the storage and the database are in the same location); in that case select “Save to Microsoft Azure”.
Select a storage account from your subscription.
Enter a name for the file, or keep the default.
Leave the temporary file setting as is.
Click Next…

Or, to save the file to disk, keep the “Save to local disk” setting.

Click Next and Finish to start downloading the file to your local disk.

A DAC is a logical database management entity that defines all of the SQL Server objects associated with a user's database. A BACPAC includes the database schema as well as the data stored in the database.
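
If you prefer the command line, here is a minimal sketch that exports the database to a local BACPAC file with SqlPackage, using the same tool as in the import post above (server, database, account, and file names are placeholders):

.\SqlPackage.exe /Action:Export /SourceConnectionString:"Server=tcp:sqlservername.database.windows.net,1433;Initial Catalog=databasename;Authentication=Active Directory Password;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;User ID=donald.duck@company.com;Password=xxxxxxx" /TargetFile:"C:\temp\databasename.bacpac"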

More Information:

https://learn.microsoft.com/en-us/azure/azure-sql/database/database-export?view=azuresql 

https://www.sqlshack.com/azure-automation-export-azure-sql-database-to-blob-storage-in-a-bacpac-file/ 

SQLPackage utility to export Azure SQL Databases