Product:

Microsoft Power BI service (in the cloud)

https://learn.microsoft.com/en-us/power-bi/connect-data/service-azure-and-power-bi 

Issue:

When I check my report, it shows old numbers. Why?

Solution:

It can be that you simply need to refresh your web browser. Press F5 to see if that helps.

Data is first updated at the source, which can be, for example, your SQL database server.

Then (if you use one) the Azure dataflow needs to be refreshed.

When that is finished, you can refresh your semantic model. That will give you an updated Power BI report.

But the web-based report can be cached, so also refresh the page in your web browser to make sure you are seeing the latest data.

(If you use DirectQuery in your Power BI reports, there are other implications that may cause issues.)
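
If you want to script this refresh chain instead of clicking through the service, both the dataflow and the semantic model can be refreshed with the Power BI REST API. Below is a minimal Python sketch, assuming you supply your own workspace, dataflow and dataset IDs plus a valid Azure AD access token; a real job would also poll the dataflow refresh for completion before starting the semantic model refresh.

import requests

# Placeholders - substitute your own IDs and Azure AD access token.
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-id>"
DATASET_ID = "<dataset-id>"
BASE = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {
    "Authorization": "Bearer <azure-ad-access-token>",
    "Content-Type": "application/json",
}

# Step 1: refresh the dataflow (after the source itself has been updated).
requests.post(
    f"{BASE}/groups/{GROUP_ID}/dataflows/{DATAFLOW_ID}/refreshes",
    headers=HEADERS,
    json={"notifyOption": "MailOnFailure"},
).raise_for_status()

# Step 2: refresh the semantic model (poll the dataflow first in practice).
requests.post(
    f"{BASE}/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes",
    headers=HEADERS,
    json={"notifyOption": "MailOnFailure"},
).raise_for_status()

# Step 3: check the status of the latest semantic model refresh.
history = requests.get(
    f"{BASE}/groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes?$top=1",
    headers=HEADERS,
).json()
print(history["value"][0]["status"])  # e.g. "Completed"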


More Information:

https://fabricdigital.co.nz/blog/how-to-hard-refresh-your-browser-and-clear-cache 

If the dataflow is standard, then the data is stored in Dataverse. Dataverse is like a database system; it has the concept of tables, views, and so on. Dataverse is a structured data storage option used by standard dataflows.

However, when the dataflow is analytical, the data is stored in Azure Data Lake Storage. A dataflow’s data and metadata are stored in a Common Data Model folder. Since a storage account might have multiple dataflows stored in it, a hierarchy of folders and subfolders has been introduced to help organize the data. Depending on the product the dataflow was created in, the folders and subfolders may represent workspaces (or environments), and then the dataflow’s Common Data Model folder. Inside the Common Data Model folder, both the schema and the data of the dataflow tables are stored. This structure follows the standards defined for Common Data Model.

https://learn.microsoft.com/en-us/power-query/dataflows/what-is-the-cdm-storage-structure-for-analytical-dataflows 

A dataflow stores the data for each table in a subfolder with the table’s name. Data for a table might be split into multiple data partitions, stored in CSV format.
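
To see this folder layout for yourself, you can list the Common Data Model folder with the Azure Data Lake Storage SDK. Below is a minimal Python sketch, assuming a storage account attached to Power BI and its usual powerbi filesystem; the workspace, dataflow, and table names are placeholders.

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Placeholders - substitute your own storage account and folder names.
service = DataLakeServiceClient(
    account_url="https://<storageaccount>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("powerbi")

# Walk the workspace/dataflow hierarchy described above.
for path in fs.get_paths(path="<workspace>/<dataflow>", recursive=True):
    print(path.name)

# Illustrative output:
#   <workspace>/<dataflow>/model.json              (schema and metadata)
#   <workspace>/<dataflow>/<table>/part-00000.csv  (one data partition)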

https://learn.microsoft.com/en-us/power-query/dataflows/configuring-storage-and-compute-options-for-analytical-dataflows 

In Power BI, in addition to the standard dataflow engine, an enhanced compute engine is available for the dataflows created in Power BI Premium workspaces. You can configure this setting in the Power BI admin portal, under the Premium capacity settings. The enhanced compute engine is available in Premium P1 or A3 capacities and above. It reduces the refresh time required for long-running extract, transform, load (ETL) steps over computed tables, such as joins, distinct, filters, and group by, and it also makes it possible to perform DirectQuery over tables from the Power BI semantic model.

https://10senses.com/blog/azure-synapse-vs-azure-data-factory-vs-power-bi-dataflows-what-are-the-similarities-and-differences/

Power Platform dataflows are data transformation services powered by the Power Query engine and hosted in the cloud. These dataflows get data from different data sources and, after applying transformations, store it either in Dataverse or in Azure Data Lake Storage.

Dataflows are created using Power Query Online. Once you create them, the “M” scripts are available for review or for changes, but you do not need to write any code yourself. That makes creating dataflows in Power BI a code-free solution, just like Azure Synapse and ADF.

With Power BI dataflows, you can develop ETL processes that connect to business data from multiple data sources. Data imported by Power BI dataflows is stored in Azure Data Lake Storage (Gen2), which offers massive scalability.
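
One way to review those generated “M” scripts outside Power Query Online is to export the dataflow definition (model.json) through the REST API. Below is a minimal Python sketch with placeholder IDs and token; note that the exact property names inside model.json can vary between dataflow versions.

import requests

# Placeholders - substitute your own workspace/dataflow IDs and token.
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-id>"
HEADERS = {"Authorization": "Bearer <azure-ad-access-token>"}

# Export the dataflow definition (model.json).
url = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/dataflows/{DATAFLOW_ID}"
model = requests.get(url, headers=HEADERS).json()

# The mashup document inside model.json holds the Power Query "M" script.
print(model.get("pbi:mashup", {}).get("document", "<no mashup found>"))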

https://debbiesmspowerbiazureblog.home.blog/2019/12/04/use-data-lake-storage-v2-as-data-flow-storage/ 

Power BI semantic models can store data in a highly compressed in-memory cache for optimized query performance, enabling fast user interactivity. With Premium capacities, large semantic models beyond the default limit can be enabled with the Large semantic model storage format setting. When enabled, semantic model size is limited by the Premium capacity size or the maximum size set by the administrator.

Large semantic models in the service don’t affect the Power BI Desktop model upload size, which is still limited to 10 GB.
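
The large storage format is enabled per semantic model, and besides the service settings it can also be toggled programmatically. Below is a minimal Python sketch against the REST API, assuming a placeholder dataset ID and token; “PremiumFiles” is the large storage format and “Abf” the default.

import requests

# Placeholders - substitute your own dataset ID and Azure AD token.
DATASET_ID = "<dataset-id>"
HEADERS = {
    "Authorization": "Bearer <azure-ad-access-token>",
    "Content-Type": "application/json",
}
URL = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"

# Switch the semantic model to the large storage format.
requests.patch(
    URL, headers=HEADERS, json={"targetStorageMode": "PremiumFiles"}
).raise_for_status()

# Verify the change.
print(requests.get(URL, headers=HEADERS).json().get("targetStorageMode"))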

https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-large-models 

While a semantic model can be built using Power BI Desktop (in a .pbix file), it does not need to contain any visuals. Think of a semantic model as the last stop in the data pipeline before reports and dashboards are built. Once you share a semantic model with other members of the organization, they can build any number of reports and dashboards from that one semantic model.

Semantic models hide the complex technical details behind reports so that both technical and non-technical users can concentrate on analyzing the data and answering business questions. Sharing and reusability are two stand-out features of semantic models.

A Power BI Desktop model is effectively an Analysis Services tabular model.
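
Because a shared semantic model behaves like a tabular model, you can also query it directly instead of going through a report. Below is a minimal Python sketch using the executeQueries REST endpoint; the dataset ID, the token, and the 'Sales' table in the DAX query are placeholders.

import requests

# Placeholders - substitute your own dataset ID, token and DAX query.
DATASET_ID = "<dataset-id>"
HEADERS = {
    "Authorization": "Bearer <azure-ad-access-token>",
    "Content-Type": "application/json",
}

# Run a DAX query against the shared semantic model.
r = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries",
    headers=HEADERS,
    json={"queries": [{"query": "EVALUATE TOPN(5, 'Sales')"}]},
)
r.raise_for_status()

# Rows come back as from any other tabular-model client.
for row in r.json()["results"][0]["tables"][0]["rows"]:
    print(row)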

https://www.datacamp.com/blog/what-are-power-bi-semantic-models 

https://learn.microsoft.com/en-us/power-bi/connect-data/service-datasets-understand 

https://kyligence.io/plp/what-is-a-semantic-layer-in-power-bi/ 

https://www.analyticscreator.com/blog/power-bi-dataset-content-type-renamed-to-semantic-model 

  • Larger model sizes may not be supported by your capacity. Shared capacity can host models up to 1 GB in size, while Premium capacities can host larger models depending on the SKU. For further information, read the Power BI Premium support for large semantic models article. (Semantic models were previously known as datasets.)
  • Smaller model sizes reduce contention for capacity resources, in particular memory. This allows more models to be loaded concurrently for longer periods of time, resulting in lower eviction rates.
  • Smaller models achieve faster data refresh, resulting in lower latency reporting, higher semantic model refresh throughput, and less pressure on source system and capacity resources.
  • Smaller table row counts can result in faster calculation evaluations, which can deliver better overall query performance.

https://azure.microsoft.com/en-us/blog/data-models-within-azure-analysis-services-and-power-bi/ 

https://learn.microsoft.com/en-us/power-bi/guidance/import-modeling-data-reduction 

Power BI’s Data Compression: Large Data Imports in Power BI

https://community.fabric.microsoft.com/t5/Service/Maximum-Data-that-be-Consumed-by-Power-BI-from-Azure-Data-Lake/m-p/2031983 

https://learn.microsoft.com/en-us/power-bi/connect-data/service-live-connect-dq-datasets

Product:
Planning Analytics 2.0.9.19 TM1
Microsoft Windows 2019 Server

Issue:

When users browse to TM1Web they get an error – randomly, in different web browsers, e.g. Chrome and Edge.

Error message:

This site can’t provide a secure connection
tm1webservername.domain.com sent an invalid response.
Try running Windows Network Diagnostics.
ERR_SSL_PROTOCOL_ERROR

Solution:

Check the URL you use – it needs to start with http:// if you do not have a certificate on your TM1Web service.

If you accidentally enter:

https://servername.domain.com:9511/tm1web/

you get the above error.

You need to enter:

http://servername.domain.com:9511/tm1web/
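
To check from a script which protocol the TM1Web port actually speaks, you can try HTTPS first and fall back to HTTP. Below is a minimal Python sketch with a placeholder server name; an SSLError on the https:// attempt corresponds to the ERR_SSL_PROTOCOL_ERROR shown above.

import requests

# Placeholder - substitute your own TM1Web server name.
HOST = "servername.domain.com"
PORT = 9511

for scheme in ("https", "http"):
    url = f"{scheme}://{HOST}:{PORT}/tm1web/"
    try:
        # verify=False only because we are probing the protocol here.
        r = requests.get(url, timeout=5, verify=False)
        print(f"{url} -> HTTP {r.status_code}")
        break
    except requests.exceptions.SSLError:
        # The port is not speaking TLS - the same cause as the browser error.
        print(f"{url} -> SSL protocol error, trying plain http://")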


More Information:

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=itw-configure-ssl-planning-analytics-tm1-webspreadsheet-services-existing-keystore 

https://www.ibm.com/support/pages/planning-analytics-ssl-configuration-tm1web-or-any-web-tier-components-does-not-work-expected

https://www.ibm.com/docs/en/SSD29G_2.0.0/com.ibm.swg.ba.cognos.tm1_ug.2.0.0.doc/tm1_ug.pdf