Product:
FileBeat 8.19.3

Microsoft Windows 2022 server

Issue:

I have installed the Filebeat service on Windows to collect log files to Elastic, but it does not read any files. And when I try to stop the service, it hangs.

After you stop the Filebeat service with Task Manager, you need to delete the lock file in the folder C:\ProgramData\filebeat\ to make Filebeat read the yml file at the next start of the service.

Solution:

Check the filebeat.yml file. The YAML format is sensitive to spaces and indentation.

In this case there was a row:
ignore_older: '7d'
that made the filebeat service stop.

ignore_older does not accept days as a unit (only values like seconds, minutes, and hours), so you need to enter it like this:
ignore_older: '168h'

The ignore_older: '168h' setting checks the modification timestamp of each file, and skips files that were last modified more than 7 days (168 hours) ago.
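As a small illustration of the unit conversion, here is a hypothetical Python helper (not part of Filebeat) that translates a day-based value such as '7d' into the hour-based form that Filebeat accepts:

```python
import re

def to_filebeat_duration(value: str) -> str:
    """Convert a day-based duration such as '7d' into the hour-based
    form that Filebeat's ignore_older accepts (e.g. '168h').
    Values in other units are returned unchanged."""
    match = re.fullmatch(r"(\d+)d", value.strip())
    if match:
        return f"{int(match.group(1)) * 24}h"
    return value.strip()

print(to_filebeat_duration("7d"))   # prints 168h
print(to_filebeat_duration("30m"))  # prints 30m
```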

The filebeat.yml file is in the folder C:\Program Files\Filebeat on Windows.

Below is an example of a filebeat.yml file for use with TM1 log files. Note that the indentation (spaces at the beginning of each line) is required to get it to work.

# ============================== Filebeat inputs ===============================
filebeat.inputs:
- type: filestream
  id: tm1server
  enabled: true
  paths:
    - D:/TM1 folder/Logs/tm1server.log
  fields_under_root: true
  fields:
    event:
      dataset: audit.plain

- type: filestream
  id: tm1s2
  enabled: true
  ignore_older: '168h'
  paths:
    - D:/TM1 folder/Logs/tm1s2*.log
  exclude_lines:
    - '^#'
  include_lines:
    - 'AD'
  fields_under_root: true
  fields:
    event:
      dataset: audit.plain

# ---------------------- beats state ----------------------
- type: filestream
  id: beats-logs
  enabled: true
  paths:
    - C:/ProgramData/filebeat/logs/filebeat*.ndjson
  include_lines:
    - 'Non-zero metrics in the last 30s'
  fields_under_root: true
  fields:
    event:
      dataset: beats.state
  processors:
    - dissect:
        tokenizer: '%{}"@timestamp":"%{event.start}"'
        field: message
        target_prefix: ""
        ignore_failure: true

setup.template.settings:
  index.number_of_shards: 1

fields:
  system:
    env: prod
    id: SystemTM1
fields_under_root: true

max_procs: 1

processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~

output.logstash:
  hosts: ["elasticservername.domain.com:9999"]
  ssl.enabled: true
  ttl: 5m
  pipelining: 0
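Filebeat itself can validate the file with filebeat test config -c filebeat.yml before you start the service. If you want a quick sanity check of the indentation first, here is a minimal Python sketch (stdlib only, not a real YAML parser) that flags the two most common paste mistakes: tab characters and odd indent widths. The odd-width check is a heuristic, since odd indentation is legal YAML but usually signals a paste error in a file that otherwise uses two-space steps.

```python
def lint_yaml_indent(text: str) -> list:
    """Flag tab characters (YAML forbids tabs for indentation) and
    indentation that is not a multiple of two spaces (legal YAML,
    but a common sign of a paste error in a two-space-indented file)."""
    problems = []
    for number, line in enumerate(text.splitlines(), start=1):
        leading = line[: len(line) - len(line.lstrip())]
        if "\t" in leading:
            problems.append(f"line {number}: tab used for indentation")
        elif line.strip() and len(leading) % 2 != 0:
            problems.append(f"line {number}: indent of {len(leading)} spaces")
    return problems

sample = "filebeat.inputs:\n- type: filestream\n   id: bad-indent\n"
print(lint_yaml_indent(sample))  # prints ['line 3: indent of 3 spaces']
```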



More Information:

Filebeat sends log files to Logstash or directly to Elasticsearch.

## Getting Started

To get started with Filebeat, you need to set up Elasticsearch on
your localhost first. After that, start Filebeat with:

./filebeat -c filebeat.yml -e

This will start Filebeat and send the data to your Elasticsearch
instance. To load the dashboards for Filebeat into Kibana, run:

./filebeat setup -e

For further steps, see the installation and configuration guide: https://www.elastic.co/guide/en/beats/filebeat/8.19/filebeat-installation-configuration.html

## Documentation

Visit https://www.elastic.co/guide/en/beats/filebeat/8.19/index.html for the full Filebeat documentation.

## Release notes

https://www.elastic.co/guide/en/beats/libbeat/8.19/release-notes-8.19.3.html

https://www.elastic.co/beats/filebeat

https://www.elastic.co/downloads/beats/filebeat

https://github.com/elastic/beats

Product:

Microsoft Power Bi dataflow

Issue:

You get an error on all dataflows: "Error: An authentication problem is preventing the dataflow from being refreshed. Please have the owner of the dataflow sign out of Power BI and sign back in and try again."

or an error message like {"code":"DMTS_OAuthTokenRefreshFailedError","details":[{"code":"DM_ErrorDetailNameCode_UnderlyingErrorMessage","detail":{"type":1,"value":"AADSTS135010: UserPrincipal doesn't have the key ID configured} from the dataset that uses the dataflow.

This can happen if you change your Windows PIN code login from Microsoft, and use the same account to run the dataflow.

Solution:

Go to the dataflow in your workspace in https://app.powerbi.com/

Go to settings

Click on data source credentials

Click on edit credentials

If the value is correct, click Sign in without making any change.

Confirm the login prompt you will get from your company.

Go to the workspace.

Try to refresh the dataflow again.

 

More information:

This error indicates that the cached credentials for the dataflow owner have expired, changed, or are no longer authorized to access the data source, causing a Power BI refresh failure. To resolve it, the owner must sign out of the Power BI service, clear browser cache, sign back in, and re-enter credentials in the dataflow settings.
Actionable Steps to Resolve:
  • Re-authenticate Dataflow Credentials: The dataflow owner should go to the Power BI Service, locate the dataflow, go to Settings > Data source credentials, and click “Edit credentials” to re-authenticate.
  • Sign Out/In: Sign out of app.powerbi.com completely, clear your browser cache/cookies, and sign back in to refresh token authentication.
  • Check Data Source: If using a Gateway, ensure the credentials in the gateway settings are valid.
  • Verify Permissions: Ensure the user who created the dataflow still has permissions to the underlying data sources (SQL, SharePoint, etc.).

 

https://www.fourmoo.com/2019/02/13/why-is-my-powerbi-dataflow-refresh-failing/ 

Product:
Cognos Analytics
Microsoft Windows 2019 server

Issue:

Does the cert expire in 2026 on old Cognos installations?

Solution:

Upgrade to a later version of Cognos. The new versions of Cognos Analytics and Planning Analytics ship updated cert files from IBM.

“IBM will provide updates SSL certificates as part of the next Planning Analytics Local 2.0.9 release. The best course of action is to configure your own SSL certificates for use with any version of TM1. You really do not need to wait until IBM provides updates certificates.”

If you have an older version of the TM1 client software and CA11 services, you may need to replace the certificates.

 

You can check the date of the cert in Cognos Analytics (CA11) with the IKEYMAN.EXE program: open the camkeystore and check the end date on your certificates.
Some certificate end dates are updated when you do a save from inside the Cognos Configuration program.

Open Cognos Configuration (as administrator), right-click the root element –> "Test", then "Save" (re-save), and restart the service. Besides updating the cryptographic keys, you can also follow the details shown during this process.

To check the Tm1 cert, copy C:\Program Files\ibm\cognos\tm1_64\bin64\ssl\ibmtm1.arm file to a c:\temp folder.
Rename the file to ibmtm1.crt. Right click on the ibmtm1.crt file and select OPEN.

There you see the certificate end date for your version of Tm1.

This document describes the process to renew and import updated certificates into Cognos Analytics, where Third-Party certificate authority has been enabled.

Objective

The purpose of this document is to provide a straightforward process for obtaining and importing updated certificates into Cognos Analytics, in order to minimize downtime and reduce the need for a support case.
Depending on the certificate authority used by your organization, certificate renewal can typically be managed in one of two ways:
  1. The certificate authority allows you to resubmit your original Certificate Signing Request (CSR) to obtain a renewed certificate.
  2. The certificate authority requires that you present a new Certificate Signing Request (CSR) to obtain a renewed certificate.
Both of these methods can be achieved without repeating the original configuration steps to enable Third-Party certificate authority, and, with appropriate planning and backups in place, can be managed within a planned maintenance schedule.

Environment

Cognos Analytics 11.1.x and 11.2.x
All Operating System platforms
Third-Party certificate authority enabled

Steps

Part 1 – Back up the Cognos Analytics configuration.
  1. Stop Cognos Analytics
  2. Open Cognos Configuration
  3. Export the Configuration to plain text by clicking File –> Export As (recommended file name: decrypt.xml)
  4. Close Cognos Configuration
  5. Take a backup of the ‘configuration’ folder (recommended name: configuration-existingcerts)
Part 2 – Generate a Certificate Signing Request (CSR) if required.
  1. Open iKeyman (located in <COGNOS_HOME>\ibm-jre\jre\bin)
  2. Click Key Database File –> Open
  3. Navigate to <COGNOS_HOME>\configuration\certs
  4. Choose the CAMKeystore file and click Open
  5. Set the Key Database Type to PKCS12S2 and click OK
  6. Enter the keystore password (default is: NoPassWordSet) and click OK
  7. Click the current ‘encryption’ certificate, and click “Re-create Request”
  8. Provide a filename, and click OK
  9. Exit iKeyman
  10. Submit the generated CSR to your certificate authority for signing, and obtain your new certificate.
NOTE: If you need to have Cognos Analytics running while the certificate request is being processed, you can achieve this by renaming the ‘configuration’ folder to ‘configuration-csk’ and copying the ‘configuration-existingcerts’ folder back to ‘configuration’
Part 3 – Receiving the updated certificate
  1. Stop Cognos Analytics
  2. Ensure that the correct ‘configuration’ folder is in place (if you used the workaround in Part 2 of this process, ensure that the ‘configuration-csk’ folder is renamed to ‘configuration’)
  3. Open iKeyman
  4. Open the <COGNOS_INSTALL>\configuration\certs\CAMKeystore file
  5. Click ‘Receive’
  6. Locate the newly received certificate, select, and click OK. You should receive a “Validation Successful” message
  7. Close iKeyman
  8. Open Cognos Configuration
  9. Save the configuration
  10. Start Cognos Analytics

 

 

More Information:

https://community.ibm.com/community/user/question/use-our-own-certificats 

For Planning Analytics, these are the files used for certificates:

\bin64\ssl\ibmtm1.arm is the default certificate, and it does not expire until 2035. ibmtm1.arm has been in use for a few years now; even in 2020 its expiration was 2035. The “applixca” files in that folder appear to remain only for historical reasons.

Inside of the \bin64\ssl\ibmtm1.kdb keystore, ibmtm1.arm has already been imported as “ibmtm1_server” and has been set as the default personal certificate. The “keystore” is what the components of PA will use to access the stored certificates and ibmtm1.sth is an encrypted password that PA uses to have access into the ibmtm1.kdb keystore while it is running.

New file \bin64\config.tm1admsrv.json  replaces the Cognos Configuration node that accepted TM1 Admin server settings. It defaults to using the \bin64\ssl\ibmtm1.kdb keystore and referring to the server certificate label ibmtm1_server that represents the imported ibmtm1.arm certificate IBM provided. A fresh install should be all set using ibmtm1.arm inside of ibmtm1.kdb.

You may have a \bin\tm1api.config or \bin64\tm1api.config where Architect and Perspectives have been installed on the users’ machines. This text file just tells Architect/Perspectives where to find the keystore if it has been moved from \bin64\ssl\ibmtm1.kdb, is at a networked shared location, or has been renamed. You likely don’t have this tm1api.config file if you never used custom certificates, but it is mentioned here just in case. Most of the old Architect and Perspectives SSL options are deprecated.

If you want to see the inside of the \bin64\ssl\ibmtm1.arm cert yourself, make a copy of it and rename the copy to ibmtm1.crt. Then right-click on it and select “Open” (not “Install”). Windows will show you the details, including the 2035 end date.

Custom certificates for PA are not impossible, but not simple to implement.

 

NEW for 2.0.9.21/2.1.8/2.1.9 default \bin64\config.tm1admsrv.json file:

{
    "tm1AdminNonSSLPortNumber": 5495,
    "tm1AdminSSLPortNumber": 5498,
    "tm1AdminHTTPPortNumber": 5895,
    "tm1AdminHTTPSPortNumber": 5898,
    "tm1AdminSupportNonSSLClients": false,
    "tm1AdminKeyFile": "./ssl/ibmtm1.kdb",
    "tm1AdminKeyStashFile": "./ssl/ibmtm1.sth",
    "tm1AdminKeyLabel": "ibmtm1_server",
    "tm1AdminTLSCipherList": [],
    "tm1AdminFIPSOperationMode": 2,
    "tm1AdminSupportPreTLSv12Clients": false,
    "tm1AdminNIST_SP800_131A_MODE": false,
    "tm1AdminIPVersion": "IPv4",
    "tm1AdminActivityInterval": 10,
    "tm1AdminInactivityTimeout": 10,
    "tm1AdminRESTAPIToken": ""
}
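One thing to watch for when editing this file: JSON requires straight ASCII quotes, so curly quotes picked up from a web page or a Word document will break parsing. A quick Python check on a trimmed, hypothetical two-key fragment shows the difference:

```python
import json

# A trimmed fragment of config.tm1admsrv.json with straight ASCII quotes.
good = '{"tm1AdminNonSSLPortNumber": 5495, "tm1AdminSupportNonSSLClients": false}'
# The same fragment with its first quote replaced by a curly quote.
bad = good.replace('"', '\u201c', 1)

settings = json.loads(good)
print(settings["tm1AdminNonSSLPortNumber"])  # prints 5495

try:
    json.loads(bad)
except json.JSONDecodeError as error:
    print("curly quotes break parsing:", error.msg)
```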

From the online documentation:

applixca.der
The original default certificate in DER format used for Java™ certificate stores.
applixca.pem
The original root authority certificate.
ibmtm1.arm
The default certificate file.
ibmtm1.crl
The certificate revocation list.
ibmtm1.kdb
The key database file, which contains the server certificate and trusted certificate authorities.
ibmtm1.rdb
The requested key pair and the certificate request data.
ibmtm1.sth
The key store, which contains passwords to the key database file.
tm1ca_v2.der
The updated default certificate.
tm1ca_v2.pem
The updated default root authority certificate.
tm1store
The Java certificate store containing the public root authority certificate.

https://www.ibm.com/support/pages/how-renew-third-party-ca-certificates-cognos-analytics 

https://community.ibm.com/community/user/discussion/what-happens-when-configurationscerts-expire 

https://community.ibm.com/community/user/groups/community-home/digestviewer/viewthread?GroupId=3067&MessageKey=5a98b967-835b-482b-998c-20bba9d68119&CommunityKey=8fde0600-e22b-4178-acf5-bf4eda43146b&tab=digestviewer&hlmlt=QT 

https://www.tm1forum.com/viewtopic.php?t=16287

Product:
Planning Analytics Workspace 2.1.14
Microsoft Windows 2022 server

Issue:

How do I easily copy one book in PAW to another TM1 instance?

Solution:

Copy the JSON code between the TM1 applications.

Go to the book and press CTRL+Q+’ (on a Swedish keyboard, that is the * key to the left of the Enter key).

This opens the JSON code that describes the book. It can be copied to Notepad for editing, or pasted directly into another PAW book.

 

More Information:

https://www.acgi.com/blog/copying-tabs-across-books-in-ibm-planning-analytics-workspace

https://community.ibm.com/community/user/blogs/george-tonkin/2023/02/18/copying-a-tab-from-one-paw-book-to-another

https://www.ibm.com/docs/da/planning-analytics/2.0.0?topic=books-create-book

Product:

Power BI Dataflow

Issue:

After the user account (owner) is changed, the dataflow is not updated. (This only happens when you use a Power BI gateway to access an on-prem SQL database.)

 

Solution:

In the dataflow settings you have changed the owner to another person (by pressing the Take Over button).

On the dataflow tab, go to EDIT.

Click EDIT on one table.

You get an error message that it cannot contact the SQL server.

Click on the OPTIONS icon.

Click on Project and data load, and change the data gateway to one in the dropdown list.

Click OK.

Now the table should be refreshed automatically, and you can save and close the dataflow.

 

More Information:

https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-configure-consume

https://learn.microsoft.com/en-us/power-query/dataflows/data-sources?tabs=power-bi-service

https://data-marc.com/2022/08/02/what-you-must-know-when-building-power-bi-dataflows-routing-over-the-on-premises-data-gateway/ 

Product:
Microsoft Excel 365
Planning Analytics TM1 perspective

Issue:

How do I unhide rows in Excel? I need to find the data for the TM1RPTVIEW formulas.

Solution:

To unhide rows in Excel through the Ribbon:

  1. Open your spreadsheet in Excel.
  2. Click on the triangular Select All button in the upper left corner in your spreadsheet, or press CTRL + A.
  3. In the Home tab, go to the Format section > Hide & Unhide > Unhide Rows.

  • A row can look hidden if its height is set to zero, even if you didn’t hide it manually. In that case, right-clicking and choosing Unhide won’t work.

    Here’s how to fix this:

    1. Select the rows around the missing one.

    2. Go to Home > Format > Row Height (or right-click the row number and choose Row Height).

    3. Set the height to 15 (Excel’s default).

    4. Click OK, and the row should appear again.

  • If you have the Freeze Panes option on, try disabling it to see if the static state of panes is preventing the rows from completely showing up.
  • Filters can hide rows based on criteria, sometimes unintentionally. To remove filters:
    1. Go to the Data tab on the Ribbon.
    2. Click on “Filter” to toggle off any active filters, revealing any rows hidden by these filters.
    3. Removing filters can instantly bring back a significant portion of your “missing” data.

 

More Information:

https://xodo.com/blog/how-to-unhide-all-rows-in-excel 

https://www.xelplus.com/hide-unhide-excel/

https://www.simplesheets.co/blog/how-to-unhide-all-rows-in-excel

https://blog.octanesolutions.com.au/dynamizing-dynamic-reports-hacks

How to Create a List Report using a TM1 Dynamic Report

https://www.ibm.com/docs/sl/planning-analytics/2.0.0?topic=wf-tm1rptrow

https://www.ibm.com/docs/en/planning-analytics/2.0.0?topic=functions-tm1rptview

 

TM1RPTVIEW defines the view displayed in a Dynamic Report.

Syntax

TM1RPTVIEW(ViewID,ZeroSuppression,TM1RPTTITLE,...) 

All of the following arguments are required:

ViewID
A name for the view using the format server:cube:unique_id.

ZeroSuppression
A Boolean flag to turn the zero suppression property for the view on or off. 1 = on, 0 = off.

TM1RPTTITLE
For each title dimension in the Dynamic Report, include a reference to a TM1RPTTITLE function as an argument to TM1RPTVIEW.

FormatRange
The formatting range for the Dynamic Report. You can use a cell reference or a named range for this parameter. When you create a Dynamic Report, a named range called TM1RPTFMTRNG is created to include all formatting range cells. You can use this named range as an argument.

IDColumn
The column in the Dynamic Report that contains format IDs. You can use a cell reference or a named range for this parameter. When you create a Dynamic Report, a named range called TM1RPTFMTIDCOL is created to include all formatting range cells. You can use this named range as an argument.

 

TM1RptRow sets the Active Form control row definition. The control row definition governs the behavior of all rows in the Active Form. This worksheet function is used to create Active Forms.

This worksheet function is valid in worksheets only.

Syntax

TM1RptRow(ReportView, Dimension, Subset, SubsetElements, Alias, ExpandAbove, MDXStatement, Indentations, ConsolidationDrilling)

Arguments:

ReportView
A reference to a cell that contains a TM1RptView formula.

Dimension
A dimension, specified using the format tm1_server_name:dimension_name.

Subset
A named subset. If this argument is empty, all elements of the dimension will be used.

SubsetElements
A cell range reference that specifies a list of elements to constitute a subset. When this argument is supplied, the named subset specified by the Subset argument is ignored. If this argument is empty, the elements from the subset specified by the Subset argument are used.

Alias
A string that defines the alias used for the subset. When this argument is supplied, it overrides the default alias property defined by the subset specified by the Subset argument. If this argument is empty, the alias from the subset specified by the Subset argument is used.

ExpandAbove
A Boolean flag to turn the subset Expand Above property on or off. When this argument is supplied, it overrides the default Expand Above property defined by the subset specified by the Subset argument. If the argument value is 1, consolidated elements expand upward when drilling; if 0, they expand downward. If this argument is empty, the Expand Above property from the subset specified by the Subset argument is used.

MDXStatement
An MDX statement that applies to the subset specified by the Subset argument. When this argument is supplied, it overrides the default MDX filter defined by the subset specified by the Subset argument. If this argument is empty or omitted, the elements from the subset specified by the Subset argument are used.

Indentations
An integer value that indicates how many indentations are applied to each level when drilling down on a consolidated element. If the argument value is 0, no auto-indentation is performed. This is an optional argument; when the value is missing, one indentation is applied to each level as you drill down on a consolidated element.

ConsolidationDrilling
A Boolean flag to turn drilling on consolidated elements on or off. When this argument value is 1, users can drill down on consolidated elements in the Active Form; when 0, they cannot. This is an optional argument; when it is missing, the default behavior is to allow drilling on consolidated elements.

Product:

Planning Analytics 2.1.14

Microsoft Windows 2022 server

Issue:

The CUB file size does not shrink when we remove a dimension element (like a year) from a cube. Why?

Solution:

You need to update a data value in the cube for TM1 to register the data change; then, after you run SaveDataAll, the .cub file is smaller.

 

More Information:

https://www.ibm.com/docs/en/planning-analytics/2.1.0?topic=functions-savedataall

You have to do a data (not metadata) change that “flags” to TM1 whether it needs to re-save a particular cube. If there has been no change flagged since the last data save, then the cube will not be saved even if you do a Save All. This is a time saving feature, since it would be pointless for TM1 to re-save each and every cube regardless of whether it had changed. After all, the save process is often the primary performance bottleneck. You don’t actually lose much by retaining the data in the .cub file; all that will happen upon startup is that the data that no longer has a valid element will fail to load.

The deletion of data via the deletion of elements is not (flagged as) an actual data change. (Were it otherwise, after each metadata change the server would need to somehow go through and see whether any populated cells had been lost, which would be a ridiculously time consuming task.) Consequently the cube would not be flagged for a save after you made your change, and thus it retained the same size. If you checked the time and date of the .cub file it would have been from the save prior to you deleting the elements.

Remember that the RAM TM1 reports to the OS as being “in use” is just the RAM it has reserved for use. TM1 doesn’t tend to give back RAM that it has allocated (again, a bias toward speed and not efficiency), so you’re unlikely to see a drop in reserved RAM, even if it is no longer used.

 

https://exploringtm1.com/tm1-file-extensions-understand-the-files-in-your-tm1-model/

https://blog.octanesolutions.com.au/tm1-object-extensions

https://www.ibm.com/docs/en/planning-analytics/2.1.0?topic=SSD29G_2.1.0/com.ibm.swg.ba.cognos.tm1_dg_dvlpr.2.0.0.doc/c_tm1_dev_ob_app_on_svr_n5730.htm

https://exploringtm1.com/tm1-health-how-to-keep-your-tm1-server-optimized/ 

https://www.linkedin.com/pulse/limits-tm1-christoph-hein

Product:

Planning Analytics Workspace

Microsoft Windows 2022 server

Issue:

How to check if a port is used by Docker or another process.

Solution:

On the PAW server, log in and start a CMD prompt.

Enter the command below to check if port 53 (the default port for Docker DNS) is in use:

 netstat -ano | findstr ":53"

A 'LISTENING' status means the port is in use, and the PID (Process ID) column shows which process owns it. You can then identify the service with: tasklist /fi "PID eq [PID_number]"
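The same check can be scripted. As a generic cross-platform sketch (an illustration, not anything PAW-specific), Python can test whether something is already bound to a TCP port by simply trying to bind it:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if another process is already bound to (host, port),
    detected by attempting to bind a TCP socket there ourselves."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        try:
            sock.bind((host, port))
        except PermissionError:
            raise  # ports below 1024 may need admin rights to test
        except OSError:
            return True  # typically "address already in use"
    return False

# Demo: occupy an ephemeral port, then test it.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))  # port 0 = let the OS pick a free port
listener.listen(1)
busy_port = listener.getsockname()[1]
print(port_in_use(busy_port))  # prints True
listener.close()
```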

 

Ensure that all ports listed in the paw.ps1 file are open in the local firewall on the PAW server to ensure smooth operation.

More information:

https://learn.microsoft.com/en-us/windows-server/networking/dns/network-ports

https://simpledns.plus/kb/47-error-message-could-not-start-dns-service-on-ip-address-port-53-udp-port-is-used

https://kb.synology.com/en-global/DSM/tutorial/Whether_TCP_port_is_open_or_closed

https://www.ninjaone.com/blog/how-to-find-dns-servers-used-in-windows-11/ 

Product:

Planning Analytics 2.1.14 workspace

Issue:

Why is there a 404 error in the log file D:\RUNNING\PAW2114\log\wa-proxy\WAproxy4.log? (The file name is incremented with a number for each new log file.)

You find this on a row:
statusCode: 404,

 

Suggested solution:

If you check a few lines before the statusCode, you see lines like this:

2025-12-08T13:10:12.023Z [3] – info: Incoming request: [xxx.xxx.xxx.xxx] GET /rolemgmt/v1/profiles/@me/avatar?t=17651941997 [37e3346000009af]
2025-12-08T13:10:12.029Z [3] – warn: Proxy: "GET http://user-admin:3333/rolemgmt/v1/profiles/@me/avatar?t=17651941997" : 404 (3 ms) [37e3346000009af]

This means the user accessing the PAW environment has not uploaded a profile picture, and the error will keep appearing in WAproxy.log as long as no picture is set for that user. The solution is for the end user to add a picture to their login profile in PAW:

Click on your name in the top right corner and select “Profile and settings” from the dropdown menu.

Click on profile photo.

Click on Upload photo and select a photo of yourself (saved on the computer in PNG format),
or get a fun avatar from https://unsplash.com/ or https://www.pexels.com/.

Click Done.

Now you have a small picture in PAW, and there should be fewer errors in the log file.

 

When you open a book in PAW, the URL contains the ID of the book, like http://servername.domain.com/?perspective=dashboard&id=237dd67d-f35-40a5-95a3-5e76c6e8289

This ID can also be found in the WAproxy.log file under your PAW installation’s log folder. On the lines with this book ID, you can see who has opened it.

 

If you open the WAproxy.log file in Notepad++ and search the file for all occurrences of the text c/v2/content/dashboard/,

you will get a list of all books that have been opened and by whom; the login ID is last on the line.
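Extracting this kind of information can also be scripted. A hedged Python sketch follows; the exact WAproxy.log layout may differ between PAW versions, so the hand-made sample lines and the regular expression below are only shaped after the log excerpt shown earlier:

```python
import re
from collections import Counter

# Two hand-made lines shaped like the WAproxy.log proxy entries above.
LOG = (
    '2025-12-08T13:10:12.029Z [3] - warn: Proxy: '
    '"GET http://user-admin:3333/rolemgmt/v1/profiles/@me/avatar?t=17651941997" : 404 (3 ms) [37e3346000009af]\n'
    '2025-12-08T13:11:02.100Z [3] - info: Proxy: '
    '"GET http://paw/c/v2/content/dashboard/237dd67d" : 200 (5 ms) [37e3346000009b0]\n'
)

# Capture the method, URL, and three-digit HTTP status of each proxied call.
PATTERN = re.compile(r'"(?P<method>[A-Z]+) (?P<url>\S+)" : (?P<status>\d{3})')

status_counts = Counter(m.group("status") for m in PATTERN.finditer(LOG))
print(status_counts)  # one 404 and one 200 in this sample
```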

 

More Information:

404  – page or data not found

401 – unauthenticated, you have not logged in to access this page or data

200 – success – page or data loaded fine

304 – you already have the page cached in your web browser, no need to send it again

500 – A generic error message, given when an unexpected condition was encountered. Read the WAproxy file for more information and search IBM web site for an explanation.

https://www.siteground.com/kb/304-status-code/ 

https://en.wikipedia.org/wiki/List_of_HTTP_status_codes

https://www.canva.com/features/photo-effects/?utm_medium=partner&utm_source=pixabay&utm_campaign=canva_attribution 

 

Product:

Planning Analytics Workspace

Microsoft Windows 2022 server

Issue:

How to record web activity to send to support, to troubleshoot issues in the web application.

Solution:

Use a tool to capture the traffic between the web browser and the server, for example https://www.telerik.com/fiddler/fiddler-classic,

or a network analyzer like https://www.wireshark.org/download.html:

  • Wireshark: Open Wireshark, choose the network interface you want to monitor, and start the capture. You can use capture filters to narrow down the traffic you see (e.g., by IP address or port). Stop the capture and save the results in a .pcap file for later analysis.
  • Fiddler: This tool is excellent for capturing and inspecting both HTTP and HTTPS traffic from any browser or application. It allows you to filter, save, and replay traffic sessions.
  • tcpdump: A command-line utility that captures raw packets. You can specify an interface (-i) and write the output to a file (-w). For example, tcpdump -i wlan0 -w capture.pcap would capture traffic from the wlan0 interface to a file named capture.pcap.
You can also use the network tool built into the web browser; it saves the collected data to a HAR file.
HAR (“HTTP Archive”) files are a JSON-formatted archive format for logging a web browser’s interaction with a site.
  • Typically, they are useful in troubleshooting performance problems (and errors) when using web-based software (such as PAW).
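Since a HAR file is plain JSON, failing requests can be pulled out with a few lines of Python. This sketch uses a minimal hand-made fragment with made-up URLs; real HAR files follow the same log.entries[].request/response shape but carry many more fields:

```python
import json

# Minimal hand-made HAR fragment; real files include timings, headers, etc.
HAR = json.loads("""
{
  "log": {
    "entries": [
      {"request": {"url": "http://paw.example.com/ok"},
       "response": {"status": 200}},
      {"request": {"url": "http://paw.example.com/missing"},
       "response": {"status": 404}}
    ]
  }
}
""")

# Keep only the entries whose HTTP status indicates an error (4xx/5xx).
failed = [
    (entry["request"]["url"], entry["response"]["status"])
    for entry in HAR["log"]["entries"]
    if entry["response"]["status"] >= 400
]
print(failed)  # [('http://paw.example.com/missing', 404)]
```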
The steps vary, depending on the web browser that the user is using.
Chrome
1.    Launch Google Chrome.
2.    In Chrome, go to the page where you are experiencing trouble
3.    Select the Chrome menu (⋮) at the top-right of your browser window, then select More Tools > Developer Tools.
4.    The Developer Tools will open as a docked panel at the side or bottom of Chrome. Click on the Network tab.
5.    Tick/select the option Preserve log.
6.    The recording should autostart, shown by a red circle at the top left of the Network tab.
If not, click the black circle to start recording activity in your browser.
7.    Refresh the web page that you are on
– Perform the task which is causing the issue
– The goal is to reproduce the problem you’ve been experiencing while Google Chrome is recording activity.
8.    Once you have reproduced the problem (while recording), right-click within the Network tab and click Save as HAR with Content to save a copy of the activity that you recorded:
9.    Save the HAR file somewhere convenient.
10.  Now click the Console tab and right-click anywhere in the console log. Select the popup option “Save as…” and name the log file something sensible:
11.  Close the developer panel.

12. Typically, the customer will now upload the two files (especially the HAR file) into their support case.

Firefox
1.    Launch Firefox.
2.    In Firefox, go to the page where you are experiencing trouble.
3.    Select the Firefox menu (Three horizontal parallel lines) at the top-right of your browser window, then select Developer > Network.
4.    The Developer Network Tools will open as a docked panel at the side or bottom of Firefox. Click on the Network tab.

– Before starting the capture, be sure to enable the setting: Enable persistent logs in the Toolbox Options > Common Preferences (click on gear icon on the Toolbox toolbar to open the Toolbox Options pane)
– Click on the Network tab after setting the option.
5.    The recording will autostart once you start performing actions in the browser.
6. Refresh the web page that you are on

– Perform the task which is causing the issue
– The goal is to reproduce the problem you’ve been experiencing while Firefox is recording activity.
7.    Once you have reproduced the issue and you see that all of the actions have been generated in the Developer Network Panel (this should just take a few seconds), right-click anywhere under the “File” column and click on “Save All as Har
8.    Save the HAR file somewhere convenient and close the developer panel.
9.    Typically, the customer will now upload the HAR file into their support case.

Internet Explorer (IE)
We shall use Microsoft F12 Developer Tools and the Network Tool to capture the browser headers. Then export the captured traffic as HAR file (or XML if using older version of IE).
1.    Launch Microsoft Internet Explorer.
2.    Press the F12 key on your keyboard. This should open the Developer Tools panel in IE. If not, find it on the browser menu: F12 Developer Tools
3.    In the Developer Tools panel, click the Network panel and then deselect the Clear entries on navigate option. (on by default).
4.    Click the Network panel/button and then the Start Capturing button (or press Ctrl + E). Note: the icon looks like a green triangle
5.    Click the IE Refresh button. The goal is to reproduce the problem you’ve been experiencing while IE is recording activity
6.    The Developer Tool panel should now show a list of the URLs that are included in the page you have in your browser.
7.    Click the Stop button when the issue has been reproduced (or press Ctrl + E). Note: the icon looks like a red square
8.    Click the Export captured traffic icon and save the file somewhere convenient (or press Ctrl + S). Note: the icon looks like a floppy disk.
9.    Typically, the customer will now upload the HAR file into their support case.
10.    Click on the Console tab and look for any errors/warnings reported. If errors are shown, please right-click on the console errors and select Copy all or send a screenshot of them.
11.    Close the Developer Tools panel.
More Information:

https://github.com/radekstepan/fetch-fiddler

https://www.telerik.com/fiddler/usecases/https-traffic-recording

https://www.wireshark.org/docs/wsug_html_chunked/

https://httptoolkit.com/

https://help.okta.com/oag/en-us/content/topics/access-gateway/troubleshooting-with-har.htm 

Apache Guacamole®