In one of our previous posts, Detailed Logging for Enterprise-Grade PostgreSQL, we discussed the parameters that enable detailed logging and the log analyzer pgBadger. In this blog post, we will configure a Microsoft Azure provisioned PostgreSQL Flexible Server to populate logs and generate a pgBadger report.

The latest pgBadger utility supports JSON-format logs. Microsoft Azure PostgreSQL Flexible Server does not provide PostgreSQL logs the way a single server or an on-premises environment does; instead, the logs are populated in JSON format once logging is enabled.

In this blog, we will configure JSON-format logging, generate a pgBadger report from it, and, for older versions of the pgBadger utility, show how to convert the JSON logs into regular logs first.

Before downloading pgBadger, we need to tune the logging-related parameters in the postgresql.conf file and reload the configuration. You can download and install pgBadger from here.
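As a reference, the logging settings commonly recommended for pgBadger look roughly like the following (a sketch; exact values depend on your workload, and on Flexible Server these are set through the Server parameters blade in the portal rather than by editing postgresql.conf directly):

```ini
# 0 logs every statement; on busy systems use a threshold in milliseconds
log_min_duration_statement = 0
# stderr-style prefix that pgBadger can parse
log_line_prefix = '%t [%p]: user=%u,db=%d,app=%a,client=%h '
log_checkpoints = on
log_connections = on
log_disconnections = on
log_lock_waits = on
log_temp_files = 0
log_autovacuum_min_duration = 0
```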

You can also go through here and navigate to pgBadger to learn more about it.

Configuration

From the Microsoft Azure Cloud console (https://portal.azure.com/#home), we need to create a storage account as shown below:

Microsoft Azure Cloud console

Click the Create option and fill in details such as the name and resource group as shown below:
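If you prefer the command line, the same storage account can be created with the az CLI (the resource group, account name, and location below are placeholders):

```shell
# Create a resource group (skip if you already have one)
az group create --name pgflex-rg --location eastus

# Create the storage account that will receive the PostgreSQL logs
az storage account create \
  --name pgflexlogs01 \
  --resource-group pgflex-rg \
  --location eastus \
  --sku Standard_LRS
```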

Next, configure the existing PostgreSQL Flexible Server to use the storage account for generating PostgreSQL logs.

Select Diagnostic settings and add the storage account created above.

PostgreSQL storage account
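This step can also be scripted; a sketch using az monitor diagnostic-settings (the server, resource group, and setting names are placeholders):

```shell
# Look up the resource ID of the Flexible Server
SERVER_ID=$(az postgres flexible-server show \
  --name pgflex01 --resource-group pgflex-rg --query id -o tsv)

# Route the PostgreSQL logs category to the storage account
az monitor diagnostic-settings create \
  --name pgflex-logs \
  --resource "$SERVER_ID" \
  --storage-account pgflexlogs01 \
  --logs '[{"category":"PostgreSQLLogs","enabled":true}]'
```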


Log in to Microsoft Azure Cloud and navigate to the storage account that was created for the PostgreSQL Flexible Server logs. Navigate to the location of the logs as shown below:

Navigate through the date and time folders to choose the required hourly JSON file(s). Right-click the .json file and download the log, which will be in JSON format.
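The hourly blob can also be fetched with the az CLI instead of the portal. Diagnostic settings write to a container named insights-logs-postgresqllogs, and the hourly file is named PT1H.json; the blob path below is a placeholder you would replace with the actual one from your storage browser:

```shell
az storage blob download \
  --account-name pgflexlogs01 \
  --container-name insights-logs-postgresqllogs \
  --name 'resourceId=.../y=2023/m=01/d=15/h=10/m=00/PT1H.json' \
  --file PT1H.json
```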

Sample .json log records look like the below:
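For reference, a record in the downloaded file looks roughly like this (an illustrative sample; exact field names can vary with the diagnostic-log schema version):

```json
{
  "time": "2023-01-15T10:05:12.000Z",
  "resourceId": "/SUBSCRIPTIONS/.../FLEXIBLESERVERS/PGFLEX01",
  "category": "PostgreSQLLogs",
  "operationName": "LogEvent",
  "properties": {
    "timestamp": "2023-01-15 10:05:12 UTC",
    "processId": 12345,
    "errorLevel": "LOG",
    "message": "connection received: host=10.0.0.4 port=52344"
  }
}
```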


Generate pgBadger report

Use the Jump server provisioned for pgBadger, and copy the JSON file from the local machine to the Jump server.

If you are using the latest pgBadger utility, you can pass the JSON format logs using the -f option to generate a pgBadger report:
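For example, recent pgBadger releases understand the jsonlog format directly (PT1H.json is the hourly blob name Azure uses; adjust to your file name):

```shell
# -f selects the input log format; -o names the HTML report
pgbadger -f jsonlog PT1H.json -o pgbadger_report.html
```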

If you do not have the option to use the latest pgBadger utility, use the below Linux command to extract the PostgreSQL logs from the JSON file and generate a postgresql.log file.
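One way to do this is with jq (an assumption on our part; any JSON-aware tool works). Azure diagnostic records carry the original log line under properties.message:

```shell
# Extract the message field of every record into a plain-text log
jq -r '.properties.message' PT1H.json > postgresql.log
```

If your file wraps the entries in a top-level records array rather than one JSON object per line, use `jq -r '.records[].properties.message'` instead.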

Generate the pgBadger report from the postgresql.log file and parse it into an HTML file:
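A minimal invocation on the extracted file (pgBadger auto-detects stderr-style logs; use -p to pass your log_line_prefix if detection fails):

```shell
# Parse the extracted log and write the HTML report
pgbadger postgresql.log -o pgbadger_report.html
```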

Copy the pgbadger_report.html from the Jump server to the local machine and review the pgBadger report.
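From the local machine, this is a single scp call (the user, hostname, and path are placeholders):

```shell
scp user@jumpserver:/home/user/pgbadger_report.html .
```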

Conclusion

The pgBadger utility continues to emerge as the best log analyzer tool, adding more features and functionality with each release. We can configure and generate pgBadger reports from Microsoft Azure Flexible Server logs, and it does make a DBA's life easier! 🙂
