Cube load job failing with memory error

Last week, on one of my projects, we saw a SQL Server Agent job failing on one server. It is scheduled to run every 2 minutes and runs fine most of the time, but once or twice a day it fails with the error below.

Executed as user: Domain\accnt… …….. Error returned: ‘The operation has been cancelled because there is not enough memory available for the application. If using a 32-bit version of the product, consider upgrading to the 64-bit version or increasing the amount of memory available on the machine.’..   …….  The step failed.

This job loads an SSAS cube that resides on a different server.
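The post doesn't show the actual job step, so purely for context, here is a minimal sketch of what such a cube-load step could look like. The server name, database name, the assumption of a Tabular model, and driving the SqlServer PowerShell module's Invoke-ASCmd cmdlet from Python are all illustrative choices, not what this job necessarily does; a real Agent job would often just use an "SQL Server Analysis Services Command" step with the XMLA/TMSL pasted in directly.

```python
import json
import subprocess
import tempfile

# Hypothetical names -- the post does not reveal the real server or database.
SSAS_SERVER = "ssas-server01"
CUBE_DATABASE = "SalesCube"

# TMSL "refresh" command (Tabular models). A Multidimensional cube would use an
# XMLA <Process> command instead; the post does not say which model type this is.
tmsl_refresh = {
    "refresh": {
        "type": "full",
        "objects": [{"database": CUBE_DATABASE}],
    }
}

# Write the command to a temp file and hand it to Invoke-ASCmd
# (the cmdlet ships with the SqlServer PowerShell module).
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(tmsl_refresh, f)
    command_file = f.name

subprocess.run(
    [
        "powershell.exe", "-NoProfile", "-Command",
        f"Invoke-ASCmd -Server '{SSAS_SERVER}' -InputFile '{command_file}'",
    ],
    check=True,  # fail loudly if processing fails, as the Agent job step would
)
```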

Since the error points to a lack of memory, the first thing I checked was the total server memory: 32 GB. Checking from the Azure portal, I found that free memory had consistently been around 18 GB.

Available memory graph (taken from the Azure portal)

That means around 18 GB of memory was always free, yet the job kept failing with an insufficient-memory error, so the operating system's free RAM clearly wasn't the constraint.

To check how much memory SSAS itself was using on the server, I added a few perfmon counters; the result looked like this.

Perfmon counters on the SSAS server
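If you'd rather capture these counters from a script than from the perfmon UI, a sketch along these lines works. The counter object name is an assumption: it depends on the SSAS version and instance name (for example MSAS14:Memory for a default SQL Server 2017 instance, or MSOLAP$INSTANCENAME:Memory for a named instance), so adjust the prefix to match your server.

```python
import subprocess

# Assumed counter object prefix: "MSAS14" is a default SQL Server 2017 instance;
# a named instance would look like "MSOLAP$MYINSTANCE".
PREFIX = r"\MSAS14:Memory"

counters = [
    rf"{PREFIX}\Memory Usage KB",
    rf"{PREFIX}\Memory Limit High KB",
    rf"{PREFIX}\Memory Limit Hard KB",
]

# typeperf is built into Windows: -sc 1 takes a single sample and prints CSV to stdout.
result = subprocess.run(
    ["typeperf", *counters, "-sc", "1"],
    capture_output=True, text=True, check=True,
)

# CSV rows start with a quote; row 0 is the header, row 1 is the sample
# (timestamp followed by the counter values in the order requested).
data_rows = [line for line in result.stdout.splitlines() if line.startswith('"')]
values = [v.strip('"') for v in data_rows[1].split('","')]

for name, kb in zip(counters, values[1:]):
    # KB -> GB; e.g. a Memory Limit Hard KB of 16,777,216 KB prints as 16.0 GB.
    print(f"{name}: {float(kb) / 1024 / 1024:.1f} GB")
```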

The screenshot above was captured after the job failed; the Memory Limit Hard KB counter reads 16,777,216 KB, which is 16,777,216 / 1024 / 1024 = 16 GB.

The reason for the 16 GB hard limit is that the instance runs the Standard edition of SQL Server, which caps the memory Analysis Services can use.
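If you want to rule out an explicitly configured limit before blaming the edition, the instance's memory settings live in msmdsrv.ini. Below is a minimal sketch that just reads that file; the path is an assumption (it varies with the SSAS version and instance name). Values between 0 and 100 are interpreted as a percentage of physical RAM, larger values as bytes, and the edition cap applies on top of whatever is configured here.

```python
import xml.etree.ElementTree as ET

# Assumed path -- adjust for your SSAS version and instance name.
MSMDSRV_INI = r"C:\Program Files\Microsoft SQL Server\MSAS14.MSSQLSERVER\OLAP\Config\msmdsrv.ini"

tree = ET.parse(MSMDSRV_INI)          # the .ini file is actually XML
memory = tree.getroot().find("Memory")

for setting in ("LowMemoryLimit", "TotalMemoryLimit", "HardMemoryLimit"):
    node = memory.find(setting)
    value = float(node.text) if node is not None and node.text else None
    if value is None:
        print(f"{setting}: not set")
    elif value <= 100:
        # 0-100 means "percent of total physical memory"; a HardMemoryLimit of 0
        # means "use the default" (midway between TotalMemoryLimit and physical RAM).
        print(f"{setting}: {value}% of physical memory")
    else:
        # Values above 100 are absolute bytes.
        print(f"{setting}: {value / 1024**3:.1f} GB")
```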

SQL Server editions limitations

As you can see, this is a limitation of Standard edition. The options are:

  1. Upgrade to Enterprise edition (the expensive solution)
  2. Redefine the cube to use fewer columns
