Today a developer reached out to me saying that his package was failing with the error below:
DFT – Load Change Data: Error: The buffer manager cannot write 8008000 bytes to file “C:\Windows\Temp…..\..tmp”. There was insufficient disk space or quota.
There are two things to check:
1. Why is the package writing to the temp folder at all during execution?
2. Does the server actually have insufficient disk space on the C: drive?
Whenever a disk-space error points at the C: drive, no matter what, I always check the drive's free space first to rule out the obvious cause. (You never want your C: drive to fill up completely.) Here the C: drive had 60 GB free, so space was not the problem. When there is plenty of free space and you still get an "insufficient disk space" error, it is usually due to missing permissions.
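A quick way to confirm the free space, assuming you can run PowerShell on the SSIS server, is:

```shell
powershell -Command "Get-PSDrive C | Select-Object Used,Free"
```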
The ETL job was running under a proxy account, so I granted permission on the Temp folder to the account behind the credential the proxy was created from, and that fixed the issue.
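Granting that permission can be done from an elevated command prompt with `icacls`; this is a sketch, where `DOMAIN\ProxyCredentialAccount` is a placeholder for the account your proxy's credential maps to:

```shell
:: Grant Modify, inherited to subfolders (CI) and files (OI),
:: on the temp folder the buffer manager is writing to
icacls "C:\Windows\Temp" /grant "DOMAIN\ProxyCredentialAccount":(OI)(CI)M
```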
Now, to the question of why the package was using the C: drive temp folder at all: the developer had added multiple Lookup transformations to the package. In the default full-cache mode, the SSIS engine first loads the reference tables into memory and then performs the match. While everything was being loaded into memory, available memory ran short and the buffers spilled over to temp storage, which by default lives on the C: drive.
The best practice is not to rely on the default directory: pass the desired temp folder location in as a parameter and point the data flow's `BufferTempStoragePath` (and `BLOBTempStoragePath`) properties at it, so spills land on a drive you have sized for them.
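One way to override those properties at execution time is with `dtexec /SET`. This is a sketch only: the package file name, the data flow task name `DFT - Load Change Data`, and `D:\SSISTemp` are placeholders for your own environment:

```shell
:: Redirect buffer and BLOB spill files to a dedicated drive
dtexec /File "LoadChangeData.dtsx" ^
  /SET "\Package\DFT - Load Change Data.Properties[BufferTempStoragePath]";"D:\SSISTemp" ^
  /SET "\Package\DFT - Load Change Data.Properties[BLOBTempStoragePath]";"D:\SSISTemp"
```

Alternatively, bind the two properties to a package or project parameter via a property expression in the designer, which keeps the path configurable per environment.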