There is insufficient system memory in resource pool ‘internal’ to run this query

Publish Date: October 4, 2022


Have you ever tried to import a large SQL script into SQL Server using the SQLCMD utility, only to get the error There is insufficient system memory in resource pool ‘internal’ to run this query, as shown in this screenshot?

There is insufficient system memory in resource pool 'internal' to run this query

This error typically occurs when your SQL script contains a long run of consecutive INSERT statements with no batch separators. The SQLCMD utility then tries to execute all the statements as a single batch, which exhausts the memory available to the internal resource pool. No matter how much memory your physical machine has, or how much memory you allocate to your SQL Server instance, this error will not go away.

Resolution

To resolve this error, you just need to insert a GO statement after every few hundred or thousand rows. GO is a batch separator, so SQLCMD executes the statements before each GO as a separate, smaller batch instead of one huge one. Make sure your SQL script looks like the one shown in the screenshot:

Sample SQL script
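For illustration, a script with GO separators looks roughly like this (the table and column names here are placeholders, not taken from the screenshot):

```sql
INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'Alpha');
INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'Beta');
-- ... more INSERT statements, up to about a thousand per batch ...
GO
INSERT INTO dbo.MyTable (Id, Name) VALUES (1001, 'Gamma');
-- ... the next batch of INSERT statements ...
GO
```

Each GO ends the current batch, so SQLCMD never has to hold the entire script in memory at once.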

Now, if you’re importing a single SQL script with a few thousand rows, you can insert the GO statements manually. But what if your script has millions of records, or there are hundreds of SQL scripts? In that case, the manual approach is simply not feasible. Well, PowerShell is here to the rescue.

The following PowerShell script takes all the files with a .sql extension recursively from an input directory, inserts a GO statement after every 1,000 lines in each file, and then saves each file to an output directory under the same name, leaving the original files intact.

<#
.SYNOPSIS
Bulk inserts a GO statement in SQL files 

.NOTES
    Author : Surender Kumar
    Author URI : https://www.techtutsonline.com/staff/surender-kumar/
    Version : 1.0
    Purpose : Bulk inserts a GO statement in SQL files 
#>
$inputDir  = "C:\inputDir"
$outputDir = "C:\outputDir"

# Create the output directory if it does not already exist
if (-not (Test-Path -Path $outputDir)) {
    New-Item -Path $outputDir -ItemType Directory | Out-Null
}

# Collect all .sql files from the input directory, including subdirectories
$sqlFiles = Get-ChildItem -Path $inputDir -Filter *.sql -File -Recurse

$insertLine = 'GO'

foreach ($file in $sqlFiles) {
    $outFile = Join-Path -Path $outputDir -ChildPath $file.Name

    # Remove any output file left over from a previous run,
    # because Add-Content appends rather than overwrites
    Remove-Item -Path $outFile -ErrorAction SilentlyContinue

    # Read the file in chunks of 1000 lines and write each chunk
    # to the output file followed by a GO batch separator
    Get-Content -Path $file.FullName -ReadCount 1000 |
        ForEach-Object { $_, $insertLine | Add-Content -Path $outFile }
}

To save time, you can download my Bulk-InsertLine.ps1 script and modify it to meet your own needs. When it finishes, each SQL script will have a GO statement after every thousand lines. After modifying your SQL scripts, try the import again; it should now complete without the error.
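For reference, the modified script can be re-imported with SQLCMD like this (the server, database, and file names below are placeholders for your own environment):

```shell
sqlcmd -S localhost\SQLEXPRESS -d MyDatabase -i "C:\outputDir\bigscript.sql"
```

The -S, -d, and -i switches specify the server instance, target database, and input script file, respectively.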


