Stop Writing Slow AL Code — Common Query Mistakes in Business Central and How to Fix Them

Most Business Central performance problems are caused by poorly written AL code, not server issues. This guide walks through the six most common query and data retrieval mistakes BC developers make — including nested loops, late filtering, missing SetLoadFields, and CalcFields abuse — with real-world before-and-after AL code examples for each.

Nitin · April 28, 2026 · 15 min read
Tags: Business Central, AL Language, Query Objects, Performance Optimization, SetLoadFields, CalcFields, SetAutoCalcFields, BC Development, Dynamics 365, AL Tips, Business Central Performance, FlowFields, AL Best Practices, Microsoft ERP, Business Central Developer

THE PROBLEM

Your Business Central environment is getting slow. Pages take too long to load. Reports time out. Background jobs run forever. Your client is complaining.

You check the server. It looks fine. You check the database size. Nothing unusual.

The problem is almost always in the AL code itself.

Most BC developers learned to write AL by reading records in loops, filtering after the fact, and calling the database more times than necessary. These habits feel harmless on small datasets but become disasters when the client has hundreds of thousands of records.

Let us go through the most common mistakes one by one and fix them.


MISTAKE 1 — READING RECORDS INSIDE A LOOP


This is the most common and most damaging mistake in AL development, known as the N+1 query problem: for every record in the outer loop, you fire another database query inside.

The wrong way:

AL (Business Central)
procedure ProcessSalesLines()
var
    SalesHeader: Record "Sales Header";
    SalesLine: Record "Sales Line";
begin
    SalesHeader.SetRange("Document Type", SalesHeader."Document Type"::Order);
    if SalesHeader.FindSet() then
        repeat
            SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
            SalesLine.SetRange("Document No.", SalesHeader."No.");
            if SalesLine.FindSet() then
                repeat
                    // process each line
                until SalesLine.Next() = 0;
        until SalesHeader.Next() = 0;
end;

If you have 500 Sales Orders this fires 501 database queries. One for the headers and one for each order's lines.

The right way — use a Query Object:

AL (Business Central)
query 50100 "Sales Orders With Lines"
{
    QueryType = Normal;

    elements
    {
        dataitem(SalesHeader; "Sales Header")
        {
            DataItemTableFilter = "Document Type" = filter(Order);
            column(DocumentNo; "No.") { }
            column(CustomerNo; "Sell-to Customer No.") { }
            column(OrderDate; "Order Date") { }
            dataitem(SalesLine; "Sales Line")
            {
                DataItemLink = "Document Type" = SalesHeader."Document Type",
                               "Document No." = SalesHeader."No.";
                SqlJoinType = InnerJoin;
                column(ItemNo; "No.") { }
                column(Quantity; Quantity) { }
                column(UnitPrice; "Unit Price") { }
            }
        }
    }
}

procedure ProcessWithQuery()
var
    SalesQuery: Query "Sales Orders With Lines";
begin
    SalesQuery.Open();
    while SalesQuery.Read() do begin
        // all data available in one pass — zero nested queries
        // SalesQuery.DocumentNo, SalesQuery.ItemNo, SalesQuery.Quantity
    end;
    SalesQuery.Close();
end;

Now you fire exactly one database query regardless of how many orders you have. This is the difference between a report that runs in 2 seconds versus 4 minutes.
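Because the join flattens headers and lines into a single stream, you can still run header-level logic once per order by watching for a change in the document number. A minimal sketch (it assumes the query is given an OrderBy on DocumentNo so all rows of one order arrive together; PrevDocNo is an illustrative local variable):

    procedure ProcessWithHeaderBreaks()
    var
        SalesQuery: Query "Sales Orders With Lines";
        PrevDocNo: Code[20];
    begin
        SalesQuery.Open();
        while SalesQuery.Read() do begin
            if SalesQuery.DocumentNo <> PrevDocNo then begin
                // a new order starts here: run header-level logic once
                PrevDocNo := SalesQuery.DocumentNo;
            end;
            // line-level logic runs for every row
        end;
        SalesQuery.Close();
    end;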


MISTAKE 2 — FILTERING AFTER READING RECORDS


Many developers read all records first and then filter them in AL code using if statements. This pulls thousands of rows from the database into memory and then throws most of them away.

The wrong way:

AL (Business Central)
procedure GetHighValueOrders()
var
    SalesHeader: Record "Sales Header";
begin
    if SalesHeader.FindSet() then
        repeat
            if (SalesHeader."Document Type" = SalesHeader."Document Type"::Order) and
               (SalesHeader.Amount > 10000) and
               (SalesHeader."Order Date" >= 20240101D)
            then begin
                // process record
            end;
        until SalesHeader.Next() = 0;
end;

This reads every single Sales Header in the database before applying your conditions.

The right way — always filter before you read:


AL (Business Central)
procedure GetHighValueOrders()
    var
        SalesHeader: Record "Sales Header";
    begin
        SalesHeader.SetRange("Document Type", SalesHeader."Document Type"::Order);
        SalesHeader.SetFilter(Amount, '>%1', 10000);
        SalesHeader.SetFilter("Order Date", '>=%1', 20240101D);
        if SalesHeader.FindSet() then
            repeat
            // only matching records are fetched from database
            until SalesHeader.Next() = 0;
    end;

Always push filters to the database using SetRange and SetFilter. Never filter in AL after reading.
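SetFilter accepts richer expressions than a simple comparison, and all of them are still evaluated by the database. A few illustrative patterns (the document numbers and customer codes here are made up):

    SalesHeader.SetFilter("No.", '%1..%2', 'SO-1000', 'SO-1999');                // range
    SalesHeader.SetFilter("Sell-to Customer No.", '%1|%2', 'C00010', 'C00020');  // either value
    SalesHeader.SetFilter("Salesperson Code", '<>%1', '');                       // not blank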


MISTAKE 3 — USING FINDSET WHEN YOU ONLY NEED ONE RECORD


FindSet loads multiple records and prepares a cursor for looping. If you only need to check whether a record exists or read a single record, FindSet is overkill.

The wrong way:

    if Customer.FindSet() then
        Message(Customer.Name); // only the first customer is ever used

The right way:

    // To check existence only
    if Customer.IsEmpty() then
        Error('No customers found.');

    // To read exactly one record
    if Customer.FindFirst() then
        Message(Customer.Name); // process the single record

    // To check count without reading records
    if Customer.Count() > 100 then
        Message('Large customer base detected.');


IsEmpty is the fastest existence check in AL. It translates to a lightweight TOP 1 query that transfers no field data. Never use FindSet just to check whether records exist.


MISTAKE 4 — NOT USING SETLOADFIELDS

By default when you read a record in AL, Business Central fetches every single field in that table from the database. If you only need two or three fields from a table with fifty columns you are transferring a huge amount of unnecessary data on every read.

The wrong way:


    procedure GetCustomerNames()
    var
        Customer: Record Customer;
    begin
        if Customer.FindSet() then
            repeat
                Message(Customer.Name);
            until Customer.Next() = 0;
    end;


This fetches all 100 plus fields of the Customer table on every iteration even though you only need the Name field.

The right way — use SetLoadFields:


procedure GetCustomerNames()
    var
        Customer: Record Customer;
    begin
        Customer.SetLoadFields(Name, "No.", "Phone No.");
        if Customer.FindSet() then
            repeat
                Message(Customer.Name);
            until Customer.Next() = 0;
    end;


SetLoadFields tells BC to fetch only the fields you specify. On large tables with blob fields or big text columns this can reduce data transfer by 80 percent or more. Always use SetLoadFields when you do not need the full record.
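One caveat: with a partial record, reading a field you did not include in SetLoadFields does not fail. BC silently issues an extra just-in-time load of the full row, which can quietly reintroduce the overhead you were trying to avoid. A sketch of the pitfall:

    Customer.SetLoadFields(Name);
    if Customer.FindSet() then
        repeat
            Message(Customer.Name);       // served from the partial record: cheap
            // Message(Customer.Address); // not loaded: would trigger a hidden
            //                            // just-in-time reload of the row
        until Customer.Next() = 0;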


MISTAKE 5 — AGGREGATING DATA IN AL INSTEAD OF USING QUERY TOTALS


Developers often loop through thousands of records in AL just to calculate a total or count. Business Central Query Objects can do this directly in the database in a single query.

The wrong way:


procedure GetTotalSalesByCustomer()
    var
        SalesLine: Record "Sales Line";
        TotalAmount: Decimal;
    begin
        SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
        if SalesLine.FindSet() then
            repeat
                TotalAmount += SalesLine.Amount;
            until SalesLine.Next() = 0;
        Message('Total: %1', TotalAmount);
    end;


This reads every single Sales Line record into memory just to add up amounts.

The right way — use a Query with totaling:

query 50101 "Customer Sales Totals"
{
    QueryType = Normal;

    elements
    {
        dataitem(SalesLine; "Sales Line")
        {
            DataItemTableFilter = "Document Type" = filter(Order);
            column(CustomerNo; "Sell-to Customer No.") { }
            column(TotalAmount; Amount)
            {
                Method = Sum;
            }
            column(LineCount)
            {
                Method = Count;
            }
            filter(DocumentType; "Document Type") { }
        }
    }
}

procedure GetTotalSalesByCustomer()
var
    CustomerTotals: Query "Customer Sales Totals";
begin
    CustomerTotals.Open();
    while CustomerTotals.Read() do
        Message('Customer: %1 Total: %2 Lines: %3',
            CustomerTotals.CustomerNo,
            CustomerTotals.TotalAmount,
            CustomerTotals.LineCount);
    CustomerTotals.Close();
end;


The database calculates the sum and count. AL receives only the aggregated result. No looping through thousands of lines in memory.
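When you only need a single grand total rather than per-customer grouping, Record.CalcSums is an even simpler way to push the SUM to the database. It is fastest when the field is covered by a SIFT key, but on SQL it issues a single server-side SUM either way:

    SalesLine.SetRange("Document Type", SalesLine."Document Type"::Order);
    SalesLine.CalcSums(Amount);
    Message('Total: %1', SalesLine.Amount);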


MISTAKE 6 — CALLING CALCFIELDS INSIDE A LOOP


CalcFields recalculates a FlowField by hitting the database every time it is called. Calling it inside a loop on thousands of records is one of the worst performance patterns in AL.

The wrong way:

procedure ShowCustomerBalances()
    var
        Customer: Record Customer;
    begin
        if Customer.FindSet() then
            repeat
                Customer.CalcFields("Balance (LCY)");
                if Customer."Balance (LCY)" > 0 then
                    Message('%1 owes %2', Customer.Name, Customer."Balance (LCY)");
            until Customer.Next() = 0;
    end;


Every CalcFields call fires a separate database query. With 1000 customers this is 1000 extra queries just to get balances.

The right way — use SetAutoCalcFields before looping:


procedure ShowCustomerBalances()
    var
        Customer: Record Customer;
    begin
        Customer.SetAutoCalcFields("Balance (LCY)", "Sales (LCY)");
        if Customer.FindSet() then
            repeat
                if Customer."Balance (LCY)" > 0 then
                    Message('%1 owes %2', Customer.Name, Customer."Balance (LCY)");
            until Customer.Next() = 0;
    end;

SetAutoCalcFields tells BC to calculate the FlowFields as part of the main record fetch. One database round trip instead of thousands.
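You can go one step further: FlowFields can also be filtered with SetRange and SetFilter, so the positive-balance check itself can move to the database and only matching customers come back at all. A sketch:

    Customer.SetAutoCalcFields("Balance (LCY)");
    Customer.SetFilter("Balance (LCY)", '>0');
    if Customer.FindSet() then
        repeat
            Message('%1 owes %2', Customer.Name, Customer."Balance (LCY)");
        until Customer.Next() = 0;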


PERFORMANCE QUICK REFERENCE

Use SetLoadFields when you need only specific fields from a table.
Use SetRange and SetFilter before FindSet, never after.
Use IsEmpty instead of FindSet when you only need to check existence.
Use FindFirst instead of FindSet when you only need one record.
Use SetAutoCalcFields instead of CalcFields inside loops.
Use Query Objects for joins, aggregations, and totaling instead of nested loops.
Never read all records and filter in AL — always filter at the database level.



THE RESULT

Before applying these fixes your AL code fires hundreds or thousands of database queries for simple operations. Pages load slowly. Reports time out. Background jobs block other sessions.

After applying these fixes the same operations run in a fraction of the time. Fewer database hits, less memory usage, faster pages, happier clients.

Performance is not about the server. It is about respecting the database.


Happy Coding!
