Behind the Numbers: Nonfarm Payrolls

Aug 7, 2025

The July jobs report put a spotlight on revisions to the payroll numbers – especially after the weak headline number, coupled with sharp downward revisions to the May and June figures, prompted President Trump to announce the firing of the Commissioner of the Bureau of Labor Statistics (BLS).

Opinions on whether the firing was justified are largely divided along political lines. However, it’s a safe bet that few people on either side of the divide really understand where the jobs report data comes from, and why and how the revisions are made.

Here’s how the nonfarm payroll data (NFP) are actually collected and published each month by the BLS.

The data come from a survey, not from administrative records

The BLS does not track every paycheck or tax record, or walk into businesses and count heads. Instead, it conducts a monthly survey of about 119,000 businesses and government agencies, covering over 600,000 individual worksites. This is called the Current Employment Statistics (CES) survey, generally referred to as the “establishment survey.”

These employers report the number of jobs (not people; an important distinction), along with hours and wages. The establishment survey is separate from the household survey, which is used to determine the unemployment rate.

What do employers report?

Each participating employer provides the following information for a specific “reference week” each month (usually the week including the 12th of the month):

  • Number of employees on the payroll, full-time and part-time
  • Total hours worked (for production/nonsupervisory employees)
  • Earnings for the week (average hourly and weekly earnings)
  • Industry classification (based on NAICS codes)

These are counts of jobs, so if one person has two jobs with two different employers that are included in the survey, both jobs get counted.

Initial estimates involve modeling and imputation

Because not all 119,000 employers respond on time (and the response rate has deteriorated since covid), the BLS has to impute (estimate) the missing data based on:

  • Historical patterns for late responders
  • Trends from respondents that did submit on time
  • Seasonal adjustment factors (think school years, Christmas holidays, summer retooling for auto manufacturers, etc.)

So when you hear that “payrolls rose by 205,000 in June,” that number is a modeled estimate based on partial survey data and historical assumptions – not a complete census.
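The imputation step above can be sketched in a few lines. This is a deliberately simplified toy, not the BLS's actual estimator: the establishments, growth ratios, and the simple "carry forward last month's level times a typical growth factor" rule are all invented for illustration.

```python
# Toy sketch of building a payroll estimate from partial survey responses.
# The method and all figures are illustrative assumptions, not BLS methodology.

def estimate_total_jobs(responses, prior_links, prior_levels):
    """Estimate total jobs across all sampled establishments.

    responses:    {establishment_id: jobs reported this month}
    prior_links:  {establishment_id: typical month-over-month growth ratio}
    prior_levels: {establishment_id: jobs reported last month}
    """
    total = 0.0
    for est_id, last_month in prior_levels.items():
        if est_id in responses:
            total += responses[est_id]              # actual on-time report
        else:
            # Impute non-responders from their historical growth pattern
            total += last_month * prior_links[est_id]
    return total

# Three establishments; only A and B responded by the initial release.
responses = {"A": 105, "B": 198}
prior_links = {"A": 1.01, "B": 0.99, "C": 1.02}    # typical growth ratios
prior_levels = {"A": 100, "B": 200, "C": 50}

print(round(estimate_total_jobs(responses, prior_links, prior_levels)))  # prints: 354
```

When establishment C's actual report arrives a month later and differs from the imputed 51 jobs, the total changes – which is exactly what a revision is.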

Two monthly revisions follow the initial release

Each month’s initial jobs estimate is revised twice:

  • First revision: published one month later, when more survey data have come in
  • Second revision: published two months later, using nearly full survey response

These revisions can be large or small, depending on:

  • Response rates (typically 70-75% at the initial release)
  • Trends in newly reported or corrected data
  • Seasonal adjustment quirks (especially during economic turning points or shocks like covid)
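The cadence described above can be made concrete with a toy simulation: the same month's job change, re-estimated three times as late responses trickle in. All figures are hypothetical, and the simple "scale up to full sample size" rule stands in for the real estimator.

```python
# Toy illustration of the initial-release / first-revision / second-revision
# cadence. Per-firm job changes are invented for the example.

full_sample = [+5, -2, +8, +1, -3, +4, +6, -1, +2, +3]  # job changes per firm

def estimate(received):
    """Scale the changes received so far up to the full sample size."""
    return round(sum(received) * len(full_sample) / len(received))

initial = estimate(full_sample[:7])   # ~70% response at the initial release
first   = estimate(full_sample[:9])   # more responses one month later
second  = estimate(full_sample)       # nearly full sample two months later

print(initial, first, second)  # prints: 27 22 23
```

Here the early responders happened to skew strong, so the initial estimate of 27 is revised down to 22 and then settles at 23 – the same pattern (in miniature) as the May and June revisions that sparked the controversy.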

Annual benchmarking (major revisions)

Once a year – usually in February – the BLS makes a massive adjustment called the benchmark revision, which recalibrates the whole payroll series using actual employment counts from unemployment insurance tax records. (Lots of economic data are subject to annual revisions, btw.) This gives a more accurate picture than the survey-based estimates and can result in large upward or downward revisions going back up to 21 months.
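The benchmark idea can be sketched as well. Once the unemployment-insurance tax records yield an actual count for the benchmark month, the gap between it and the survey-based level is distributed ("wedged") back across the prior months. The linear wedge and all the numbers below are illustrative assumptions, not the BLS's exact procedure.

```python
# Sketch of an annual benchmark revision: replace the survey-based level for
# the benchmark month with the UI-record count, and wedge the gap back
# linearly across the revised span. Figures are invented.

survey_levels = [158.0, 158.2, 158.5, 158.9]  # millions of jobs, survey-based
ui_benchmark = 158.1                          # UI-record count for the last month

gap = ui_benchmark - survey_levels[-1]        # -0.8 million: survey overstated jobs
n = len(survey_levels)

# Distribute the gap in growing increments so the last month hits the benchmark
revised = [level + gap * (i + 1) / n for i, level in enumerate(survey_levels)]
print([round(x, 2) for x in revised])  # prints: [157.8, 157.8, 157.9, 158.1]
```

A gap of -0.8 million spread over the span is the same order of magnitude as the real-world -800,000 adjustment discussed later in this piece.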

Limitations and common misunderstandings

  • The NFP report is not a literal count of all jobs in the country; it’s a statistical sample.
  • Jobs do not equal workers. People with multiple jobs get counted multiple times.
  • It excludes farm workers, the self-employed, and gig workers (unless they’re on a payroll).
  • Response rate matters: early estimates are vulnerable to errors during major trend changes (e.g., recessions, the covid rebound).
  • Seasonal adjustment can distort monthly changes, especially around holidays, weather shifts, and school calendars. (This is why the numbers are always skewed when there’s a major winter storm or a hurricane.)

One simple observation is that the cadence of initial release – first revision – second revision closely mirrors the cadence of GDP releases, and for similar reasons. The initial GDP estimate released each quarter is based on incomplete source data; then, as more data come in, more reliable GDP numbers are published as the second and third estimates, and longer-term adjustments are sometimes made.

Now, let’s look at some of the reasons the magnitude of the revisions has increased since covid. First, the survey response rate has declined somewhat since then, so the initial estimate is less reliable. Second, business churn in the covid aftermath (companies going out of business and new businesses starting – what the BLS captures in its birth/death model) has increased, distorting the models, because that adjustment is a significant factor in the estimates.

However, by late 2023 and early 2024, the adjustments made to the models to compensate for the greater birth/death rate may have overcompensated, especially as job growth began to slow. That led to some of the overstated initial numbers during that period, which later saw massive downward revisions like the -800,000 adjustment made to the March 2024 data.

More recently, the DOGE cuts have resulted in large job losses at several government agencies (nonfarm payrolls include government employees), but there may not be enough personnel left – or the right personnel – to respond to the CES survey in a timely manner, so there may be some distortion there. And immigration trends could play a role, though we don’t know how much, because we don’t know how many employers would actually have reported illegal immigrants on their payrolls in 2021-24, when the illegal immigrant population soared. (Of course, that trend is reversing in 2025.)

The bottom line is that the methodology for collecting and revising the data, while imperfect, is not politically biased. While the firing of the BLS Commissioner may have been politically motivated, there might be a couple of positive outcomes.

First, a shake-up at the bureau may address the long-standing technical challenges in the data collection and estimation process, which have become increasingly problematic. Second, greater transparency around the revisions may prove useful to the economists who parse this data for the rest of us. Improvements in the process, and greater openness about it, could offset any threat to market confidence in future data releases from the appearance of politicization of the agencies that collect and publish economic data.