
feat(analytics-browser): support gzip request body compression#1542

Open
Mercy811 wants to merge 16 commits into gzip from AMP-149145-update-browser-sdk-to-use-enable-request-body-compression

Conversation

@Mercy811
Contributor

@Mercy811 Mercy811 commented Feb 17, 2026

Summary

  • Gzip the request if the server URL is the default.
  • Add a new configuration, enableRequestBodyCompression, to disable compression if the server URL is set to a proxy URL.
  • Same logic as amplitude/Amplitude-Swift@12bbd67.
  • Only payloads larger than 2 KB will be compressed.
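A minimal sketch of this gating, assuming the flag opts custom endpoints into compression as described in the overview (the constant and function names here are assumptions, not the SDK's actual identifiers):

```typescript
// Hypothetical sketch of the endpoint-based gzip gating. The default URL
// below is an assumption for illustration.
const DEFAULT_SERVER_URL = 'https://api2.amplitude.com/2/httpapi';

function shouldCompressUploadBody(serverUrl: string, enableRequestBodyCompression: boolean): boolean {
  // Requests to the default Amplitude ingestion endpoint are always gzipped.
  if (serverUrl === DEFAULT_SERVER_URL) return true;
  // Custom/proxy endpoints compress only when explicitly enabled.
  return enableRequestBodyCompression;
}
```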

Checklist

  • Does your PR title have the correct title format?
  • Does your PR have a breaking change?:

Note

Medium Risk
Touches the core event upload pipeline and transport interface; mistakes could break ingestion or proxy setups, though changes are gated by endpoint checks, size thresholds, and extensive tests.

Overview
Adds optional gzip compression for event upload bodies by extending the core Transport.send() API and plumbing a new enableRequestBodyCompression config through browser/core config and the destination upload path.

Compression is forced for Amplitude’s default ingestion endpoints, while custom serverUrls only compress when explicitly enabled; browser transports (XHRTransport and a browser-local FetchTransport) gzip payloads ≥ 2 MB when CompressionStream is available, and SendBeaconTransport explicitly opts out since it can’t set headers. E2E and unit tests are updated/added to handle gzipped bodies (including request-parsing helpers) and to validate compression behavior across transports and endpoint types.
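The CompressionStream-based gzip path might look roughly like the sketch below. This is an illustration under assumed names, not the PR's actual code; it returns null when the API is unavailable so callers can fall back to the uncompressed body.

```typescript
// Sketch: gzip an upload body with the Web CompressionStream API.
// Returns null when CompressionStream is unavailable (e.g. older runtimes),
// so the caller can send the plain JSON body instead.
async function gzipBody(body: string): Promise<Uint8Array | null> {
  const CS = (globalThis as any).CompressionStream;
  if (typeof CS !== 'function') return null;
  // Pipe the UTF-8 bytes of the body through a gzip compressor.
  const stream = new Blob([body]).stream().pipeThrough(new CS('gzip'));
  // Drain the compressed stream into a single buffer.
  const buffer = await new Response(stream).arrayBuffer();
  return new Uint8Array(buffer);
}
```

A transport would then set `Content-Encoding: gzip` only when this returns a non-null result.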

Written by Cursor Bugbot for commit df2cac1.

@Mercy811 Mercy811 force-pushed the AMP-149145-update-browser-sdk-to-use-enable-request-body-compression branch from 3b91b84 to e0f6d6e on February 20, 2026 00:28
test: e2e
Co-Authored-By: Cursor <cursoragent@cursor.com>
@Mercy811 Mercy811 force-pushed the AMP-149145-update-browser-sdk-to-use-enable-request-body-compression branch from e0f6d6e to 84254a5 on February 20, 2026 00:42
@Mercy811
Contributor Author

bugbot run


@cursor cursor bot left a comment


✅ Bugbot reviewed your changes and found no new issues!

Comment @cursor review or bugbot run to trigger another review on this PR

@daniel-graham-amplitude
Copy link
Collaborator

Because this adds extra computation before calling fetch, I think it increases the risk of us missing requests that are made after the page unload event. Would it be possible to disable compression while the page is unloading?

@Mercy811 Mercy811 changed the base branch from main to gzip February 20, 2026 17:53
@Mercy811 Mercy811 marked this pull request as ready for review February 20, 2026 18:10
@Mercy811
Contributor Author

bugbot run


@cursor cursor bot left a comment


Cursor Bugbot has reviewed your changes and found 2 potential issues.

Bugbot Autofix is OFF. To automatically fix reported issues with Cloud Agents, enable autofix in the Cursor dashboard.

const shouldCompressBody =
  shouldCompressUploadBody &&
  getStringSizeInBytes(bodyString) >= MIN_GZIP_UPLOAD_BODY_SIZE_BYTES &&
  isCompressionStreamAvailable();
Collaborator

@daniel-graham-amplitude daniel-graham-amplitude Feb 26, 2026


Looking at the getStringSizeInBytes helper, it looks like it runs new TextEncoder().encode(value), which creates a byte-format copy of the payload and then returns its size. That means an additional copy of the payload is generated in memory in O(n) time (it has to iterate through each character in the string and encode it).

I'm thinking, because this threshold is just a heuristic anyway, maybe we could define the minimum upload size as a minimum string length rather than a minimum byte size; then we'd just need to check bodyString.length.

I understand that this goes against our agreement to make it 2kb, but string length should be "good enough". Most characters in a string are 1 byte each, and at most they're 3 bytes each, which gives an upper bound on how much we can underestimate the "size" of a string.

Contributor Author


This is a good point! English letters, numbers, and common punctuation are all 1 byte each in UTF-8. So I'm keeping 2 * 1024 * 1024 and comparing it with the string length.
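The trade-off discussed above can be shown concretely. getStringSizeInBytes is the helper name from the review; its body here is an assumption about how such a helper is typically implemented.

```typescript
// Assumed implementation: allocates a full UTF-8 copy of the string
// just to measure it -- O(n) time and memory.
function getStringSizeInBytes(value: string): number {
  return new TextEncoder().encode(value).length;
}

const ascii = 'hello';
const cjk = '\u4f60\u597d'; // "你好"
// ascii: length 5, UTF-8 size 5 bytes; cjk: length 2, UTF-8 size 6 bytes.
// Every UTF-16 code unit encodes to at least 1 UTF-8 byte, so string length
// can only undercount the byte size -- a safe direction of error for a
// "compress only if big enough" threshold.
```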

if (compressed) {
  headers['Content-Encoding'] = 'gzip';
  body = compressed;
}
Collaborator


Awesome! This is nice.

Contributor Author


Actually, I rethought this and feel it makes more sense to not allow custom headers to override the default headers (Content-Type, Accept, Content-Encoding). Overriding the defaults would be useless and even harmful, as long as transport providers keep their behavior of setting a JSON or gzip payload. Custom headers exist to support things like proxy authorization in the real world.
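This precedence rule can be sketched with a hypothetical helper (not the PR's code): spreading the custom headers first means the transport's defaults win on any conflict.

```typescript
// Sketch: custom headers are applied first, so the transport's default
// headers take precedence on conflict. Names are illustrative.
function buildHeaders(
  customHeaders: Record<string, string>,
  compressed: boolean,
): Record<string, string> {
  return {
    ...customHeaders, // e.g. proxy authorization headers
    'Content-Type': 'application/json',
    Accept: '*/*',
    // Only advertise gzip when the body was actually compressed.
    ...(compressed ? { 'Content-Encoding': 'gzip' } : {}),
  };
}
```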

super();
}

async send(serverUrl: string, payload: Payload, _enableRequestBodyCompression = false): Promise<Response | null> {
Collaborator


Since _enableRequestBodyCompression is an optional parameter in the BaseTransport method, you should just be able to leave it out of the send type signature, since it's not being used.

Contributor Author


Oh, I just wanted to make it look aligned in the browser platform.

_enableRequestBodyCompression is an optional parameter in the BaseTransport method,

Yes! So we can keep the HTTP transport in the Node SDK unchanged for now.

// Temporary browser-specific fetch transport with gzip support.
// TODO: Merge this implementation back into @amplitude/analytics-core FetchTransport
// once React Native SDK supports request body gzip.
export class FetchTransport extends BaseTransport implements Transport {
Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

What was the reason for making a new fetch.ts in analytics-browser? Is it because the core fetch.ts fails to compile in other environments (like React Native), when you add the compression logic?

Contributor Author

@Mercy811 Mercy811 Feb 26, 2026


React Native imports the core fetch. I don't want to let it use the gzip feature yet, because we compress by default.

Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Does React Native fail at runtime or at compile time, though? Couldn't we just check !!globalScope.CompressionStream and, if it returns false, not run gzip for that platform? I'm just concerned about having two separate implementations of fetch that would need to be maintained.
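The runtime check suggested here is small; a sketch of it (isCompressionStreamAvailable is the name that appears in the PR's compression condition, though this body is an assumption):

```typescript
// Feature-detect CompressionStream at runtime so one fetch implementation
// can serve platforms with and without gzip support.
function isCompressionStreamAvailable(): boolean {
  return typeof (globalThis as any).CompressionStream === 'function';
}
```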

Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

The browser fetch is short-term. After dogfooding, we can open another PR for React Native, merge it into the gzip feature branch, and then merge the gzip feature branch into main together. I'd rather avoid making this giant PR any bigger.


export class BaseTransport implements Transport {
-  send(_serverUrl: string, _payload: Payload): Promise<Response | null> {
+  send(_serverUrl: string, _payload: Payload, _enableRequestBodyCompression?: boolean): Promise<Response | null> {
Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Thinking about the signature of this some more. I feel like there's a chance, in the future, that there could be more options that we need to add to send and the method signature would be pretty long.

Would it be much trouble if we did it like this:

type SendOptions = { _enableRequestBodyCompression?: boolean };
send(_serverUrl: string, _payload: Payload, _sendOptions?: SendOptions)

I'll leave this one as "optional".
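A self-contained sketch of the options-object signature proposed here, with stand-in types where the SDK's own are not shown:

```typescript
// Stand-ins for the SDK's Payload and Response types, for illustration only.
type Payload = Record<string, unknown>;
type SendOptions = { enableRequestBodyCompression?: boolean };

interface Transport {
  send(serverUrl: string, payload: Payload, options?: SendOptions): Promise<unknown>;
}

// A future option only extends SendOptions instead of growing every
// send() signature across transports.
class NoopTransport implements Transport {
  send(_serverUrl: string, _payload: Payload, _options?: SendOptions): Promise<unknown> {
    return Promise.resolve(null);
  }
}
```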

Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I thought about this too, but realized we would probably need to refactor the whole class. It doesn't make sense to me to set custom headers as a class property but pass the server URL as a parameter to send. They should be treated the same, and so should the should-compress flag.

Contributor Author

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I’ll keep it as it is because it’s an internal interface, unlike the storage provider.
