Update WPT #49

Open · wants to merge 1 commit into main

1 change: 1 addition & 0 deletions test/fixtures/wpt/common/get-host-info.sub.js
@@ -30,6 +30,7 @@ function get_host_info() {
PORT2: PORT2,
ORIGINAL_HOST: ORIGINAL_HOST,
REMOTE_HOST: REMOTE_HOST,
NOTSAMESITE_HOST,

ORIGIN: PROTOCOL + "//" + ORIGINAL_HOST + PORT_ELIDED,
HTTP_ORIGIN: 'http://' + ORIGINAL_HOST + HTTP_PORT_ELIDED,
4 changes: 4 additions & 0 deletions test/fixtures/wpt/fetch/api/basic/WEB_FEATURES.yml
@@ -0,0 +1,4 @@
features:
- name: fetch-request-streams
files:
- request-upload*
2 changes: 1 addition & 1 deletion test/fixtures/wpt/fetch/api/body/mime-type.any.js
@@ -87,7 +87,7 @@

[
() => new Request("about:blank", { method: "POST", body: new Blob([""], { type: "Text/Plain" }), headers: [["Content-Type", "Text/Html"]] }),
() => new Response(new Blob([""], { type: "Text/Plain" }, { headers: [["Content-Type", "Text/Html"]] }))
() => new Response(new Blob([""], { type: "Text/Plain" }), { headers: [["Content-Type", "Text/Html"]] })
].forEach(bodyContainerCreator => {
const bodyContainer = bodyContainerCreator();
const cloned = bodyContainer.clone();
4 changes: 4 additions & 0 deletions test/fixtures/wpt/fetch/api/request/WEB_FEATURES.yml
@@ -0,0 +1,4 @@
features:
- name: fetch-priority
files:
- request-init-priority.any.js
2 changes: 1 addition & 1 deletion test/fixtures/wpt/fetch/api/resources/keepalive-helper.js
@@ -117,7 +117,7 @@ function assertStashedTokenAsync(
*
* `unloadIframe` to unload the iframe before verifying stashed token to
* simulate the situation that unloads after fetching. Note that this test is
* different from `keepaliveRedirectInUnloadTest()` in that the the latter
* different from `keepaliveRedirectInUnloadTest()` in that the latter
* performs fetch() call directly in `unload` event handler, while this test
* does it in `load`.
*/
3 changes: 3 additions & 0 deletions test/fixtures/wpt/fetch/compression-dictionary/README.md
@@ -0,0 +1,3 @@
These are the tests for the [Compression Dictionary Transport](https://datatracker.ietf.org/doc/draft-ietf-httpbis-compression-dictionary/) standard (currently in IETF draft state, approved for publication). The tests are marked as tentative, pending the publication of the RFC.

The MDN reference is [here](https://developer.mozilla.org/en-US/docs/Web/HTTP/Guides/Compression_dictionary_transport).
(next file; name not shown)
@@ -4,7 +4,7 @@
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(next file; name not shown)
@@ -4,7 +4,7 @@
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(next file; name not shown)
@@ -4,7 +4,7 @@
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(new file; name not shown)
@@ -0,0 +1,105 @@
<!DOCTYPE html>
<head>
<meta charset="utf-8">
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>

// This is a set of tests for the dictionary itself being compressed, both by
// non-dictionary content encodings and dictionary encodings. The encoding used
// for the dictionary itself is independent of the encoding used for the data
// so the test uses different encodings just to make sure that the dictionaries
// don't carry any encoding-specific dependencies.

compression_dictionary_promise_test(async (t) => {
const dictionaryUrl =
`${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?content_encoding=gzip`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
const dictionary_hash = await waitUntilAvailableDictionaryHeader(t, {});
assert_equals(dictionary_hash, kDefaultDictionaryHashBase64);

// Check if the data compressed using the dictionary can be decompressed.
const data_url = `${kCompressedDataPath}?content_encoding=dcb`;
const data = await (await fetch(data_url)).text();
assert_equals(data, kExpectedCompressedData);
}, 'Decompression using gzip-encoded dictionary works as expected');

compression_dictionary_promise_test(async (t) => {
const dictionaryUrl =
`${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?content_encoding=br`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
const dictionary_hash = await waitUntilAvailableDictionaryHeader(t, {});
assert_equals(dictionary_hash, kDefaultDictionaryHashBase64);

// Check if the data compressed using the dictionary can be decompressed.
const data_url = `${kCompressedDataPath}?content_encoding=dcz`;
const data = await (await fetch(data_url)).text();
assert_equals(data, kExpectedCompressedData);
}, 'Decompression using Brotli-encoded dictionary works as expected');

compression_dictionary_promise_test(async (t) => {
const dictionaryUrl =
`${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?content_encoding=zstd`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
const dictionary_hash = await waitUntilAvailableDictionaryHeader(t, {});
assert_equals(dictionary_hash, kDefaultDictionaryHashBase64);

// Check if the data compressed using Brotli with the dictionary can be
// decompressed (Zstandard decompression of the data is tested separately).
const data_url = `${kCompressedDataPath}?content_encoding=dcb`;
const data = await (await fetch(data_url)).text();
assert_equals(data, kExpectedCompressedData);
}, 'Decompression using Zstandard-encoded dictionary works as expected');

compression_dictionary_promise_test(async (t) => {
const dictionaryUrl = `${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?id=id1`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);

// Register another dictionary, compressed with dcb using the first dictionary.
const compressedDictionaryUrl =
`${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?content_encoding=dcb&id=id2`;
const dict2 = await (await fetch(compressedDictionaryUrl)).text();
assert_equals(dict2, kDefaultDictionaryContent);
await waitUntilHeader(t, "dictionary-id", {expected_header: '"id2"'});

// Check if the data compressed using dcz with the updated dictionary works.
const data_url = `${SAME_ORIGIN_RESOURCES_URL}/compressed-data.py?content_encoding=dcz`;
const data = await (await fetch(data_url)).text();
assert_equals(data, kExpectedCompressedData);
}, 'A dcb dictionary-compressed dictionary can be used as a dictionary for future requests.');

compression_dictionary_promise_test(async (t) => {
const dictionaryUrl = `${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?id=id1`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);

// Register another dictionary, compressed with dcz using the first dictionary.
const compressedDictionaryUrl =
`${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?content_encoding=dcz&id=id2`;
const dict2 = await (await fetch(compressedDictionaryUrl)).text();
assert_equals(dict2, kDefaultDictionaryContent);
await waitUntilHeader(t, "dictionary-id", {expected_header: '"id2"'});

// Check if the data compressed using dcb with the updated dictionary works.
const data_url = `${SAME_ORIGIN_RESOURCES_URL}/compressed-data.py?content_encoding=dcb`;
const data = await (await fetch(data_url)).text();
assert_equals(data, kExpectedCompressedData);
}, 'A dcz dictionary-compressed dictionary can be used as a dictionary for future requests.');

</script>
</body>
(next file; name not shown)
@@ -5,7 +5,7 @@
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(new file; name not shown)
@@ -0,0 +1,42 @@
<!DOCTYPE html>
<head>
<meta charset="utf-8">
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="/common/utils.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>

function getHeadersCrossorigin() {
function headersCallback(r) {
return (x) => {
r(x);
}
}
let script = document.createElement("script");
return new Promise((resolve, reject) => {
getHeadersCrossorigin['callback'] = headersCallback(resolve);
script.src =
`${CROSS_ORIGIN_RESOURCES_URL}/echo-headers.py?callback=getHeadersCrossorigin.callback`;
document.head.appendChild(script);
});
}

compression_dictionary_promise_test(async (t) => {
// Register the dictionary
const dict = await (await fetch(kRegisterDictionaryPath)).text();
assert_equals(dict, kDefaultDictionaryContent);
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);
// Test a no-cors crossorigin fetch
const headers = await getHeadersCrossorigin();
assert_false("available-dictionary" in headers);
}, 'Fetch cross-origin no-cors request does not include Available-Dictionary header');

</script>
</body>
(next file; name not shown)
@@ -6,7 +6,7 @@
<script src="/resources/testharnessreport.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="/common/utils.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(next file; name not shown)
@@ -6,7 +6,7 @@
<script src="/resources/testharnessreport.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="/common/utils.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
(next file; name not shown)
@@ -4,7 +4,9 @@
<meta name="timeout" content="long"/>
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="./resources/compression-dictionary-util.js"></script>
<script src="/common/get-host-info.sub.js"></script>
<script src="/common/utils.js"></script>
<script src="./resources/compression-dictionary-util.sub.js"></script>
</head>
<body>
<script>
@@ -24,7 +26,7 @@
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);
assert_equals((await checkHeaders())['dictionary-id'], '"test"');
assert_equals(await checkHeader('dictionary-id', {}), '"test"');
}, 'Dictionary registration with dictionary ID');

compression_dictionary_promise_test(async (t) => {
@@ -36,7 +38,7 @@
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);
// Check the `dictionary-id` header.
assert_equals((await checkHeaders())['dictionary-id'], '"id1"');
assert_equals(await checkHeader('dictionary-id', {}), '"id1"');

// Registers a second dictionary.
const kAlternativeDictionaryContent =
@@ -54,8 +56,52 @@
t, {expected_header: expected_dictionary_header}),
expected_dictionary_header);
// Check the `dictionary-id` header.
assert_equals((await checkHeaders())['dictionary-id'], '"id2"');
assert_equals(await checkHeader('dictionary-id', {}), '"id2"');
}, 'New dictionary registration overrides the existing one');

compression_dictionary_promise_test(async (t) => {
// Dictionary responses often include
// Vary: available-dictionary, accept-encoding
// We need to make sure that the browser cache does not actually vary
// based on those headers, otherwise a resource that uses itself as a
// dictionary would trigger a second fetch of the same resource.
const dictionaryUrl = `${SAME_ORIGIN_RESOURCES_URL}/register-dictionary.py?id=cache`;
const dict = await (await fetch(dictionaryUrl)).text();
assert_equals(dict, kDefaultDictionaryContent);
// Wait until `available-dictionary` header is available.
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {}),
kDefaultDictionaryHashBase64);

// Re-fetch the dictionary (should come from cache).
const dict2 = await (await fetch(dictionaryUrl)).text();
assert_equals(dict2, kDefaultDictionaryContent);

const entries = performance.getEntriesByName(dictionaryUrl);
assert_equals(entries.length, 2);
assert_not_equals(entries[0].transferSize, 0);
assert_equals(entries[1].transferSize, 0);
}, 'Dictionary registration does not invalidate cache entry');

compression_dictionary_promise_test(async (t) => {
// Register a dictionary that has already expired (age > max-age).
// Make sure it is on a path separate from another dictionary so they can
// be checked independently.
const pattern = "%2Ffetch%2Fcompression-dictionary%2Fresources%2Fecho-headers.py";
await fetch(`${kRegisterDictionaryPath}?id=id1&age=7200&max-age=3600&match=${pattern}`);
// Register another dictionary that we can use to tell when the first
// dictionary should also be registered (since the first dictionary
// should not send any headers that can be detected directly).
const pattern2 = "%2Ffetch%2Fcompression-dictionary%2Fresources%2Fecho-headers2.py";
await fetch(`${kRegisterDictionaryPath}?id=id2&match=${pattern2}`);
assert_equals(
await waitUntilAvailableDictionaryHeader(t, {use_alt_path: true}),
kDefaultDictionaryHashBase64);
assert_equals((await checkHeaders({use_alt_path: true}))['dictionary-id'], '"id2"');
// Make sure the expired dictionary isn't announced as being available.
const headers = await (await fetch('./resources/echo-headers.py')).json();
assert_false("available-dictionary" in headers);
}, 'Expired dictionary is not used');

</script>
</body>