Conversation
@manzt manzt commented Feb 21, 2025

HiGlass's built-in data fetching logic is designed to optimize API calls
by consolidating requests based on time, ID, and server. Our custom
Jupyter data fetcher doesn't include these optimizations.

This PR introduces a consolidator helper, which performs a much simpler
form of batching. Since we assume a single server, this function only
batches tile requests that occur within the same animation frame and
submits them together. This helps reduce the number of comms calls and,
at a minimum, deduplicates requests with the same tile id.
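For illustration, deduplication within a batch can be as simple as grouping requests by tile id, so each unique tile is fetched once and every waiting caller shares the result. This is just a sketch; `tileId` and `groupByTileId` are hypothetical names, not the fetcher's actual API:

```javascript
// Hypothetical sketch: group batched requests by tile id so each unique
// tile is fetched once and all waiting callers share the result.
// `tileId` is an illustrative field name, not the PR's actual API.
function groupByTileId(batch) {
  const byId = new Map();
  for (const request of batch) {
    const group = byId.get(request.tileId) ?? [];
    group.push(request);
    byId.set(request.tileId, group);
  }
  return byId; // Map: tile id -> requests awaiting that tile
}
```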

This is the approach taken in Mosaic:

"As multiple queries tend to be issued in tandem (for example upon
initialization), the Coordinator waits one animation frame, collects
incoming queries, and merges those […]. Upon query completion,
the Coordinator parcels out appropriate projections to clients."

Initial testing suggests it feels quite responsive!

Before (slight lag due to repeated, shared requests):

[Screen Recording 2025-02-20 at 22 45 33]

After:

[Screen Recording 2025-02-20 at 22 47 14]

Checklist

  • Clear PR title (used for generating release notes).
    • Prefer using prefixes like fix: or feat: to help organize auto-generated
      notes.
  • Unit tests added or updated.
  • Documentation added or updated.

@manzt manzt requested a review from nvictus February 21, 2025 03:43
/**
 * @template T
 * @returns {{ promise: Promise<T>, resolve: (success: T) => void, reject: (err: unknown) => void }}
 */
function defer() {
Member
I can't wrap my head around what this function is supposed to do.

@manzt manzt Feb 21, 2025

It's a polyfill for something like Promise.withResolvers.

Basically, it lets you create a "deferred" value. Normally, you'd resolve or reject a promise directly within the promise body, but this popular technique brings the resolve/reject functions into the same scope as the promise. That lets you return the promise directly (for something else to "await") and pass the resolvers for that promise around to other internal functions and methods.
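For reference, the deferred pattern reduces to a few lines; this is essentially what `Promise.withResolvers` now provides natively:

```javascript
// Create a promise whose resolve/reject are exposed in the same scope,
// so the promise can be returned while the resolvers are passed elsewhere.
function defer() {
  let resolve, reject;
  const promise = new Promise((res, rej) => {
    resolve = res;
    reject = rej;
  });
  return { promise, resolve, reject };
}
```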

This lets us have a top level API of:

const submit = consolidator((batch) => {
	for (let {data, resolve} of batch) {
		resolve(data + "!");
	}
});
let aPromise = submit("a");
let bPromise = submit("b");

await Promise.all([aPromise, bPromise]); // ["a!", "b!"]

Note how each caller can neatly "await" its individual submission without needing to know about the batching. That works because we pass the resolve for each of those separate promises down to the processBatch handler.
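A minimal sketch of how such a consolidator can be put together, assuming this batching scheme rather than copying the PR's actual code. The PR flushes once per animation frame; `setTimeout(0)` stands in for `requestAnimationFrame` here so the sketch also runs outside a browser, and a plain `Promise` executor stands in for `Promise.withResolvers`/`defer`:

```javascript
// Illustrative consolidator sketch (not the PR's exact implementation).
// All submissions made before the next flush are collected into one batch
// and handed to `process` together; each caller awaits its own promise.
function consolidator(process) {
  let batch = [];
  return function submit(data) {
    return new Promise((resolve, reject) => {
      if (batch.length === 0) {
        // First submission since the last flush: schedule one.
        // (The real fetcher would use requestAnimationFrame here.)
        setTimeout(() => {
          const current = batch;
          batch = []; // reset first so new submissions start a fresh batch
          process(current);
        }, 0);
      }
      batch.push({ data, resolve, reject });
    });
  };
}
```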

Member Author

Oh, actually Promise.withResolvers is pretty widely available now according to MDN. I think I'll just use it directly.

@manzt manzt merged commit a22b771 into main Mar 11, 2025
8 checks passed
@manzt manzt added the enhancement New feature or request label Mar 11, 2025
