If you are not sure whether a website you would like to visit is safe, you can verify it here. Enter the address of the page to see parts of its content and thumbnail images on this site. No potentially dangerous scripts on the referenced page will be executed. Additionally, if the selected site contains subpages, you can review them in batches of 5 pages.
favicon.ico: shopify.engineering/faster-breadth-first-graphql-execution - Shopifys journey to faster bre.

site address: shopify.engineering/faster-breadth-first-graphql-execution redirected to: shopify.engineering/faster-breadth-first-graphql-execution

site title: Shopifys journey to faster breadth-first GraphQL execution (2026) - Shopify

Our opinion (on Sunday 19 April 2026 9:04:35 UTC):

GREEN status - no comments
After content analysis of this website we propose the following hashtags:



Meta tags:
description=We questioned why conventional GraphQL execution incurs hidden costs, and rewrote it in a faster breadth-first manner to avoid them.;

Headings (most frequently used words):

breadth, execution, to, the, migrating, cost, shopify, first, graphql, of, scale, field, engine, journey, faster, problems, hidden, costs, depth, traversal, hypothesis, how, cardinal, works, try, it, linear, level, overhead, lazy, dataloader, promises, napkin, math, from, whitepaper, tree, building, planning, phase, lookbehind, errors, ruby, interpreter, tracers, resolvers, looking, ahead, support, developers, products, global, impact, solutions,

Text of the page (most frequently used words):
the (112), and (79), #breadth (49), execution (48), this (46), graphql (45), that (45), field (41), for (35), our (31), with (29), first (29), depth (28), shopify (26), cardinal (19), fields (18), time (17), was (17), your (16), engine (16), resolvers (16), each (16), business (15), list (15), are (14), these (14), all (13), objects (13), one (13), while (12), data (12), based (12), tree (12), new (11), from (11), ruby (11), across (11), out (11), which (11), see (10), more (10), run (10), costs (10), only (10), faster (10), into (10), products (9), can (9), scale (9), now (9), then (9), results (9), object (9), its (9), building (8), result (8), their (8), stack (8), once (8), memory (8), built (8), overhead (8), cost (8), work (7), platform (7), many (7), queries (7), set (7), how (7), than (7), may (7), traversal (7), scope (7), promise (7), what (7), sell (7), support (6), open (6), development (6), make (6), running (6), start (6), level (6), return (6), performance (6), some (6), would (6), single (6), interpreter (6), response (6), down (6), next (6), when (6), resolve (6), request (6), promises (6), hidden (6), online (5), store (5), build (5), design (5), engineering (5), need (5), non (5), legacy (5), migrating (5), has (5), strategy (5), resolved (5), were (5), pass (5), entire (5), end (5), flat (5), sets (5), cpu (5), bound (5), they (5), through (5), community (4), about (4), code (4), 2026 (4), greg (4), own (4), implementation (4), any (4), require (4), large (4), uses (4), where (4), runs (4), core (4), started (4), journey (4), manage (4), even (4), but (4), processing (4), rather (4), main (4), errors (4), step (4), scopes (4), pattern (4), like (4), mapped (4), empty (4), not (4), child (4), linear (4), scaling (4), variants (4), found (4), subtree (4), scales (4), led (4), 000 (4), lazy (4), incur (4), dataloaders (4), conventional (4), spent (4), model (4), nested (4), customers (4), website (3), shop (3), api (3), help (3), roles (3), story 
(3), 2023 (3), apps (3), editor (3), developer (3), tooling (3), published (3), mar (3), macwilliam (3), strategies (3), potential (3), looks (3), two (3), language (3), way (3), spec (3), lists (3), patterns (3), looking (3), translation (3), have (3), off (3), claude (3), example (3), resolver (3), number (3), tracers (3), per (3), slightly (3), repetitions (3), heavy (3), still (3), around (3), receive (3), traces (3), less (3), advantages (3), majority (3), subtrees (3), hashes (3), during (3), generation (3), you (3), root (3), before (3), process (3), above (3), create (3), get (3), planning (3), because (3), lazily (3), zero (3), let (3), works (3), using (3), dataloader (3), repetition (3), however (3), used (3), very (3), hypothesis (3), dimension (3), 1ms (3), problems (3), instead (3), document (3), every (3), product (3), apis (3), frequently (3), commerce (3), tools (3), sales (3), market (3), privacy (2), choices (2), service (2), builder (2), solutions (2), dev (2), developers (2), status (2), press (2), media (2), site (2), window (2), learn (2), digital (2), culture (2), behind (2), flex (2), comp (2), web (2), shopifyql (2), introducing (2), ruvy (2), share (2), source (2), direct (2), here (2), try (2), defines (2), offer (2), benchmarks (2), resources (2), post (2), letter (2), present (2), findings (2), requirements (2), equivalent (2), official (2), merchants (2), technology (2), match (2), approach (2), driven (2), accelerate (2), lower (2), opportunities (2), date (2), native (2), features (2), migration (2), cases (2), quite (2), query (2), use (2), early (2), must (2), find (2), error (2), scenario (2), numerous (2), metrics (2), suite (2), studying (2), migrated (2), challenges (2), tens (2), thousands (2), change (2), capture (2), duration (2), reporting (2), schema (2), also (2), could (2), selection (2), making (2), them (2), dramatically (2), moment (2), really (2), tradeoff (2), able (2), improve (2), test (2), allow (2), sequence 
(2), develop (2), interface (2), another (2), recursion (2), deep (2), over (2), other (2), concept (2), track (2), always (2), sharing (2), already (2), assembled (2), note (2), repeat (2), starts (2), structure (2), will (2), top (2), bottom (2), following (2), consider (2), parent (2), alternative (2), lookahead (2), below (2), lookbehind (2), statically (2), resolvable (2), resolves (2), benefit (2), typed (2), never (2), primitives (2), type (2), requests (2), garbage (2), improvements (2), production (2), sized (2), p50 (2), largest (2), similar (2), advantage (2), 15x (2), wrapper (2), static (2), etc (2), optimized (2), high (2), cardinality (2), should (2), callbacks (2), assume (2), intermediary (2), call (2), napkin (2), math (2), simply (2), batched (2), batching (2), makes (2), ran (2), slower (2), essential (2), performing (2), criteria (2), seconds (2), tend (2), profiling (2), measure (2), problem (2), size (2), profile (2), hundred (2), column (2), 100 (2), poorly (2), flow (2), perform (2), describes (2), sizes (2), hundreds (2), entirely (2), geometrically (2), such (2), bottleneck (2), layer (2), deeply (2), fan (2), guard (2), against (2), something (2), 250 (2), blog (2), free (2), topics (2), generator (2), stock (2), explore (2), orders (2), analytics (2), social (2), reach (2), sitemap, policy, terms, ecommerce, research, accessibility, black, sustainability, global, impact, enterprise, pay, degree, documentation, academy, hire, partner, center, merchant, legal, affiliates, partners, investors, careers, opens, external, anywhere, 2022, minute, read, linkedin, twitter, facebook, article, engineer, enthusiast, author, coder, dad, skier, likes, dogs, juggles, fire, home, office, worker, angry, who, attacks, his, reflection, stitching, hard, comparisons, languages, jit, worthy, investigation, rubyists, concepts, collaborating, defacto, standard, final, highlight, relative, module, write, conversation, says, think, shake, quo, conformance, 
expressed, algorithms, fulfilled, long, perceived, readily, accessible, job, fundamental, hardly, unique, async, bindings, major, untapped, continuing, plenty, continue, upon, strengths, everything, achieved, synchronous, ahead, ongoing, simple, trickier, nuanced, carefully, matched, regressions, attributed, mistakes, yet, fundamentally, worse, burndown, tracking, benchmark, shadow, verifier, confirm, counterparts, library, skills, translations, leg, focused, introduced, whole, safely, rollout, implementations, team, risen, challenge, innovations, required, minor, adjustments, timings, average, effectively, anyway, relatively, straightforward, adaptation, much, instrument, scaled, linearly, exciting, outcome, cheaper, collaboration, shined, presented, efficiency, rolled, lighter, produced, visible, benefits, without, changing, successfully, tuned, cutting, redundancies, consumed, puppet, runtime, shouldn, individually, existing, incrementally, swapping, replacements, prospect, actually, paradigm, far, challenging, blue, sky, prototyping, monolith, traditional, switching, incremental, bridge, gap, adopting, novel, aspect, enqueuing, avoids, notorious, contributes, reducing, footprint, loop, grown, line, surgical, net, responses, thanks, either, reasonable, given, traffic, validation, chose, optimize, success, rate, unlike, paths, bubble, exceptions, generally, completion, aside, failed, mutation, terminate, rescued, inlined, added, locate, report, occurred, keyed, place, passed, shaped, passing, superpower, cycles, elements, reference, wondering, gets, easy, miss, ended, holding, mapping, fast, amortizing, setup, finally, finish, leaf, shine, handling, merged, lastly, map, corresponding, algorithmically, combine, traverse, matches, key, establish, groupings, called, complete, holds, filled, event, planned, again, hash, after, heavily, inspired, ancestors, register, preloads, notes, influence, cannot, informed, unresolved, abstracts, grafast, phase, trees, eagerly, 
ast, abstract, positions, omitted, concretely, guesswork, intentional, constraint, navigated, upward, construct, closure, written, pseudocode, dig, internals, performs, execute, simplified, version, admin, inspecting, profiles, tests, confirmed, theory, spending, equal, staging, neighboring, collection, huge, margins, deliver, testing, experimental, fetched, various, payloads, clearly, translated, saved, pronounced, comparing, usage, study, compares, varying, degrees, case, item, there, wins, slim, margin, negligible, repeated, demonstrate, increase, scrutiny, equally, important, understand, initial, experiments, fed, json, had, same, back, speed, encouraging, standalone, schemas, asts, original, proof, algorithm, breadth_exec, prototype, whitepaper, basic, figures, favorably, removing, multiplying, factor, chain, onto, resolution, operates, returns, 5ms, 1000, five, associated, simplicity, round, say, pessimistic, assumption, looked, promising, theoretically, system, logic, individual, implicitly, multiple, bind, longer, hotter, underlying, function, wrapping, operating, subgraph, considerably, different, federated, supergraph, partials, wundergraph, airbnb, executions, performed, executed, aggregated, good, optimizing, come, steep, tradeoffs, bloat, allocations, collector, backpressure, add, backtracking, resolving, workflow, batch, particular, deserves, special, mention, tool, solving, separate, pooling, lookup, load, fulfill, promised, value, multiplicative, balloon, show, just, adding, created, tiny, elusive, slip, between, frames, difficult, total, left, tracing, hook, amplifies, carries, methodology, authorization, instrumentation, others, inherent, directly, baked, complexity, shows, distinct, emerge, slice, traversing, columns, independent, amortized, similarly, multiplied, primary, lacks, opportunity, amortize, seen, almost, including, canonical, gem, since, 2015, follows, experience, means, descends, recursively, moving, advance, engines, dynamic, width, 
returned, variable, highly, considers, selected, fixed, expect, low, embark, though, take, define, mean, context, studied, issue, became, clear, central, bias, towards, reconsider, better, examining, often, reveals, surprising, assemble, loading, itself, powered, supports, structures, walk, observed, internally, takes, migrate, massive, shave, times, discovering, full, culprit, real, dug, unexpected, wasn, necessarily, powers, serve, fetching, creating, hiring, incurs, questioned, why, rewrote, manner, saw, dramatic, log, search, science, security, latest, infrastructure, mobile, machine, learning, company, news, releases, newsroom, recent, updates, changelog, photography, logo, maker, name, popular, proven, experts, courses, pricing, solution, growing, brands, plus, powerful, automate, inventory, order, management, gain, customer, insights, know, audience, messaging, nurture, inbox, chat, integrations, retain, b2b, wholesale, international, globally, millions, shoppers, boost, channels, grow, world, class, checkout, check, person, domains, hosting, domain, app, themes, customize, brand, skip, content,


Text of the page (random words):
…versal, the breadth-first hypothesis that led to Cardinal, how the engine works internally, and what it takes to migrate a massive production stack to an entirely new execution model.

Problems of scale

Shopify's GraphQL-powered data layer supports deeply nested structures with fan-out that GraphQL APIs frequently guard against. For example, nested lists like this tend to perform poorly as they scale geometrically, and examining traces for such queries often reveals a surprising bottleneck: in our stack, the majority of request time may be spent running field resolvers that assemble the response, not loading the data itself. The more we studied this issue, the more it became clear that our central problem was GraphQL execution's bias towards depth-based recursion and its hidden costs. This led us to reconsider the entire design of GraphQL execution and develop an entirely new breadth-based model that is better optimized for our business.

Before we embark on this journey, though, let's take a moment to define what GraphQL depth and breadth mean in this context. Depth describes the static size of a GraphQL request document; it considers the number of fields selected and how they're nested. This dimension is fixed, and we expect very large document field sizes to be in the low hundreds. Breadth describes the dynamic width of the resolved data, which scales by the number of objects returned across list fields. This dimension is highly variable, and very large sizes may be in the tens or even hundreds of thousands of objects.

The hidden costs of depth traversal

Conventional GraphQL engines perform execution using depth-first traversal. What that means is that the engine descends recursively through each object's subtree before moving on to the next. In the flow above, we resolve a product in a list, then all of its child variants, then advance to the next product in the list, and then repeat. Almost every GraphQL implementation uses this depth-first pattern, including the canonical graphql-ruby gem
that we have used since 2015 and the official graphql-js spec implementation that it follows. In our experience running this execution model with Ruby, we've found that it scales poorly.

Cost: linear scale

The primary hidden cost of depth-based traversal is that it lacks the opportunity to amortize CPU-bound processing across subtrees, as seen in this stack profile. The profile shows GraphQL processing a list of one hundred products, each with one hundred child variants. We can see a distinct column pattern emerge during field execution, where each column is the slice of time spent traversing a single product's subtree. These columns are independent: subtree processing is not amortized, so the time to process 100 similarly sized products is simply the time of one multiplied by 100. This is linear time complexity that scales directly with the size of the response, and it is baked into GraphQL's conventional execution design.

Cost: field-level overhead

Linear scale then amplifies another problem: each GraphQL field execution carries some non-zero overhead cost for engine methodology, authorization, instrumentation, etc. Some of these costs are of our own making, while others are inherent to the GraphQL engine. For example, an empty field-level tracing hook running on 1k fields was about 10% slower in our stack; just adding the field wrapper created overhead. These tiny field costs are elusive and tend to slip through between profiling frames, which makes them difficult to measure in total. Even with left-heavy profiling, we incur these costs for every field of every object in depth-based execution, and this multiplicative overhead can balloon into entire seconds of CPU-bound execution time. We'll show you that below.

Cost: lazy dataloader promises

One particular field-level overhead deserves special mention: dataloader promises. Dataloaders are an essential tool for solving GraphQL's N+1 problems: rather than performing separate I/O for each of N fields, we instead resolve a promise for each field
while pooling their lookup criteria, then lazily load all criteria at once, and then fulfill each promised value. While dataloaders are good for optimizing I/O-bound performance, they come with steep memory and CPU performance tradeoffs, because they incur a bloat of promise allocations, create garbage collector (GC) backpressure, and add execution backtracking. Resolving 1k lazy fields through a graphql-batch workflow with no I/O ran 2.5x slower than the equivalent non-lazy fields in our stack.

The breadth-first hypothesis

These problems with depth-based execution led us to consider an alternative strategy: what if all field executions ran breadth-first instead? What if we performed a single pass down the request document and only executed field resolvers one time each, with an aggregated breadth of objects? To make this work, we'd change field resolvers to each receive a set of objects and return a mapped set of results. This interface is similar to Airbnb's batched resolvers, but our underlying implementation would make breadth batching a native function of the engine rather than wrapping depth traversal in dataloaders. We're also operating at a subgraph execution level, which makes this considerably different from WunderGraph's breadth batching of federated supergraph partials. Theoretically, resolvers in this breadth-based system should run longer and hotter on business logic, with no platform overhead for field repetitions. Individual fields would be implicitly batched, and multiple fields sharing I/O could run dataloaders that bind entire object sets to a single promise rather than building one-to-one promises. Simply by the napkin math, this breadth-based approach looked promising.

The napkin math

Assumption: all GraphQL fields have some non-zero overhead cost associated with their execution. For simplicity, let's round up and say this cost is 1ms, which is quite pessimistic.

Scenario: we resolve five fields (depth) across a list of 1,000 objects (breadth).

Depth-first: we call 5,000 field
resolvers (depth × breadth) and incur 5s of cost (5 × 1,000 × 1ms). Breadth-first: we call 5 field resolvers (depth only) and incur only 5ms (5 × 1ms).

Now assume each field operates lazily and returns a promise. Depth-first: we build and resolve 5,000 intermediary promises (depth × breadth). Breadth-first: we build and resolve 5 intermediary promises (depth only).

Now assume we chain a then onto the lazy promise resolution. Depth-first: we run 10,000 promise callbacks (depth × breadth × 2). Breadth-first: we run 10 promise callbacks (depth × 2).

By these basic figures, breadth-first execution should scale lists more favorably by removing our largest dimension as a multiplying factor from platform overhead costs.

From whitepaper to engine

This hypothesis led us to prototype a new GraphQL engine optimized for high-cardinality set execution: GraphQL Cardinal. The engine was built as a standalone execution wrapper around the static graphql-ruby primitives that we were already using (schemas, ASTs, etc.). The original proof of concept of Cardinal's core algorithm can be found in graphql breadth_exec. For our initial experiments, we fed flat JSON data with 5k fields into Cardinal and graphql-ruby and had each engine process the same structure back out. Cardinal's CPU-bound execution speed was 15x faster and used 90% less memory, which was very encouraging. However, these benchmarks require scrutiny, because not all requests will benefit equally from a breadth-first strategy. It's important to understand how breadth advantages scale by repetition.

The above study compares a 7-deep object subtree with varying degrees of list repetition. In the first case, with only one list item, there is no breadth repetition, and we see that depth-based execution wins out by a slim margin, which is negligible when only repeated once. However, scaling up to a list of two starts to demonstrate breadth's advantage, and this advantage scales dramatically as repetitions increase. We see a similar story when comparing memory usage. These findings are even more
pronounced when studying a single field using dataloader promises.

Testing our experimental engine in production, we fetched various-sized payloads of products and their child variants, and found that these breadth-based scale advantages clearly translated into end-to-end response time improvements: we saved over 4s of time at p50 for our largest test queries. Inspecting profiles of these tests confirmed the linear field-scaling theory. Cardinal requests were spending equal time on I/O and data staging, but were able to improve on GraphQL field execution and its neighboring garbage collection by huge margins to deliver these end-to-end time improvements.

How Cardinal breadth execution works

Now let's dig into the internals of how Cardinal performs breadth-first GraphQL execution. We'll execute through the following query, which uses a simplified version of the Shopify Admin API.

Tree building

The first step for any request is to construct an execution tree. This tree has two main primitives: scopes and fields. A scope defines a typed closure with many fields, while a field has a return type and zero-to-many child scopes. Written as pseudocode, an execution tree looks like this. Execution trees are built eagerly based on a request's statically resolvable AST; abstract positions that are not statically resolvable are omitted from the tree and get built lazily once the parent field resolves its objects. The benefit of this pattern is that execution scopes are always concretely typed and require no guesswork. An intentional constraint of this design is that an execution tree can only be navigated upward, never down.

Planning phase: lookbehind

After tree building, Cardinal runs a bottom-up planning pass heavily inspired by Grafast. During this pass, each field may consider its ancestors and register preloads and/or planning notes that may influence parent execution strategies. We offer this lookbehind pass as an alternative to lookahead, because lookahead cannot make informed choices about unresolved
abstracts below it.

Execution

Now for the main event. We've built the execution tree from top down; we've planned the tree from bottom up; now it's time to go top down again, running execution. Following the GraphQL spec, we start with a root object to resolve from and an empty hash as its result data. Note that each scope in the tree holds a set of objects and their mapped results, which start empty; these sets will get filled in as we go.

Our first execution step runs field resolvers in the root scope. Resolvers are called only once per field, with the scope's complete set of objects, and they must return a mapped set of results. In the above, a field resolver mapped the one shop object into one list of its products, which matches the schema. Next, we key the resolved data structure into the scope's results to establish list groupings, and create new empty result hashes for each object. Lastly, we flat-map out all resolved objects and their corresponding result hashes into the next scope as its objects and results. Algorithmically, this step can combine with building results so that we only traverse the resolved field data once.

One generation down; now we repeat. Breadth really starts to shine when handling merged sets, as we see here. Note that the generation ended with the next scope holding a flat mapping of all objects and results assembled before it. Flat sets are fast to process while amortizing setup work across subtrees. Finally, we'll run this sequence one more time to finish off the leaf field selection.

That's it. Or is it? You may be wondering when the response tree gets built. It's easy to miss, but we already assembled it. Looking at the root result object that we started execution with, it now looks like this: result hashes were keyed in place and passed down by reference across scopes, to be shaped during the next generation. This pattern of passing flat sets is breadth's superpower for sharing CPU-bound work cycles across list elements.

Errors

Unlike depth execution, breadth has no concept
of subtrees by which to track error paths or bubble exceptions. As a result, breadth execution generally runs to completion (aside from failed mutation fields, which always terminate early). All rescued errors are inlined into the response tree, and then a depth traversal step is added at the end to locate and report on where errors occurred. While this strategy is less surgical than depth-based execution, it may still net faster responses thanks to breadth's other performance advantages. Either way, it's a reasonable tradeoff given that 1% of Shopify's API traffic results in non-validation errors, so we chose to optimize for our majority success rate.

Engine

Another novel aspect of Cardinal's breadth-first design is that the processing engine is driven by enqueuing rather than recursion. This avoids many of the deep stack traces that GraphQL is notorious for, and contributes to reducing Cardinal's memory footprint. While Cardinal's main execution loop has grown slightly over time, it started out as a single line of code.

Migrating to breadth execution

The prospect of actually adopting this breadth paradigm was far more challenging than our blue-sky prototyping work to develop it. Our entire core monolith was built around the traditional receive-and-return-one field resolver interface, while breadth execution would require switching to receive-and-return-many. We'd need an incremental strategy to bridge this gap.

A graphql-ruby interpreter

We started by building an interpreter that would allow the Cardinal engine to puppet graphql-ruby's runtime sequence for legacy fields. While this interpreter shouldn't be any faster (it'd still need to run legacy field resolvers individually), it would allow us to run our existing stack while incrementally swapping out legacy resolvers for their faster breadth replacements. We successfully tuned this interpreter to pass our entire core test suite. It was even slightly faster at list-heavy queries by cutting out some graphql-ruby redundancies, but it consumed
more memory. This was a moment where collaboration with Claude AI really shined: presented with our memory tradeoff, Claude was able to improve the interpreter's memory efficiency by 40%. By the time we rolled out, the interpreter was slightly lighter and faster at list repetitions, and produced visible benefits for some list-heavy queries without changing any field resolvers.

Migrating tracers

Our field-level tracers, which instrument field performance and schema metrics, also scaled linearly in depth execution. An exciting outcome of migrating to breadth was that these tracers could now run only once per field selection, making them dramatically cheaper. This change required some minor strategy adjustments. For example, field timings would need to capture a single duration for a breadth resolver and then average that duration across the number of resolved objects. This was effectively how we were reporting the data anyway, so it was a relatively straightforward adaptation with much lower capture costs.

Migrating field resolvers

Now...
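The article's napkin math can be restated as simple arithmetic. This sketch only encodes the figures given above (an assumed 1ms per-field overhead, 5 fields of depth, 1,000 objects of breadth); the variable names are illustrative:

```ruby
# Napkin math from the article: per-field overhead dominates depth-first
# execution because it is multiplied by list breadth.
overhead_ms = 1      # assumed per-field overhead (the article's pessimistic figure)
depth       = 5      # fields selected in the request document
breadth     = 1_000  # objects returned across the list

depth_first_cost_ms   = depth * breadth * overhead_ms  # resolver runs per object
breadth_first_cost_ms = depth * overhead_ms            # resolver runs once per field

puts depth_first_cost_ms    # => 5000 (i.e. 5s)
puts breadth_first_cost_ms  # => 5
```

Removing breadth as a multiplying factor is the whole of the claimed win: the depth-first cost scales with the response size, while the breadth-first cost scales only with the document size.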
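The receive-many/return-many resolver interface described in the breadth-first hypothesis can be sketched as follows. This is a minimal illustration with hypothetical method names, not Cardinal's actual API:

```ruby
# Depth-first style: the engine calls the resolver once per object.
def resolve_title(product)
  product[:title]
end

# Breadth-first style: the engine calls the resolver once per field,
# passing the scope's whole object set; the resolver returns results
# mapped one-to-one onto the input objects.
def resolve_titles(products)
  products.map { |product| product[:title] }
end

products = [{ title: "Hat" }, { title: "Scarf" }]
puts resolve_titles(products).inspect  # => ["Hat", "Scarf"]
```

The mapped return contract matters: the engine can only key results back onto their source objects if the output set lines up positionally with the input set.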
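The execution-tree pseudocode the article refers to is not included in this excerpt. A rough reconstruction of the two primitives it describes (a typed scope holding fields, and a field with a return type and zero-to-many child scopes) might look like this; the structure and names are assumptions, not Cardinal's actual definitions:

```ruby
# Illustrative tree primitives: each Scope also carries the set of objects
# and mapped results that get filled in during execution.
Field = Struct.new(:name, :return_type, :child_scopes)
Scope = Struct.new(:type, :fields, :objects, :results)

variants_scope = Scope.new("ProductVariant", [Field.new("title", "String", [])], [], [])
product_scope  = Scope.new("Product",
                           [Field.new("variants", "[ProductVariant]", [variants_scope])],
                           [], [])

puts product_scope.fields.first.name  # => "variants"
```

Note how the only links run from a field down to its child scopes; consistent with the article's constraint, a scope built this way can be handed a reference to its parent and navigated upward, but nothing forces downward navigation.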
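The generation step described in the Execution section (resolve a field across the whole object set, key list groupings into the results, then flat-map objects and their result hashes into the next scope) can be sketched roughly like this. It is a deliberate simplification with hypothetical names, showing only the result-by-reference mechanic:

```ruby
# One breadth "generation": the shop object resolves to a list of products,
# and each product gets an empty result hash that is keyed into the parent
# result by reference, so later generations shape the response in place.
shop   = { products: [{ title: "Hat" }, { title: "Scarf" }] }
result = {}

objects = [shop]
results = [result]

# The resolver runs once for the whole set, returning mapped lists.
resolved = objects.map { |obj| obj[:products] }

next_objects = []
next_results = []
objects.each_index do |i|
  groups = resolved[i].map { |product| [product, {}] }
  results[i][:products] = groups.map { |(_, res)| res }  # key the list grouping
  groups.each do |(obj, res)|
    next_objects << obj  # flat set for the next generation
    next_results << res
  end
end

# The next generation resolves leaf fields across the flat set.
next_objects.each_index { |i| next_results[i][:title] = next_objects[i][:title] }

puts result.inspect
```

Because the empty hashes placed into `result[:products]` are the same objects held in `next_results`, filling them in during the leaf generation completes the response tree without any explicit assembly pass.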
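The enqueuing-driven engine described above can be illustrated with a minimal work-queue loop. This is an assumed sketch of the general technique (iterative queue processing instead of recursion), not Cardinal's actual loop:

```ruby
# Each step is a callable that may enqueue follow-up steps for child
# scopes instead of recursing into them, so stack depth stays constant
# no matter how deep the tree is.
queue   = []
visited = []

enqueue_scope = lambda do |scope|
  queue << lambda do
    visited << scope[:name]
    scope[:children].each { |child| enqueue_scope.call(child) }
  end
end

tree = { name: "Query", children: [
  { name: "products", children: [{ name: "variants", children: [] }] }
] }

enqueue_scope.call(tree)
queue.shift.call while queue.any?  # the whole engine loop, one line

puts visited.inspect  # => ["Query", "products", "variants"]
```

The single-line loop echoes the article's remark that Cardinal's main execution loop started out as one line of code; the interesting work all lives in the enqueued steps.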
Thumbnail images (randomly selected): * Images may be subject to copyright.
  • Shopify
  • Shopify
  • Shopify
  • California Consumer Priva...

Verified site has: 16 subpage(s). Do you want to verify them? Verify pages:

1-5 6-10 11-15 16-16


Pages verified in the last hours (randomly selected):


Top 50 hashtags from all verified websites.

Supplementary Information (add-on for SEO geeks)* - see more on header.verify-www.com

Header

HTTP/1.1 301 Moved Permanently
Date Sun, 19 Apr 2026 09:04:35 GMT
Content-Type text/html
Content-Length 167
Connection close
Cache-Control max-age=3600
Expires Sun, 19 Apr 2026 10:04:35 GMT
Location htt????/shopify.engineering/faster-breadth-first-graphql-execution
Vary Accept-Encoding
X-Content-Type-Options nosniff
Server cloudflare
CF-RAY 9eeabe1cad9bd0be-CDG
alt-svc h3= :443 ; ma=86400
HTTP/2 200
date Sun, 19 Apr 2026 09:04:35 GMT
content-type text/html; charset=utf-8
cache-control max-age=900, stale-while-revalidate=86400
set-cookie _shopify_essential_=54b8d053-f97d-4403-b2f5-a900ddb2c6c4; Domain=shopify.engineering; Path=/; Expires=Mon, 19 Apr 2027 09:04:35 GMT; Secure; SameSite=Lax
report-to endpoints :[ url : https:\/\/a.nel.cloudflare.com\/report\/v4?s=r9g1tczZRX9RQ3SVt2HMOU7co7S1gbIU6MmaAhkiN9Ayyb%2BCAnn4PurAqjUn7py2wt5SG02xwP5tTZbe6wemLMykm9OmAaZkrvSe2nkLzl5uOMBYzvyjhSW0tFqcDzuADQIMSq0%3D ], group : cf-nel , max_age :604800
nel success_fraction :0.01, report_to : cf-nel , max_age :604800
vary Accept-Encoding
strict-transport-security max-age=15552000; includeSubDomains; preload
x-content-type-options nosniff
server-timing cfRequestDuration;dur=0.000000
content-encoding gzip
cf-cache-status BYPASS
set-cookie __cf_bm=5uJRIco5aJWYRKpTbMAm4o5d6NmDGjiBdGHb4SQTwwo-1776589475-1.0.1.1-VraCEbeaUEs4Xr_4DMkkW0Yvh9YoNibJ3nszU6W1uOJhr37TQYoLq.HqXskr2sjegUAebs1OjluBfOlgeiAvhy4ojkf60YkJXb0lZFL1dP0; path=/; expires=Sun, 19-Apr-26 09:34:35 GMT; domain=.shopify.engineering; HttpOnly; Secure; SameSite=None
server cloudflare
cf-ray 9eeabe1cee269ee7-CDG
alt-svc h3= :443 ; ma=86400

Meta Tags

title="Shopifys journey to faster breadth-first GraphQL execution (2026) - Shopify"
charset="utf-8"
name="viewport" content="width=device-width,initial-scale=1"
http-equiv="Accept-CH" content="ECT, Save-Data, Device-Memory, Downlink"
name="description" content="We questioned why conventional GraphQL execution incurs hidden costs, and rewrote it in a faster breadth-first manner to avoid them."
property="fb:pages" content="20409006880"
property="fb:app_id" content="847460188612391"
property="og:type" content="website"
property="og:site_name" content="Shopify"
property="og:title" content="Shopify’s journey to faster breadth-first GraphQL execution (2026) - Shopify"
property="og:description" content="We questioned why conventional GraphQL execution incurs hidden costs, and rewrote it in a faster breadth-first manner to avoid them."
property="og:image" content="htt????/cdn.shopify.com/b/shopify-brochure2-assets/5ef823c3401ae6666288549de234aa84.png"
property="twitter:image" content="htt????/cdn.shopify.com/b/shopify-brochure2-assets/5ef823c3401ae6666288549de234aa84.png"
property="og:url" content="htt????/shopify.engineering/faster-breadth-first-graphql-execution"
property="twitter:card" content="summary_large_image"
property="twitter:site" content="Shopify"
property="twitter:account_id" content="17136315"
property="twitter:title" content="Shopify’s journey to faster breadth-first GraphQL execution (2026) - Shopify"
property="twitter:description" content="We questioned why conventional GraphQL execution incurs hidden costs, and rewrote it in a faster breadth-first manner to avoid them."
itemprop="mainEntityOfPage" content="//faster-breadth-first-graphql-execution"
itemprop="dateModified" content="2026-03-12T12:19:08.000Z"
itemprop="name" content="Shopify"
itemprop="url" content="htt????/cdn.shopify.com/assets/images/logos/shopify_logo_black.png"
itemprop="width" content="210"
itemprop="height" content="60"

Load Info

page size: 83340
load time (s): 0.833287
redirect count: 1
speed download: 100048
server IP: 172.64.149.196
* all occurrences of the string "http://" have been changed to "htt???/"