We are happy to announce ql.io – a declarative, evented, data-retrieval and aggregation gateway for HTTP APIs. Through ql.io, we want to help application developers increase engineering clock speed and improve end user experience. ql.io can reduce the number of lines of code required to call multiple HTTP APIs while simultaneously bringing down network latency and bandwidth usage in certain use cases. ql.io consists of a domain-specific language inspired by SQL and JSON, and a node.js-based runtime that processes scripts written in that language. Check out ql.io on Github for the source, and the ql.io site for demos, examples, and docs.


HTTP-based APIs – some call them services – are an integral part of eBay’s architecture. This is true not just for eBay, but for most companies that use the Web for content and information delivery. Within eBay’s platform engineering group, we noticed several pain points for application developers attempting to get the data they need from APIs:

  • Most use cases require accessing multiple APIs – which involves making several network round trips.
  • Often those API requests have interdependencies – which requires programmatic orchestration of HTTP requests – making some requests in parallel and some in sequence to satisfy the dependencies and yet keep the overall latency low.
  • APIs are not always consistent as they evolve based on the API producers’ needs – which makes code noisier in order to normalize inconsistencies.

We found that these issues have two critical impacts: engineering clock speed and end user experience.

  • Engineering clocks slow down because developers need to account for dependencies between API calls, and to arrange those calls to optimize overall latency. Implementing orchestration logic involves multi-threaded fork-join code, leads to code bloat, and distracts from the main business use case that the developer is striving to support.
  • End user experience suffers due to high bandwidth usage as well as the latency caused by the number of requests and the processing overhead of non-optimized responses from APIs.
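To make the orchestration cost concrete, here is a hand-rolled sketch of the kind of code the first bullet describes: one call, then two dependent calls in parallel, then a manual join. The three API functions are simulated stand-ins with assumed shapes, not real eBay endpoints.

```javascript
// Hand-rolled orchestration: one call, then two dependent calls in
// parallel, then a manual join. The three API functions below are
// simulated stand-ins (hypothetical shapes), not real eBay endpoints.
function findProducts(keywords, cb) {
  setImmediate(function () { cb(null, { productId: '108703101' }); });
}
function getDetails(productId, cb) {
  setImmediate(function () { cb(null, { id: productId, title: 'MacBook Pro' }); });
}
function getReviews(productId, cb) {
  setImmediate(function () { cb(null, { id: productId, rating: 4.5 }); });
}

function fetchProduct(keywords, done) {
  findProducts(keywords, function (err, product) {
    if (err) return done(err);
    var pending = 2, details, reviews, failed = false;
    function join(e) {
      if (failed) return;
      if (e) { failed = true; return done(e); }
      if (--pending === 0) {
        // Manual join and projection of the two parallel responses.
        done(null, { id: details.id, title: details.title, rating: reviews.rating });
      }
    }
    getDetails(product.productId, function (e, d) { details = d; join(e); });
    getReviews(product.productId, function (e, r) { reviews = r; join(e); });
  });
}
```

Even with the API calls stubbed out, the bookkeeping – pending counters, error flags, partial results held in closure variables – dwarfs the actual business logic. This is the code bloat described above.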

The goal of ql.io is to ease both pain points:

  • By using a SQL- and JSON-inspired DSL to declare API calls, their interdependencies, forks and joins, and projections, you can cut down the number of lines of code from hundreds of lines to a few, and the development time from entire sprints to mere hours. Using this language, you can create new consumer-centric interfaces that are optimized for your application’s requirements.
  • You can deploy ql.io as an HTTP gateway between client applications and API servers so that ql.io can process and condense the data to just the fields that the client needs. This helps reduce the number of requests that the client needs to make as well as the amount of data transported to clients.

A quick taste

Here is a typical example of ql.io usage. It shows how ql.io can transform the experience of a developer getting the data needed to paint the UI in a native application.

prodid = select ProductID[0].Value from eBay.FindProducts where
    QueryKeywords = 'macbook pro';
details = select * from eBay.ProductDetails where
    ProductID in ('{prodid}') and ProductType = 'Reference';
reviews = select * from eBay.ProductReviews where
    ProductID in ('{prodid}') and ProductType = 'Reference';

return select d.ProductID[0].Value as id, d.Title as title,
    d.ReviewCount as reviewCount, r.ReviewDetails.AverageRating as rating
    from details as d, reviews as r
    where d.ProductID[0].Value = r.ProductID.Value
    via route '/myapi' using method get;

This script uses three API calls (in this case, all offered by eBay) to get four fields of products that match a keyword. The result is exposed as a new HTTP resource with the URI http://{host}:{port}/myapi. See the guide to build this example yourself, or copy and paste the above script into ql.io's Web Console to see it in action.
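The final `return select` is an equi-join with a projection: it matches `details` rows to `reviews` rows on ProductID and keeps four fields. In plain JavaScript, the same join over two simplified, assumed response shapes would look roughly like this:

```javascript
// Equi-join of the details and reviews responses on ProductID, with a
// four-field projection -- the same shape as the script's final select.
// The input objects are simplified assumptions, not real API payloads.
var details = [
  { ProductID: [{ Value: '108703101' }], Title: 'MacBook Pro', ReviewCount: 37 }
];
var reviews = [
  { ProductID: { Value: '108703101' }, ReviewDetails: { AverageRating: 4.5 } }
];

var joined = [];
details.forEach(function (d) {
  reviews.forEach(function (r) {
    if (d.ProductID[0].Value === r.ProductID.Value) {
      joined.push({
        id: d.ProductID[0].Value,
        title: d.Title,
        reviewCount: d.ReviewCount,
        rating: r.ReviewDetails.AverageRating
      });
    }
  });
});
```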

While we are still working on various benchmarks, we want to share some early results on developer productivity and end user benefits. One of the teams at eBay recently migrated an application that relies solely on eBay’s APIs to get the data needed to paint its UI. The first diagram below shows the request-response traces before migrating to ql.io.

Before migrating to ql.io

The code related to these API calls was about 2800 lines long. The diagram below shows the request-response traces after migrating API access to ql.io.

After migrating to ql.io

This effort brought the code down to about 1200 lines, in addition to reducing the number of requests from 18 to 5 and the data size from 274k to 91k. In this experiment, the latency drop was not significant, as the client application was on a broadband connection and some of the APIs involved were inherently slow.

How to use ql.io

ql.io is not intended to replace frameworks that are currently used to build HTTP APIs. API producers can continue to use existing frameworks to offer interfaces that are generic and broadly reusable. ql.io comes into play when a consumer of APIs wants to implement consumer-specific aggregation, orchestration, and optimizations. In other words, while existing frameworks continue to support “producer-controlled” interfaces, you can use ql.io to create “consumer-controlled” interfaces.

We are building ql.io with flexible deployment in mind. Depending on where the network costs are felt, you can deploy ql.io closer to API servers, closer to users on the edge, or even on front-end machines.

Deploying closer to API servers

The primary usage of ql.io is to run it as a gateway at the reverse-proxy tier, potentially between your load balancers and API servers.

ql.io as a gateway on the reverse-proxy tier

Deploying closer to client applications

A secondary usage is to deploy ql.io closer to client applications, on the edge.

ql.io on the edge

Edge-side deployment can further reduce network costs for client applications by pushing API orchestration closer to those applications. Where API servers are globally distributed and the best place for aggregation is closer to client applications, edge-side deployment may yield significant gains. If you are a developer using third-party APIs, you can follow the same pattern and deploy ql.io on your own infrastructure, closer to your applications.

Deploying on the front end

Our choice of Javascript and node.js for building ql.io provides an additional deployment option: front-end applications built on node.js can use ql.io programmatically.

ql.io in node.js front-end apps

Why node.js

Early on, one of the critical choices that we had to make was the software stack. We had two choices: Should we go with the proven Java stack that has full operational support within eBay? Or should we choose a stack like node.js with its excellent support for async I/O, but which was not yet proven when we started the project? Moreover, very few companies had operational experience with node.js. This was not an easy choice to make. In our deliberations, we considered the following systemic qualities, in their order of importance:

  • Performance and scalability for I/O workloads. A significant percentage of the work performed during script execution is I/O bound; CPU load is limited to in-memory tasks like joins and projections. Blocking I/O was out of the question for supporting such workloads.
  • Operability. We need to be able to monitor the runtime, know what is going on, and react quickly when things go wrong. Furthermore, integrating with eBay’s logging and monitoring tools is a prerequisite for bringing in a new technology stack.
  • Low per-connection memory overhead. Since script execution involves some slow and some fast APIs, we need the stack to remain stable as the number of open connections increases.
  • Dynamic language support. This consideration had two parts. We wanted to build ql.io very quickly, with a very small team and low code-to-execution turnaround times; this helps us iterate rapidly in the face of bugs as well as new use cases. In addition, we wanted application developers to be able to extend ql.io’s processing pipeline with small snippets of code.

After some analysis and prototyping, we chose Javascript as the language and node.js as the runtime stack. Here are some highlights of our experience so far:

  • Javascript and node.js allowed us to iterate very rapidly. Though we were initially concerned about finding the right tools and libraries, the node.js ecosystem proved sufficient for us to build as complex a system as ql.io.
  • We were able to tune a regular developer-quality Ubuntu workstation to handle more than 120,000 active connections per node.js process, with each connection consuming about 2 KB of memory. We knew we could push the number of connections further; although we did not spend the time to do so, these results gave us the confidence to proceed with node.js.
  • ql.io’s core engine does automatic fork-join of HTTP requests by using compile-time analysis of scripts. Node’s evented I/O model freed us from worrying about the locking and concurrency issues that are common with multithreaded async I/O.
  • We did pay some operationalization tax while we prepared the ql.io and node.js stack for integration with eBay’s monitoring and logging systems. This was a one-time cost.
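A toy illustration of that compile-time idea: treat each `{var}` reference in a statement as a dependency, and start every statement whose dependencies have already resolved. This is a hypothetical sketch of the approach, not ql.io's actual engine code.

```javascript
// Toy dependency-driven scheduler. Statements that reference {var}
// wait for that variable; independent statements run concurrently.
// Hypothetical sketch only -- not ql.io's actual implementation.
function run(statements, done) {
  var results = {};
  var remaining = Object.keys(statements).length;

  function depsMet(name) {
    var refs = statements[name].text.match(/\{(\w+)\}/g) || [];
    return refs.every(function (m) {
      return m.slice(1, -1) in results;
    });
  }

  function step() {
    Object.keys(statements).forEach(function (name) {
      var s = statements[name];
      if (!s.started && depsMet(name)) {
        s.started = true;
        s.exec(results, function (value) {
          results[name] = value;
          remaining -= 1;
          if (remaining === 0) return done(results);
          step(); // dependents of this result may be runnable now
        });
      }
    });
  }
  step();
}

// Mirrors the announcement's script: details and reviews both depend
// on prodid, so they start in parallel once prodid resolves.
var statements = {
  prodid: {
    text: "select ProductID[0].Value from eBay.FindProducts where QueryKeywords = 'macbook pro'",
    exec: function (results, cb) { setImmediate(function () { cb('108703101'); }); }
  },
  details: {
    text: "select * from eBay.ProductDetails where ProductID in ('{prodid}')",
    exec: function (results, cb) { setImmediate(function () { cb('details for ' + results.prodid); }); }
  },
  reviews: {
    text: "select * from eBay.ProductReviews where ProductID in ('{prodid}')",
    exec: function (results, cb) { setImmediate(function () { cb('reviews for ' + results.prodid); }); }
  }
};
```

Because the scheduler only looks at variable references, the fork-join structure falls out of the script itself; no explicit threading or locking code is needed in the evented model.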

What’s next

We’re not done with ql.io yet, and we want to continue to develop it in the open. Go to ql.io on Github and the ql.io site to find out more, try it out, discuss it, and participate.

55 thoughts on “Announcing ql.io”


  11. Mohan

    I was wondering if ql.io or node.js can be used to build a client. It might not be the intended purpose, but I am looking for APIs to quickly write workload simulators that generate XML message load and hit my netty servers.

  13. Fizwhiz

    I’m considering a Proof of Concept app that uses QL.IO as some sort of an orchestration engine. My POC requires certain NPM modules that have persistence (for journaling purposes) and caching. Assuming these modules currently exist through some 3rd party and are perfectly compatible with Node 0.6.x, will I be able to leverage the functionality of these modules through QL.IO built on that specific version of node automatically? Or am I required to cut a branch from the QL.IO code base and write some kind of wrappers myself? What’s the best place to see the anatomy of QL.IO to see the direct translation of SQL-ish code to js?


  20. Rob Mullen

    Does ql.io use the popular async or step node modules under the covers for asynchronous service-call aggregation, or did you guys roll your own? I tried searching the source code but got lost, and github search stinks.

  21. D Jones

    Very interesting. Would this provide any efficiencies for applications using the eBay API to search for newly listed items rapidly (several times a second) per search term, in order to increase the chances that the server you hit has the newly listed item before it is replicated across all systems – therefore seeing items faster than others and being able to buy them with higher success?

    Your thoughts are appreciated in advance,


  22. Tim Romano

    These new tricks sound interesting, but I am an old dog, and I question the relative efficiency of an amalgamated request vs. several discrete requests, when the discrete requests are “paced” – not occurring rapidly, one hard upon the other. Why retrieve in one fell swoop the product detail and consumer reviews for ALL products that satisfy the query? That data might never be needed by the user. Let’s say 10 products, or better yet, 100 products, satisfy the query. Would you bring back reviews and product detail for all 100? Why not wait until the user touches a particular product before fetching its detail and reviews? Are several discrete requests, one every few seconds (human fingers can only work so fast), more parsimonious of server resources than a single gargantuan request? Is there less overall overhead in “look ahead” than in “on-demand”? Is the arbitrary limit of “up to 5 products” an artificial limit that would not be likely to apply in a real-life scenario?


  25. Sameer Naik


    Can you describe the Ubuntu tuning that allowed you to support 120,000 concurrent connections per process? Were these 120,000 concurrent TCP connections?



  34. Dustin Jones

    This sounds interesting. Our software application targets buyers purchasing newly listed Buy It Now items on eBay via the API, using a “ton” of API calls (several a second) to poll for items. Research shows that the more we call out, the better chance we have of hitting an alternate eBay server where a newly listed item has already been replicated, thus finding it before others.

    Would there be any useful benefit to utilizing this technology?

