Don’t Build Pages, Build Modules

We are in an interesting phase of rethinking frontend engineering across eBay Marketplaces, and this blog summarizes where we are heading.

Modular programming is a fundamental design technique that has been practiced since the dawn of software engineering. It is still the most recommended pattern for building maintainable software, and the Node.js community fully embraces this design philosophy. Most Node.js modules are created using smaller modules as building blocks to achieve the final goal. Now with Web Components gaining momentum, we decided to change our approach toward building frontend web applications.

Modularity was already in our frontend codebase, but only in the scope of a particular language. Most of our shared JavaScript (overlays, tabs, carousel, etc.) and CSS (buttons, grid, forms, icons, etc.) were written in a modular fashion. This is great, but when it comes to a page or a view, the thinking was still about building pages, as opposed to building UI modules. With this mindset, we found that as the complexity of pages grew, it became exponentially more difficult to maintain them. What we wanted was a simple way to divide a page into small and manageable pieces, and to develop each piece independently. This is when we came up with the notion, “Don’t build pages, build modules.”

Modular thinking

In general, everyone understands and agrees with the concept of frontend modules. But to make the concept a reality, we needed to deviate from our current style of web development.

Decomposition: First, we wanted to move away from the idea of directly building a page. Instead, when a requirement comes in the form of a page, we decompose it into logical UI modules. We do so recursively until each module is FIRST (Focused, Independent, Reusable, Small, and Testable). This means a page is made up of a few top-level modules, which in turn are built from sub-modules, very similar to how JavaScript modules are built in the Node.js world. There are common styles and JavaScript (e.g., jQuery) that all modules depend on. These files together become a module of their own (e.g., a base module) and are added as dependencies of the other modules. Engineers then work independently on these modules, and a page is nothing more than a grid that finally assembles them.
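As an invented illustration, a search results page might decompose into a tree along these lines (module names are made up):

```text
page (a grid assembling top-level modules)
├── base        (shared CSS and jQuery; a dependency of every module)
├── header
│   ├── search-bar
│   └── nav-menu
└── gallery
    ├── slider
    └── thumbnail
```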

[Figure: a page decomposed into UI modules]

DOM encapsulation: We wanted every frontend module to be associated with a DOM node, with that node serving as the module's root element. So we place all client-side JavaScript behavior (event binding, DOM querying, jQuery plugin triggers, etc.) within the scope of the module's root element. This gives our modules perfect encapsulation by making them restrictive and truly independent. Obviously we needed some sort of JavaScript abstraction to achieve this encapsulation, and we decided to go with the lightweight widgets functionality (named Marko Widgets) offered by RaptorJS. Marko Widgets is a small module (~4 KB) that provides a simple mechanism for instantiating widgets and binding them to DOM elements. Instantiated widgets are made observable, and can also be destroyed when they need to be removed from the DOM. To bind a JavaScript module to a DOM node, we simply use the w-bind directive as shown below:

<div class="my-module" w-bind="./my-module-widget.js"> ... </div>

When rendered, the Marko Widgets module tracks which modules are associated with which DOM nodes, and automatically binds the behavior after the HTML is added to the DOM. Some of our modules, like our tracking module, have JavaScript functionality but no DOM node association. In those scenarios, we use the <noscript> tag to achieve encapsulation. With respect to CSS, we namespace all class names within a module with the root element's class name, separated by a hyphen: gallery-title, gallery-thumbnail, etc.
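For instance, a gallery module's stylesheet would scope every rule under the root class name (selectors invented for illustration):

```css
/* gallery.less: every class is prefixed with the module's root class name */
.gallery           { position: relative; }
.gallery-title     { font-size: 18px; }
.gallery-thumbnail { float: left; }
```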

Packaging: The next big question was how to package the modules. Frontend package management has always been challenging, and is a hotly debated topic. Packaging a single asset type is quite straightforward. For instance, in JavaScript, once we nail down a module pattern (CommonJS, AMD, etc.), packaging becomes easy with tools like browserify. The problem arises when we need to bundle other asset types like CSS and markup templates. Here, our in-house Raptor Optimizer came to the rescue. The optimizer is a JavaScript module bundler very similar to browserify or webpack, but with a few differences that make it ideal for our module ecosystem. All it needs is an optimizer.json file in the module directory listing the CSS and markup template (dust or marko) dependencies. For JavaScript dependencies, the optimizer scans the source code in the current directory and resolves them recursively. Finally, an ordered, de-duped bundle is inserted into the CSS and JavaScript slots of the page – for example:

[
    "./base",
    "gallery.less",
    "gallery.html" 
]

Note that the markup templates will be included only when rendering on the client side. Including them otherwise will unnecessarily increase JavaScript file size.

File organization

Going modular also meant changing the way files are structured. Before applying modularity to the frontend codebase, teams would typically create separate top-level directories for JavaScript, CSS, images, fonts, etc. With the new approach, it made sense to group all files associated with a module under the same directory, using the module name as the directory name. This practice raised some concerns initially, mainly around violating a proven file-structuring scheme and around tooling changes related to bundling and CDN pushing. But engineers quickly came to an agreement, as the advantages clearly outweighed the disadvantages. The biggest benefit is that the new structure truly promotes module-level encapsulation: all module-associated files live together and can be packaged as a unit. In addition, any action on a module (deleting, renaming, refactoring, etc., all of which happen frequently in large codebases) becomes super easy.

[Figure: file structure before and after modularization]

Module communication

We wanted all of our modules to follow the Law of Demeter – meaning two independent modules cannot directly talk to each other. The obvious solution was to use an event bus for communication between client-side modules. We evaluated various eventing mechanisms, with the goals of keeping it centralized and not introducing a large library dependency. Surprisingly, we settled on the eventing model that comes with jQuery itself. jQuery's trigger, on, and off APIs do a fantastic job of abstracting away all eventing complexities, for both DOM and custom events. We wrote a small dispatcher wrapper, which handles interactions between modules by triggering and listening to events on the document element:

(function($) {
    'use strict';
    var $document = $(document.documentElement);

    // Create the dispatcher
    $.dispatcher = $.dispatcher || {};

    var dispatcherMethods = {
        trigger: function(event, data, elem) {
            // If an element is provided, trigger from that element;
            // otherwise trigger from the document element
            if (elem) {
                return $(elem).trigger(event, data);
            }
            return $document.trigger(event, data);
        },

        on: function(event, callback, scope) {
            return $document.on(event, $.proxy(callback, scope || $document));
        },

        off: function(event) {
            return $document.off(event);
        }
    }; // dispatcherMethods end

    // Attach the dispatcher methods to $.dispatcher
    $.extend(true, $.dispatcher, dispatcherMethods);
})(jQuery);

Modules can now use the $.dispatcher to trigger and listen to custom events without having any knowledge about other modules. Another advantage of using the jQuery DOM-based eventing model is that we get all event dynamics (propagation and name-spacing) for free.

// Module 1 firing a custom event 'sliderSwiped'
$.dispatcher.trigger('sliderSwiped', {
    activeItemId: 1234
});

// Module 2 listening on 'sliderSwiped' and performing an action
$.dispatcher.on('sliderSwiped', function(evt, data) {
    fetchItem(data.activeItemId);
});

Some teams prefer to create a centralized mediator module to handle the communication. We leave that to engineers’ discretion.
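As a sketch of that mediator idea, here is a minimal, dependency-free event bus (hypothetical code, not our production dispatcher) that mirrors the dispatcher's API surface without pulling in jQuery:

```javascript
// A minimal, dependency-free mediator sketch (hypothetical; teams may
// structure theirs differently).
var mediator = (function() {
    var handlers = {}; // event name -> array of callbacks

    return {
        on: function(event, callback) {
            (handlers[event] = handlers[event] || []).push(callback);
        },
        off: function(event) {
            // Remove all listeners for the event, like the dispatcher's off()
            delete handlers[event];
        },
        trigger: function(event, data) {
            // Iterate over a copy so handlers added or removed mid-dispatch
            // do not affect this dispatch cycle
            (handlers[event] || []).slice().forEach(function(cb) {
                cb(data);
            });
        }
    };
})();

// Module 2 subscribes; module 1 publishes. Neither knows about the other.
var lastItemId = null;
mediator.on('sliderSwiped', function(data) { lastItemId = data.activeItemId; });
mediator.trigger('sliderSwiped', { activeItemId: 1234 });
// lastItemId is now 1234
```

A wrapper like this is one way to answer the "why jQuery?" question: the publish/subscribe contract stays the same, so the underlying eventing mechanism can be swapped later.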

Multiscreen and view model standardization

One of the biggest advantages of frontend modules is that they fit perfectly into the multiscreen world. Flows change based on device dimensions, and making an entire page work, either responsively or adaptively, on all devices is not practical. But with modules, things fall into place. When engineers finalize the modules in a view, they also evaluate how those modules look and behave across various screen sizes. Based on this evaluation, the module name and the associated view model JSON schema are agreed upon. The implementation of a module, however, depends on the device. For some modules, a single responsive implementation is sufficient to work across all screens. For others, the design and interactions (touch or no-touch) are completely different, thus requiring different implementations. However different the implementations may be, the module name and the view model powering it remain the same.

We extended this concept to the native world as well, where iOS and Android apps need the same modules and view models, but the implementation remains native (Objective-C or Java) to the platform. All clients talk to the frontend servers, which are cognizant of the modules a particular user agent needs and respond with the appropriate view model chunks. This approach gave us a perfect balance of consistency and good user experience (by not compromising on the implementation). Companies like LinkedIn have already implemented a view-based JSON model that has proved successful. The granularity of the view model is decided by engineers and product managers together, depending on how much control they need over the module. The general guideline is to make the view model's JSON as smart as possible and the modules dumb (or thin), thus providing a central place from which to control all clients.
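As an invented example, the same gallery module on desktop web, mobile web, and native apps could all be powered by one view model chunk along these lines (field names are hypothetical):

```json
{
    "gallery": {
        "title": "Similar items",
        "items": [
            {
                "id": 1234,
                "imageUrl": "http://example.com/images/1234.jpg",
                "price": "$19.99"
            }
        ]
    }
}
```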

Associated benefits

All of the other benefits of modular programming come for free:

  • Developer productivity – engineers can work in parallel on small, contained pieces, resulting in a faster pace.
  • Unit testing – each module can be tested in isolation, which has never been easier.
  • Debugging – it's easy to isolate a problem, and even if one module is faulty, the others remain intact.

Finally, this whole approach takes us closer to the way the web is evolving. Our idea of bundling HTML, CSS, and JS to create an encapsulated UI module puts us on a fast track to adoption. We envision an ideal future where all of our views, across devices, are a bunch of web components.

Conclusion

As mentioned earlier, we are indeed in the process of rethinking frontend engineering at eBay, and modularization is one of the first steps resulting from that rethinking. Thanks to my colleagues Mahdi Pedramrazi and Patrick Steele-Idem for teaming up and pioneering this effort across the organization.

– Senthil
Frontend Engineer

18 thoughts on “Don’t Build Pages, Build Modules”

  1. Tony Topper

    This is very similar to what we’ve been doing with Largo, our prototyping platform over in design. We’re using Dust and Grunt though as opposed to Marko and RaptorJS optimizer.

    I’ve been pushing our team towards the Law of Demeter and using an event bus as well. Which was a correction to some mistakes we noticed in some earlier prototypes.

    Really excited to see some improvement here at ebay on the front-end side! I’d love for our team to help.

  2. Janardhan

    Nice article!
    Under the modules I see the .html files – are they just template markup? If not, are you using shadow DOM?

  3. Robinson Raju

    Great article. I always look forward to reading your articles, Senthil.
    How different is this approach to portlets which were self contained modules? We had CCHP in the past which had portlets, I guess. Also does this mean that you’d be loading ‘n’ JS/CSS files if there are ‘n’ modules in a page as opposed to one?

    1. Senthil Padmanabhan Post author

      Thanks Robin. Yes, the concept is very similar, but portlets were both frontend and backend; this approach is purely frontend. Also, portlets did not have the concepts of DOM encapsulation, packaging, communication, etc.
      There will be only one JS and one CSS request; the files get aggregated during the build step, and just one file is present in the production version. In the dev environment we load separate files.
      Also, programming is a cycle – old things become new again. Like how functional programming, done 30 years ago, is the new cool thing as opposed to OOP 🙂

  4. Pingback: In the News: 2014-10-05 | Klaus' Korner

  5. Pingback: Links & Reads for 2014 Week 40 | Martin's Weekly Curations

  6. Ramesh

    Hey, nice article. Front-end dependency management is indeed hotly debated. Just curious why you depend on jQuery to propagate custom events, when plain old JavaScript itself can propagate events.

    1. Senthil Padmanabhan Post author

      Agreed, and that is one of the reasons we have it in a centralized place. Once we nail down a library-free eventing mechanism, we will start using it instead.

  7. Yoel Damas

    Considering I am in the process of starting my own app company, this blog was extremely helpful.

  8. Stephen

    Interesting read — it seems to be in-line with many of the best practices prescribed around the internet.

    I’m curious how you go about packaging your applications for deployment.

    1) How do you package your views and modules for deployment?
    2) Is it possible to deploy only the code for a module that changes?
    3) What kind of testing is done when a module is updated? Do you test just the module, or is the entire view regression tested? (I’m assuming automation in this regard, is this true?)

    1. Senthil Padmanabhan Post author

      1) How do you package your views and modules for deployment?
      >>> The complete packaging and deployment is taken care of by our open-sourced tool Raptor Optimizer https://github.com/raptorjs/optimizer

      2) Is it possible to deploy only the code for a module that changes?
      >>> It again depends on how you package your dependencies. Currently a new bundle is rolled out with every release. We will revisit this in the future.

      3) What kind of testing is done when a module is updated? Do you test just the module, or is the entire view regression tested? (I’m assuming automation in this regard, is this true?)
      >>> We are using an in-house testing framework that uses mocha and PhantomJS. All unit tests are module-based; entire-view regression testing happens through Selenium automation.

  9. Vincent

    Why don’t you use react and a flux architecture e.g. with alt.js ? The component based thinking seems to fit. 🙂

  10. Amit

    This is how we used to build modules when I was at Yahoo a few years ago. Seems like the rest of the web is catching up.

Comments are closed.