Tim’s Dream for the Web

Even before I got myself this username (timdream) and polluted the search results, searching for that term would give you a link to a page on LogicError, appropriately titled “Tim’s Dream for the Web”. That “Tim” is Sir Tim Berners-Lee, and he imagined the Web as a powerful means of collaboration, not only between people but also machines.

Weaving the Web, credit: W3C.


Collaboration between people is closer than ever because of the Web. It has already grown closer thanks to the set of technologies branded as “HTML5”, and these technologies will only grow stronger, hopefully beyond the catchphrase. As for the machines, a form of the Semantic Web is indeed slowly emerging, though not in the ideal open, collaborative way but in a top-down, proprietary approach, limited to a centralized design in which many are asked to interface with a few proprietary platforms that encapsulate all the data.

It’s a good thing for people to start enjoying what is possible when data is linked; yet having it linked in a more open way, while safeguarding our privacy, remains the work to be done. This week we saw the 20th anniversary of the W3C, and the HTML5 spec reaching Recommendation status; the events themselves are testimony to the rough history of the Web. I remain cautiously optimistic on this one: the organizations rallying on this front are largely intact, and people will eventually become more aware of the danger as they enjoy the convenience of the technologies.

The LogicError page had been in my bookmarks since a long time ago, around Firefox 0.8-ish, but not until now did I realize LogicError is in fact an Aaron Swartz project. The footnote itself, with his name, kind of says something.

Use Promise, and what to watch out for if you don’t

If you do know Promise, consider the following code; do you know the order of the resulting logs? (Answered below.)

var p = new Promise(function(resolve, reject) {
  console.log(1);
  resolve();
});
p.then(function() { console.log(2); });
console.log(3);
setTimeout(function() { console.log(4); });
p.then(function() { console.log(5); });
console.log(6);
setTimeout(function() { console.log(7); });
console.log(8);

The Promise interface

The Promise interface is one of the few generic interfaces that graduated from being a JavaScript library to being a Web platform API (the other being the JSON interface). You already know about it if you have heard of Q, or Future, or jQuery.Deferred. They are similar, if not identical, things under different names.

Promise offers better asynchronous control to JavaScript code. It offers a chainable interface, where you can chain your failure and success callbacks to run when the Promise instance is “rejected” or “resolved”. Any asynchronous call can easily be wrapped into a Promise instance by calling the actual interface inside the synchronous callback function passed when constructing the instance.
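A minimal sketch of that wrapping pattern (the Node-style readFileAsync helper below is hypothetical, standing in for any callback-based asynchronous API):

```javascript
// Hypothetical callback-style API, standing in for any asynchronous call.
function readFileAsync(name, callback) {
  setTimeout(function() { callback(null, 'contents of ' + name); }, 0);
}

// Wrap it in a Promise: call the actual interface inside the synchronous
// function passed to the constructor, and translate its callback into
// resolve/reject.
function readFilePromise(name) {
  return new Promise(function(resolve, reject) {
    readFileAsync(name, function(err, data) {
      if (err) { reject(err); } else { resolve(data); }
    });
  });
}

readFilePromise('a.txt').then(function(data) {
  console.log(data); // 'contents of a.txt'
});
```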

The ability to chain calls might not be a reason appealing enough for the switch; what I find indispensable is Promise.all(): it manages all the Promise instances on your behalf and “resolves” the returned promise when all the passed instances are resolved. It’s great if you want to run multiple asynchronous actions in parallel (loading files, querying databases) and do your thing only when everything has returned. (The other utility being Promise.race(); however, I’ve not found a use case for myself yet.)
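A sketch of that parallel pattern, with a hypothetical delay() helper standing in for the file loads or database queries:

```javascript
// Hypothetical asynchronous action: resolves with `value` after `ms`.
function delay(ms, value) {
  return new Promise(function(resolve) {
    setTimeout(function() { resolve(value); }, ms);
  });
}

// Run both "actions" in parallel; the .then() callback runs only after
// every passed promise has resolved, and the results keep the input order
// regardless of which promise settled first.
Promise.all([delay(10, 'file'), delay(20, 'database')])
  .then(function(results) {
    console.log(results); // ['file', 'database']
  });
```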

Keep in mind there is one caveat: compared to EventTarget callbacks (i.e. event handlers), this in all Promise callbacks is always window. You should wrap your own function with bind() for a specific context.
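For example (a sketch; the logger object is hypothetical):

```javascript
var logger = {
  prefix: 'got:',
  format: function(data) { return this.prefix + ' ' + data; }
};

// Wrong: `format` is detached from `logger`, so `this` inside it is not
// `logger` (it is window in a browser):
// Promise.resolve('data').then(logger.format);

// Right: bind the context explicitly.
Promise.resolve('data')
  .then(logger.format.bind(logger))
  .then(function(line) {
    console.log(line); // 'got: data'
  });
```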

The not-so-great alternatives

Before the Promise interface assumed its throne in the Kingdom of Asynchronous Control, there were a few alternatives.

One being the DOMRequest interface. It feels “webby” because it inherits from the infamous EventTarget interface. If you have ever added an event listener to an HTML element, you have already worked with EventTarget. A lot of JavaScript developers (or jQuery developers) don’t work with the EventTarget interface directly because they use jQuery, which absorbs the verboseness of the interface (and the differences between browser implementations). DOMRequest, being an asynchronous control interface that simply dispatches success and error events, is inherently verbose, and thus unpopular. For example, you may find yourself fighting with the DOMRequest interface if you want to do things with IndexedDB.
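The verbosity can be tamed with a small adapter, though. A sketch (waitForRequest is a hypothetical name; it works with anything exposing the DOMRequest shape of onsuccess/onerror handlers plus result and error properties, such as the IDBRequest objects IndexedDB returns):

```javascript
// Adapter: convert a DOMRequest-style object (success/error events,
// result/error properties) into a Promise.
function waitForRequest(request) {
  return new Promise(function(resolve, reject) {
    request.onsuccess = function() { resolve(request.result); };
    request.onerror = function() { reject(request.error); };
  });
}

// Against IndexedDB, usage would look like:
// waitForRequest(indexedDB.open('mydb')).then(function(db) { /* ... */ });
```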

Another terrible issue with DOMRequest is that its usage is entirely reserved for native code, i.e. you cannot new DOMRequest() and return the instance from a method of your JavaScript library. (Likewise, your JavaScript function cannot inherit EventTarget either, which is the reason people turned to EventEmitter, or hopelessly dispatch custom events on the window object. That also means that to test code against the APIs inheriting EventTarget and/or returning DOMRequests, you must mock them too.)

Unfortunately, given that the B2G project (Firefox OS) was launched back in 2011, many of its Web API methods return DOMRequest, and new methods of these APIs will continue to return DOMRequest for consistency.

The other alternative would be rolling your own implementation of generic asynchronous code. In the Gaia codebase (the front-end system UIs and preloaded web apps for B2G), there are tons of examples, because just like many other places in Mozilla, we are infected with Not-Invented-Here syndrome. The practice shoots us in the foot because what was thought to be easily done is actually hard to do right. For example, suppose you have the following function:

function loadSomething(id, callback) {
  if (isThere(id)) {
    getSomething(id, callback);
    return;
  }

  var xhr = new XMLHttpRequest();
  xhr.onloadend = function() {
    registerSomething(id, xhr.response);
    callback(xhr.response);
  };
  xhr.open('GET', '/something/' + id);
  xhr.send();
}

To the naïve eye there is nothing wrong with it, but if you look closely enough you will realize this function does not invoke the callback asynchronously every time. If I want to use it:

loadSomething(id, function(data) {
  console.log(1, data);
});
console.log(2);
The timing of 1 is non-deterministic; it might log before 2, or after. This creates Schrödinger bugs and races that will be hard to reproduce, and fix.

You might think a simple solution to the problem above would be to simply wrap the third line in setTimeout(). This does solve the problem, but it comes with issues of its own, not to mention it further contributes to the complexity of the code. Wrapping the entire function in a Promise instance, instead, guarantees the callbacks run asynchronously even if you have the data cached.
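A sketch of that fix, with stub isThere()/getSomething() helpers standing in for the cache logic of the stripped-down example above (the XMLHttpRequest branch is elided):

```javascript
// Stub cache helpers, standing in for the ones in the example above.
var cache = { a: 'data' };
function isThere(id) { return id in cache; }
function getSomething(id, callback) { callback(cache[id]); }

// Promise-wrapped version: even when the data is cached and resolve() is
// reached synchronously, the .then() callbacks are still queued to run
// asynchronously, so callers always observe the same ordering.
function loadSomething(id) {
  return new Promise(function(resolve, reject) {
    if (isThere(id)) {
      getSomething(id, resolve);
      return;
    }
    // ...otherwise fetch with XMLHttpRequest and call resolve() in
    // onloadend, as in the callback version above.
  });
}

loadSomething('a').then(function(data) {
  console.log(1, data);
});
console.log(2); // 2 now always logs before 1, cached or not
```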

(Keep in mind that the example above has lots of detail stripped; good luck finding the same pattern when someone else hides it in a 500-line function among 10 callbacks.)

Not-Invented-Here syndrome also contributes to other issues, like in every other software project: more code means more bugs, and longer overhead for other engineers to pick it up.


In the B2G project, we want to figure out what’s needed for the Web to be considered a trustworthy application platform. The focus has been enabling hardware access for web applications (though sadly many of the APIs were then restricted to packaged apps because of their proprietary nature and security model), yet I think we should put more focus on advancing common JavaScript interfaces like Promise. I can’t say for sure that every innovation nowadays is a valid solution to the problems. However, as the saying goes, the first step toward fixing a problem is to admit there is one. Without advances in this area, the browser as an application runtime will be left as-is, filled with legacies from its document-reader era, forcing developers to load common libraries to shim it. It would be “a Web with kilobytes of jquery.js overhead”, as one smart man once told me.

(That’s one reason I kept mentioning EventTarget vs. EventEmitter in this post: contrary to Promise vs. DOMRequest, the EventEmitter use case has not yet been fulfilled by the platform implementations.)

The answer to the question at the beginning is: 1, 3, 6, 8, 2, 5, 4, 7. Since all the callbacks are asynchronous except (1), only (1) happens before (3), (6), and (8). Promise callbacks (2) and (5) run asynchronously, but they are scheduled before the setTimeout() callbacks (4) and (7).

English Vinglish: People’s journey across the language barrier

I don’t remember ever going to the cinema for a Bollywood movie before, but I am glad I enjoyed it very much when I did so for the first time.

The movie reminds me of the ESL classes I took. The frustration of not being able to express thoughts in English efficiently echoes with a wide range of non-English-speaking audiences, as evidenced by the success of the movie in traditionally non-Bollywood markets, including Taiwan.

Compared to India, the English-speaking culture is different in Taiwan. English is not the working language of the masses, except for some white-collar jobs in foreign companies or tech sectors (e.g., Mozilla in Taiwan). There is indeed a trend (or, “debate”) on wider adoption of English in colleges. And of course, concerns about English dominance and cultural invasion, and so on and so on.

That said, the English-learning students depicted in the movie are very true to life. If you only speak English and have had (or are having) the experience of working with people from other cultures, I wholeheartedly recommend you see the movie. Pay attention to the thoughts and minds of the characters. In retrospect, think about the inherent behavior of these people as they went through their life-long journey of working with you in English.

This is the only reason I wrote this post, in English.