CSSS! and computational complexity of selectors

Let's say we have the following markup:

<ul>
   <li>First</li>
   <li>Second (with <a href=#>hyperlink</a>)</li>
   <li>Third</li>
   <li>Fourth (with <a href=#>hyperlink</a> too)</li>
</ul>

and the styling task: all <li> elements that have <a> elements inside should get a yellow background.

If we had a hypothetical selector :with-child(selector), then we could write this as:

li:with-child(a:link) { background:yellow; }

The problem with this selector lies in its complexity: to compute such a selector you need to scan the whole subtree of the element (the <li> here). That is a task of complexity O(n), where n is the number of descendants of the element. Potentially that could be the whole DOM ( html:with-child(a:link) ).
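
Just to make that cost visible, here is a rough JavaScript sketch (not part of any engine; the function name hasLinkInside is invented here, and a[href] stands in for a:link) of what evaluating such a hypothetical selector for a single element would involve:

// Deciding whether one <li> matches li:with-child(a[href]) means
// walking its entire subtree looking for a matching <a>.
function hasLinkInside(el) {
  for (var child = el.firstElementChild; child; child = child.nextElementSibling) {
    if (child.tagName === "A" && child.hasAttribute("href"))
      return true;
    if (hasLinkInside(child))   // recurse into descendants: O(n) overall
      return true;
  }
  return false;
}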

Again, that O(n) is not so bad if it happens only once, but consider the case:

html:with-child(a:hover) p { ... }

Here all <p> elements of the document need to get new styles each time some <a> in the document gains or loses the :hover state. And that is really bad. Standard methods of optimizing such cases (e.g. caching) may not be acceptable, as they may require a significant amount of RAM and/or, again, CPU resources.
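
A deliberately naive JavaScript model of that rule (the event handling and the concrete style below are placeholders of our own) shows the amount of work involved:

// Naive model of  html:with-child(a:hover) p { ... } :
// every change of :hover state forces a document-wide scan
// plus a restyle of every <p> in the document.
function restyleParagraphs() {
  var hovered = document.querySelector("a:hover") !== null;   // O(n) scan
  document.querySelectorAll("p").forEach(function (p) {       // O(n) restyle
    p.style.background = hovered ? "yellow" : "";             // placeholder style
  });
}
document.addEventListener("mouseover", restyleParagraphs);
document.addEventListener("mouseout", restyleParagraphs);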

… back to the original task of styling "<li> elements that have <a> elements inside". It is still possible to solve it while keeping resource consumption at the same level as now.

Imagine that we had some magic engine that would assign, say, an attribute has-a-inside to such <li> elements. Then our task of styling the <li>s turns into a simple rule that is known to be efficient:

li[has-a-inside] { background:yellow; }

In real-world scenarios people these days use JavaScript for that. It requires some programming but works reasonably well.
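
For completeness, a minimal JavaScript version of that approach could look like this (has-a-inside is just the attribute name used throughout this article):

// Mark every <li> that contains a hyperlink so that the simple
// attribute selector  li[has-a-inside]  can pick it up.
document.querySelectorAll("li").forEach(function (li) {
  if (li.querySelector("a[href]"))
    li.setAttribute("has-a-inside", "true");
  else
    li.removeAttribute("has-a-inside");
});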

But we would like to do this solely in CSS, as JS is not always available. Here is how to accomplish it with CSSS! (so-called active CSS), introduced in recent versions of the h-smile core (HTMLayout and Sciter).

In CSSS! we need to create one helper rule that will set the attribute has-a-inside on each <li> that has an <a> inside:

li  
{  
  assigned!: self.has-a-inside = self.$1( a[href] )? "true" # null;
}

This rule simply means:

When an li element gets this style declaration for the first time (that is, when it is assigned!), this statement does the following:

  1. assign "true" to the attribute has-a-inside if that li element (self) has at least one element inside matching the selector a[href], or
  2. assign null to that attribute – that is, remove it from the element.

The only rule left is the one that I’ve already mentioned:

li[has-a-inside] 
{ 
  background:yellow;
}

Thus, after loading such a document, all <li>s with hyperlinks will have a yellow background. Done.

Of course it is not as abstractly perfect as :with-child(a[href]). For example, our solution will not react to dynamic DOM updates – but reacting to every such update was exactly the source of the problem we were trying to solve in the first place.

Nevertheless it allows the task to be solved in most practical cases, and at least it does not make the selector matching implementation any slower than it is now.

CSS, selectors and computational complexity

CSS (Cascading Style Sheets) has a pretty extensive set of so-called selectors. Example:

ul[type="1"] > li
{
  list-style-type: decimal;
}

The selector here means the following: each li element that is an immediate child of a ul that has the attribute type with the value "1" gets that style.

Selectors and associated styles constitute a static system of styles, and so they obey the following rule (the law of CSS selectors):

At any given moment in time, all elements in the DOM that satisfy the set of selectors have the corresponding styles applied.

This static nature of CSS selectors establishes some obvious (and not so obvious) limitations on what can be done with selectors and what kinds of selectors CSS can support. For example:

  • Selectors cannot use information defined by the styles themselves. Imagine a hypothetical selector and style:
    div:visible { visibility:hidden; }
    

    Such a pseudo-class defeats itself and so breaks the rule defined above: there is simply no moment in time at which the rule above holds.

  • Selectors can use only those features/characteristics of the DOM that can be evaluated in constant (or near-constant) time. Otherwise an observer would be able to catch a moment in time when the rule above is not true.
    • A consequence of this: selectors that require scanning a number of DOM elements unknown upfront (that have complexity O(n), in other words) are "personae non gratae" in CSS.

There is nothing wrong in principle with O(n) complexity. After all, resolving all styles of a DOM with n elements is itself an O(n) task. The problem arises when you use selectors that each have O(n) complexity: the total complexity then becomes O(n*n), and that is the cause of the trouble. With frequent dynamic updates, each update may require re-computation of the styles of all elements, so a web page can easily become non-responsive – the CPU will be busy resolving styles without doing anything useful for the user.

Imagine that we have a selector like:

ul:not-exist(a:hover) {color:red;}

where :not-exist(…) is true when the DOM does not contain any element matching the selector in brackets.

Computing such a pseudo-class is a task of complexity O(n), where n is the number of elements in the DOM.
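
A naive JavaScript model of that check (the function name is invented here) makes the cost explicit:

// Evaluating :not-exist(selector) for one candidate element:
// in the worst case the engine visits every element of the DOM
// before it can conclude that no match exists.
function matchesNotExist(innerSelector) {
  return document.querySelector(innerSelector) === null;   // O(n)
}
// Asking this question while styling each of the n elements of the DOM
// is exactly the O(n*n) situation described above.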

None of the selectors currently supported by the CSS Selectors module (http://www.w3.org/TR/css3-selectors/) has O(n) complexity in the size of the whole DOM. But there are selectors, like the :only-of-type pseudo-class, that are "not good": in some conditions their computation is as costly as O(n), because they have to scan an unknown number of sibling elements.
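
For illustration, a straightforward JavaScript check for :only-of-type (function name invented here) has to look at every sibling of the element:

// :only-of-type holds when no other sibling has the same tag name;
// verifying that means scanning all siblings, and in a flat document
// their number can be close to the size of the whole DOM.
function isOnlyOfType(el) {
  for (var s = el.parentNode.firstElementChild; s; s = s.nextElementSibling)
    if (s !== el && s.tagName === el.tagName)
      return false;
  return true;
}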

The next article will explain what can be done in CSS to overcome the complexity problem and still accomplish tasks like ul:not-exist(a:hover) {color:red;}

Some numbers, if you wish:

Let’s assume that we have

  1. a DOM with 1000 elements, and
  2. a single selector of complexity O(n) where each step of its computation requires 1 microsecond.

The total time required for the DOM to get its styles will be:

1000 * 1000 * 1 μs = 1,000,000 μs = 1 s (one second)

Let's double the number of elements in the DOM, and we get:

2000 * 2000 * 1 μs = 4,000,000 μs = 4 s.

So by simply doubling the number of elements we have increased the time needed by a factor of four.

In real-life scenarios (on complex sites) this will easily make such sites too heavy to deal with.