Usage statistics - Evolution over the period

Period

Start date: 1 April 2009

End date: 30 April 2009

Duration: 30 days

Evolution over the period, per number of sub-tests failed

Sub-tests failed | 30 Mar-5 Apr 2009 | 6-12 Apr 2009 | 13-19 Apr 2009 | 20-26 Apr 2009 | 27 Apr-3 May 2009
mobileOK pages | 15% | 17% | 15% | 10% | 8%
1 sub-test failed | 5% | 5% | 6% | 4% | 6%
2 sub-tests failed | 5% | 4% | 5% | 4% | 4%
3 sub-tests failed | 4% | 4% | 5% | 5% | 4%
4 sub-tests failed | 4% | 5% | 5% | 4% | 5%
5 sub-tests failed | 8% | 8% | 7% | 7% | 8%
6 sub-tests failed | 7% | 7% | 8% | 6% | 7%
7 sub-tests failed | 7% | 6% | 7% | 7% | 7%
8 sub-tests failed | 6% | 6% | 6% | 5% | 5%
9 sub-tests failed | 5% | 6% | 6% | 6% | 5%
10 sub-tests failed | 5% | 5% | 4% | 4% | 5%
11 sub-tests failed | 6% | 4% | 5% | 5% | 5%
12 sub-tests failed | 6% | 6% | 5% | 6% | 6%
13 sub-tests failed | 6% | 5% | 5% | 5% | 5%
14 sub-tests failed | 3% | 4% | 3% | 5% | 3%
15 sub-tests failed | 3% | 3% | 2% | 3% | 4%
16 sub-tests failed | 1% | 2% | 2% | 3% | 3%
17 sub-tests failed | 1% | 2% | 2% | 3% | 2%
18 sub-tests failed | 1% | 1% | 1% | 2% | 5%
19 sub-tests failed | 1% | 1% | <1% | 2% | 1%
20 sub-tests failed | 1% | 1% | 1% | 1% | 1%
21 sub-tests failed | 1% | <1% | 1% | <1% | 1%
22 sub-tests failed | <1% | <1% | <1% | 1% | <1%
23 sub-tests failed | <1% | <1% | <1% | 1% | <1%
24 sub-tests failed | <1% | 0% | <1% | <1% | <1%
25 sub-tests failed | <1% | <1% | 0% | <1% | 0%
26 sub-tests failed | 0% | 0% | 0% | <1% | 0%
28 sub-tests failed | 0% | 0% | <1% | 0% | 0%

A few notes

mobileOK, tests, and sub-tests

Tests performed by the mobileOK checker are defined in the mobileOK Basic Tests 1.0 document. The term sub-test, as used in these statistics, refers to an aspect of a given test that may FAIL.
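
For illustration only, here is a minimal sketch of how pages could be grouped into the buckets used in the table above (mobileOK pages, 1 sub-test failed, 2 sub-tests failed, and so on). The URIs and results are invented, and the identifiers shown are test names from the mobileOK Basic Tests 1.0 document used as stand-ins for individual sub-tests.

```python
from collections import Counter

# Invented per-URI results: each value is the set of sub-tests that FAILed.
# Identifiers are test names from mobileOK Basic Tests 1.0, used here only
# as stand-ins for individual sub-tests.
results = {
    "http://example.org/":     set(),                                 # mobileOK page
    "http://example.org/news": {"HTTP_RESPONSE"},                     # 1 sub-test failed
    "http://example.net/":     {"HTTP_RESPONSE", "META_HTTP_EQUIV"},  # 2 sub-tests failed
}

# Count pages per number of failed sub-tests, as in the table above.
distribution = Counter(len(failed) for failed in results.values())
total = len(results)
for failures in sorted(distribution):
    label = "mobileOK pages" if failures == 0 else f"{failures} sub-test(s) failed"
    print(f"{label}: {100 * distribution[failures] / total:.0f}%")
```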

How to read the percentages

The statistics only take into account the URIs for which the checker could actually run the tests; URIs for which the tests could not be run at all (for instance, when the page cannot be retrieved) are excluded.

A given URI may yield different results over time. When a URI is checked more than once during the period, only the most recent result appears in the statistics. This ensures that the statistics are not biased by pages that are checked repeatedly, for instance pages that include a mobileOK logo along with a link to the checker.
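
As a sketch of this rule, assuming a hypothetical log of raw check results (URI, date of the check, number of failed sub-tests), keeping only the most recent entry per URI could look like this:

```python
from datetime import datetime

# Hypothetical raw check results: (URI, date of the check, failed sub-tests).
# The log format is an assumption made for this sketch, not the checker's own.
raw_results = [
    ("http://example.org/", datetime(2009, 4, 2), 5),
    ("http://example.org/", datetime(2009, 4, 20), 3),  # same URI, checked again
    ("http://example.net/", datetime(2009, 4, 10), 0),
]

# Keep only the most recent result for each URI, as the statistics do.
latest = {}
for uri, checked_at, failed in raw_results:
    if uri not in latest or checked_at > latest[uri][0]:
        latest[uri] = (checked_at, failed)

# http://example.org/ keeps its 20 April result (3 failed sub-tests).
for uri, (checked_at, failed) in latest.items():
    print(uri, checked_at.date(), failed)
```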

URIs or domains?

Viewing statistics per URI is not the most representative view one could think of. A better view would be "per website". Unfortunately, there is no automatic way to link the pages of a website together. The statistics include a view per domain name, but a single domain name is often shared by many different websites, so that view is probably even more misleading than the view per URI. The best view lies somewhere between the per-URI and per-domain views, closer to the view per URI.
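
As an illustration of the "per domain name" grouping, here is a minimal sketch using invented URIs, where the host name serves as an approximation of the domain:

```python
from urllib.parse import urlsplit

# Invented URIs; the last one hints at how many unrelated sites can live
# under a single host or domain name (e.g. shared hosting).
uris = [
    "http://example.org/index.html",
    "http://example.org/contact",
    "http://pages.example.net/~alice/",
]

# Group checked URIs by host name.
by_host = {}
for uri in uris:
    host = urlsplit(uri).hostname
    by_host.setdefault(host, []).append(uri)

for host, grouped in sorted(by_host.items()):
    print(host, len(grouped), "URI(s)")
```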

Does it represent the state of the Web?

No! One must keep in mind that the checked URIs come from users who chose to use the mobileOK checker. This does introduce a bias in the sample towards mobile-friendly content. Figures in terms of the number of Mobile Web Best Practices not followed would likely be far worse if the statistics were computed from a representative set of URIs. However, these statistics do provide a useful view of the most common problems content authors encounter when moving to mobile.

On missing Mobile Web Best Practices

Also keep in mind that the tests only cover a restricted set of the Mobile Web Best Practices, and sometimes only a portion of a given Best Practice. In particular, a few Mobile Web Best Practices that cannot be checked automatically, such as CENTRAL_MEANING and SCROLLING, are not covered by the tests and therefore cannot appear in these statistics.

Conversely, some generic tests that do not match any of the Mobile Web Best Practices are performed; these are identified by their internal name within the mobileOK Basic Tests 1.0 document, e.g. HTTP_RESPONSE and META_HTTP_EQUIV.