Archive for the ‘Trends’ Category
Some of the largest sites on the internet — sites you probably interact with on a daily basis — are written in PHP. If PHP sucks so profoundly, why is it powering so much of the internet?
The only conclusion I can draw is that building a compelling application is far more important than choice of language. While PHP wouldn’t be my choice, and if pressed, I might argue that it should never be the choice for any rational human being sitting in front of a computer, I can’t argue with the results.
“In Rainbows,” a new album from the English band Radiohead, has challenged the traditional music sales model by allowing listeners to determine the price they are willing to pay for the album, which is available for download on the band’s website.
Figures from comScore, an internet marketing research company, seem much more realistic than the numbers floated earlier: 38% of downloaders paying an average of about $6, versus the earlier claim of an average price between $5 and $8 across 1.2 million downloads.
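A quick back-of-the-envelope check of what those figures imply (illustrative only; the 1.2 million download count is the earlier floated number, not a confirmed total, so the result is at best a rough upper bound):

```python
# Rough revenue estimate from the comScore figures quoted above.
downloads = 1_200_000       # earlier floated download count (unconfirmed)
paying_fraction = 0.38      # comScore: 38% chose to pay something
avg_price = 6.00            # comScore: average of about $6 among payers

revenue = downloads * paying_fraction * avg_price
print(f"${revenue:,.0f}")   # prints $2,736,000
```

Even at these more modest numbers, nearly all of that goes to the band rather than through a label, which is what makes the direct-download experiment interesting.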
I’m one of those 62%. Radiohead’s music is decent but not my taste. But had the album not been available for download, I wouldn’t even have heard this record. Moreover, they probably made much more money from these downloads than they could ever make from a regular album release.
The latest version of the BitTorrent client (v6.0), which is based on the closed-source uTorrent (acquired a while back), has been released without source code, and neither have the protocol specs been opened. The “official” BitTorrent client has never been very popular compared to other implementations of the protocol like Azureus and uTorrent.
The protocol specs, although technically closed, are available under a very tightly maintained SDK license. And all previous versions of the protocol, as well as the client, are available openly.
In the lifecycle of any technology, “the step after ubiquity is invisibility.” I always hoped BitTorrent would follow that curve. Probably not anymore.
However, this piece on the Yahoo! Search Blog is welcome news:
“webmasters can now mark parts of a page with a ‘robots-nocontent’ tag which will indicate to our crawler what parts of a page are unrelated to the main content and are only useful for visitors.”
If the trend catches on and becomes a standard (it would have to get Google’s support), it would be a great help.
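As I understand Yahoo’s announcement, the marker is applied as a class attribute value on existing elements rather than as a new HTML tag. A minimal sketch of how a page might use it (the surrounding markup here is just illustrative):

```html
<!-- Main article content: crawled and indexed normally -->
<div>
  <p>The actual article text, which should count toward indexing…</p>
</div>

<!-- Navigation and sidebar boilerplate: useful to visitors,
     but marked as non-content for Yahoo's crawler -->
<div class="robots-nocontent">
  <ul>
    <li><a href="/archives">Archives</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</div>
```

The appeal is that it works with markup sites already have; no separate file or meta tag is needed, just a class on the boilerplate wrappers.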
The Internet browser is the new OS. What if a “thin client” application becomes thicker than the thick clients of the lot? The problem with web applications “[…] is that they have tried too hard to make the web into a complete application platform, to the point where they don’t even bother holding themselves to the same standards by which desktop application developers are judged.”
Interesting blog: Uncov