Htscanner

If you’re a PHP developer who uses Windows as a development platform, you might want to try htscanner. It’s a PHP extension that parses configuration files (for example .htaccess) and adjusts PHP settings accordingly. You can find its Windows binaries here. It’s a very useful extension, especially when you need to simulate PHP …

Leave Comments Everywhere (Not You, Spammer!)

So, you want to let everyone comment on any of your web pages? Easy, let the kitty do the work for you. JS-Kit.com gives an elegant yet simple solution for commenting in the social network era. All you need to do is put this simple script on your web …

IE6 and IE7 Running on a Single Machine

With the release of IE7, many web developers (like me) were faced with the need to test their applications on different versions of IE, but that was impossible since only one version of IE can be installed on Windows. But today, as announced by …

Thanks For Your Votes

PHPClasses.org has just announced the winner of the Innovation Award for the month of October 2006. As you can see on my badge here, Ngeblog was ranked in second position with 17.24% of the total votes. …

What a Geek Like Me Wants from a Mobile Phone

I was preparing my new Zendbox-like server to host all of my toys when I found some interesting feed items on my reading list, all talking about the same thing: mobile phones. That brought me to Ian Hay’s top ten list of what people want from their mobile phones.

I’m not really good at making lists, so if someone asks me what I want from my mobile phone, I’ll give one single answer: control.

For that matter, I want my mobile phone to support open source software and have an open hardware architecture. That’s all I need; I’ll take care of the rest myself, thanks.

Against the System: Rise of the Robots

…big difference between the web and traditional well controlled collections is that there is virtually no control over what people can put on the web. Couple this flexibility to publish anything with the enormous influence of search engines to route traffic and companies which deliberately manipulate search engines for profit become a serious problem.

That is a quote from Sergey Brin and Lawrence Page’s paper about the prototype of the Google search engine, which at the time lived at http://google.stanford.edu/.

But I don’t think either Brin or Page expected that their invention could bring another problem, one that emphasizes what they meant by “no control over what people can put on the web”.

Yesterday’s post on the Securiteam blog shows that people can use Googlebot to attack other websites anonymously.

The idea is quite simple: all you have to do is create a malicious website containing links that attack a vulnerable web application (CSRF), like this:

http://the-target.com/csrf-vulnerable?url=http://maliciousweb.com/attackcode

and submit it to Google. When Googlebot comes to your website and finds this link, it will dutifully try to index the URL. And when it does … bang! The robot does the job for you, attacking your target.
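The mechanics are easy to see once you remember what an indexing bot actually does: it extracts every link from a page and issues a plain GET request to each one. Here is a minimal sketch of that behavior (the page and URL below are just the hypothetical example from this post, not real endpoints):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets, roughly the way an indexing bot would."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A page on the attacker's site, embedding the CSRF-style link
# from the example above.
malicious_page = (
    '<html><body>'
    '<a href="http://the-target.com/csrf-vulnerable?'
    'url=http://maliciousweb.com/attackcode">nice page</a>'
    '</body></html>'
)

parser = LinkExtractor()
parser.feed(malicious_page)

# The bot would now GET each extracted link in order to crawl it --
# and that GET is exactly the request that fires the vulnerable endpoint.
for link in parser.links:
    print(link)
```

The attacker never sends a packet to the target; the crawler’s routine link-following does it on their behalf.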

This is not a new idea, though. Michal Zalewski wrote about it back in 2001 under the title “Against the System: Rise of the Robots”. His introduction captures the whole idea:

Consider a remote exploit that is able to compromise a remote system without sending any attack code to its victim. Consider an exploit which simply creates a local file to compromise thousands of computers, and which does not involve any local resources in the attack. Welcome to the world of zero-effort exploit techniques. Welcome to the world of automation, welcome to the world of anonymous, dramatically difficult to stop attacks resulting from increasing Internet complexity.

However, this kind of attack is not only Googlebot’s problem; other search engine bots, like MSN’s, Yahoo’s, and dozens of others, have the same ability to do the dirty job for you.

So who’s to blame? Surely, the bad guy who runs the original website. But you can also put some blame on the owners of the victim websites, who ignore security and leave all their pages open to any bot in pursuit of higher PageRank.
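The standard defense on the victim’s side is to never let a bare GET change state: require a POST carrying a session-bound token. A minimal sketch of that check (the secret, session IDs, and function names are hypothetical, just to illustrate the idea):

```python
import hashlib
import hmac

# Hypothetical server-side secret; in practice this would be
# per-deployment configuration, never hard-coded.
SECRET = b"server-side-secret"

def csrf_token(session_id: str) -> str:
    """Derive a token tied to the visitor's session."""
    return hmac.new(SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def allow_state_change(method: str, session_id: str, token: str) -> bool:
    """Honor a state-changing request only if it is a POST carrying the
    correct session-bound token. A bot following a plain GET link (the
    Googlebot scenario above) fails both conditions."""
    if method != "POST":
        return False
    return hmac.compare_digest(token, csrf_token(session_id))

# A crawler follows links with GET and carries no token:
print(allow_state_change("GET", "sess42", ""))                      # False
# A legitimate form submission with the right token:
print(allow_state_change("POST", "sess42", csrf_token("sess42")))   # True
```

With a check like this in place, a link planted on a malicious page can make a bot knock on the door, but not open it.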