Sunday, June 14, 2009

What happened in the last quarter of 2007

This post is totally unrelated to "Application Security". I was querying Google Trends for some websites and found a very strange peak in the last quarter of 2007 for all major websites. Does anyone know what happened?

Try any other major website and you'll see this peak. Is it a bug in Trends, or was there an important event that I missed?

Tuesday, June 2, 2009

Torpig: The most advanced piece of crimeware ever created, Part 1

A group of researchers from the University of California's Department of Computer Science (Security Group) published a comprehensive study [a must-read] of what is considered "the most advanced piece of crimeware ever created".

Torpig deserves a lot of attention from the community, so I decided to write two posts about it.
Part 1 is just an overview of how Torpig works. Part 2, the more important one, will discuss Torpig's weaknesses and raise the question: if this malware had been designed without these weaknesses (I'll suggest how), how could the good guys defeat it?

Torpig is a sophisticated malware program designed to harvest sensitive information (such as bank account and credit card data) from the machines it infects.
What makes this malware unique is its ability to spread, collect sensitive information from infected machines, and communicate with the botmaster to deliver the collected data.

First, let's start by understanding how this malware works -according to the paper-.

* The malware spreads using different techniques (e.g. drive-by downloads).
The key fact here is that there are at least 182,800 infected machines, and the number is increasing.

* The downloaded executable acts as an installer for Mebroot. The installer injects a DLL into the file manager process (explorer.exe), and execution continues in the file manager’s context.
This makes all subsequent actions appear as if they were performed by a legitimate system process. The installer then loads a kernel driver that wraps the original disk driver (disk.sys). At this point, the installer has raw disk access on the infected machine. The installer can then overwrite the MBR of the machine with Mebroot. After a few minutes, the machine automatically reboots, and Mebroot is loaded from the MBR.

* The malware gathers sensitive information using many techniques, e.g. capturing all web traffic from the infected machine, running phishing attacks against banks and major financial sites (PayPal), and harvesting data from email clients, instant messengers, etc.

* The malware uses "domain flux": it periodically generates a large list of domain names that infected machines are to report to.
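
To make the idea concrete, here's a tiny C# sketch of how a domain-flux scheme works in principle: bot and botmaster derive the same list of rendezvous domains from a shared, time-based seed. This is not Torpig's actual generation algorithm (the paper describes the real one); the seed, label lengths, and TLDs below are invented purely for illustration.

using System;
using System.Collections.Generic;
using System.Text;

// Purely illustrative sketch of "domain flux": both the bot and the botmaster
// derive the same pseudo-random list of rendezvous domains from the current date,
// so there is no single hard-coded C&C hostname to take down.
// NOTE: this is NOT Torpig's real algorithm; the seed, constants and TLDs
// are made up for illustration only.
class DomainFluxSketch
{
    static IEnumerable<string> GenerateDomains(DateTime date, int count)
    {
        // Deterministic seed shared by bot and botmaster: year, month and week of month.
        int seed = date.Year * 10000 + date.Month * 100 + (date.Day / 7);
        Random rng = new Random(seed);
        string[] tlds = { ".com", ".net", ".biz" };   // hypothetical TLD rotation

        for (int i = 0; i < count; i++)
        {
            StringBuilder sb = new StringBuilder();
            int length = 8 + rng.Next(5);             // 8-12 character labels
            for (int c = 0; c < length; c++)
                sb.Append((char)('a' + rng.Next(26)));
            yield return sb.ToString() + tlds[rng.Next(tlds.Length)];
        }
    }

    static void Main()
    {
        // The bot tries each generated domain in order until one resolves and answers
        // with a valid C&C response; defenders who know the algorithm can pre-register
        // or sinkhole the same domains, which is exactly what the researchers did.
        foreach (string domain in GenerateDomains(DateTime.UtcNow, 5))
            Console.WriteLine(domain);
    }
}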

Well, this is how Torpig generally works; please refer to the paper for more details, and follow the next post, which will discuss the weaknesses of Torpig and how it could have been mitigated.

Thursday, May 28, 2009

Web Application Security Trends Q3-Q4 2008

The "Web Application Security Trends Q3- Q4 2008" is published, I guess there are a lot of interesting findings in this report, I'm sharing here with you what I see the most important stuff

1- SQL injection got its first position back from XSS

2- As expected, more and more hackers are joining the club

3- IE and FF are gaining almost the same attention

4- CSRF is gaining more attention every day

5- More importantly, CSRF was usually exploited by whitehats for demonstrations; Q3-Q4 2008 is the first time blackhats have used it, so I guess more attention should be paid to it now.

If you have any comments about this report, please share them with me in the comments area.

Tuesday, May 19, 2009

Yahoo Groups Voting Vulnerability

I found a very interesting security bug in the Yahoo Groups voting system; exploiting this bug leads to complete control of the voting results.

You can watch the video to see how I cast 4 votes -it could be more- using a single account.


Video: Yahoo Groups Voting Bug (http://video.msn.com/video.aspx?vid=cf581209-ad25-4920-821f-0a11bf849cf4)

Bug Demonstration
It's clear that the voting system differentiates between users by examining the email address associated with the group. Since you can add an unlimited number of email addresses to a single Yahoo ID, and the group settings allow you to switch between these emails, you can simply cast your vote, change the associated email, and then vote again as a new voter. You can keep repeating this until you fully manipulate the voting results to the outcome of your choice.
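
In other words, the root cause is that the duplicate-vote check is keyed on a mutable attribute (the currently associated email address) instead of the immutable account identity. Here is a minimal sketch of what a safer check could look like, assuming a hypothetical in-memory vote store keyed by the Yahoo ID; none of these classes or methods are Yahoo's real code, they only illustrate the fix.

using System;
using System.Collections.Generic;

// Hypothetical sketch: record votes against the immutable account ID (the Yahoo ID),
// not against whichever email address happens to be associated at the moment.
class PollVoteStore
{
    // pollId -> set of account IDs that have already voted
    private readonly Dictionary<string, HashSet<string>> votesByPoll =
        new Dictionary<string, HashSet<string>>();

    public bool TryCastVote(string pollId, string accountId, string choice)
    {
        HashSet<string> voters;
        if (!votesByPoll.TryGetValue(pollId, out voters))
        {
            voters = new HashSet<string>();
            votesByPoll[pollId] = voters;
        }

        // The duplicate check uses the account ID; adding or switching the
        // associated email address no longer creates a "new" voter.
        if (!voters.Add(accountId))
            return false;   // this account has already voted in this poll

        Console.WriteLine("Recorded vote '{0}' for poll {1}", choice, pollId);
        return true;
    }

    static void Main()
    {
        PollVoteStore store = new PollVoteStore();
        Console.WriteLine(store.TryCastVote("poll-1", "yahoo-id-42", "Option A")); // True
        Console.WriteLine(store.TryCastVote("poll-1", "yahoo-id-42", "Option B")); // False: same account
    }
}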

Simple steps to reproduce this security bug
1- go to the poll of your choice
2- select your candidate from the choices
3- click "Edit Membership"
4- under "Email Address" click "Add new email address" and verify it
5- keep repeating step 4 until you have added a sufficient number of email addresses
6- now choose any of them as your default associated email
7- go to the poll again and, congratulations, you can cast your vote as if it were your first time
8- keep changing the associated email address until your candidate wins :)

Conclusion
It's very clear that this bug is a security logic vulnerability, and hence no static code analysis tool would be able to find it (never depend on static code analysis tools alone).
Although it's very easy to exploit (I didn't write scripts, run automated scans, or use any complex method), the impact of the vulnerability is very high (maybe all the votes cast before were manipulated).
There could be more vulnerabilities in Yahoo Groups that I didn't investigate, if more features depend on the associated email.

Keep checking this blog, as I've decided to publish more and more security vulnerabilities in major websites, since they never fix their issues unless you fully disclose their bugs :)


Sunday, April 26, 2009

CSRF session at Microsoft innovation day (22nd April, 2009)

I was invited by the CuttingEdge Club to give a presentation about "Application Security" at Microsoft Innovation Day. I initially thought most of the attendees would be professional developers, so I decided to exclude common application security topics like XSS, SQL injection, and input validation.

I decided to make it about "Cross Site Request Forgery"; specifically, the session title was "How do I: Protect from Cross Site Request Forgery in ASP.NET". I think it was quite interesting for the audience -especially since most of the other sessions were about SharePoint-.

I found out later that most of the attendees were students, so I tried to use only simple terms -I don't think I fully managed to- as CSRF is quite complicated by nature and most developers confuse it with XSS.

Anyway, I think I managed to spread awareness of application security vulnerabilities and their huge impact -both financially and from a privacy perspective- on the internet today.
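
To give a flavor of the kind of defense the session was about, here is a minimal hand-rolled sketch of the synchronizer-token pattern in an ASP.NET WebForms page. This is not the exact demo from the session; the page and control names are assumptions, and WebForms also gives you a similar built-in protection by setting Page.ViewStateUserKey to a per-user value.

using System;
using System.Web.UI;
using System.Web.UI.WebControls;

// Minimal synchronizer-token sketch for a hypothetical WebForms page:
// a secret token is stored in the user's session, echoed into a hidden
// field in the form, and verified on every postback.
public class TransferPage : Page
{
    // Maps to an <asp:HiddenField ID="CsrfTokenField" runat="server" /> in the markup (assumed).
    protected HiddenField CsrfTokenField;

    private const string TokenKey = "AntiCsrfToken";

    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // Generate a per-session secret and embed it in the form.
            string token = Guid.NewGuid().ToString("N");
            Session[TokenKey] = token;
            CsrfTokenField.Value = token;
        }
        else
        {
            // A forged cross-site request cannot read the victim's token,
            // so it cannot send back a matching value.
            string expected = Session[TokenKey] as string;
            if (expected == null || CsrfTokenField.Value != expected)
                throw new InvalidOperationException("Possible CSRF: anti-forgery token mismatch.");
        }
    }
}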


Here is the presentation

Monday, September 29, 2008

Introduction to Web Crawling


Why is this post so important?

Actually, I'm not going to explain why you need to learn web crawling, or how much could be done (and how much money could be made) if you can develop a good crawler. Since you are here, you already need to crawl; and the fact that you ended up here doesn't mean I'm one of the best bloggers around, it means THERE IS NO HELP ON THIS TOPIC. That's why this blog exists: people need web crawling and can't do it because there is no help. I started searching for material -just like you- and found almost nothing, so I started figuring it out on my own, and here I am publishing what I know for everyone.

What are my resources?

As I said, there aren't many resources; however, I list here the few resources I found and used.

1-Book: "Http Recipes For C# Bots" by Jeff Heaton

This is maybe the only book that talks about web crawling from a developer's point of view; however, I believe it doesn't go deep enough to get you to real work. I read the book several times and was always able to access the author's website with the crawlers I wrote, but I wasn't able to access other websites because, as I said, the book doesn't go deep enough.


2-Web Article: Tools for access site .NET

This is maybe the only post I found that talks about real crawler development. It lists great tools for crawler development, besides describing one crawler that logs in to Yahoo Address -with source code included- although in a very superficial way: there is no description of how to use the listed tools, and no code snippets. I use most of the tools listed in that post and they are all very beneficial; on this blog I'll try to cover each tool and how to work with it in more detail, in order to get you on the road very fast.


3-Web Crawler, spider, ant, bot... how to make one?

Another interesting article that gives a complete crawler example in VB.NET. Again, I don't think the article can help you write your own crawler for your own purposes, but reading it is still beneficial, as there aren't a lot of resources, like I said before.


4-My own experience

I work as a Software Engineer at ITWorx besides being a freelancer. I have developed several crawlers for many websites; here is a list of the recent crawlers I made:

1- Yahoo Answers Crawler
2- People search crawler that gathers information from

  • Yahoo People
  • Lycos People Search
  • Peopledata.com
  • Superpages.com
3- Script injection finder, in which the crawler scans a list of websites for injected scripts -Cross Site Scripting-

and lots of other crawlers that I built -you can check my recent freelancing projects here-. I believe that sharing my experience in this field will be very beneficial to you -that's why I'm writing this blog-.

What are the tools needed to write your own crawler?

Here they are:

1. Mozilla's Firefox -the browser that your bot will simulate-
2. Microsoft's Fiddler -network analysis tool-
3. RAD Software's Regular Expression Designer -for extracting important data from web pages-
4. Piriform's CCleaner -for clearing out your cookies (not that important)-
5. Mozilla Firefox add-on: Web Developer -will help you analyze pages-
6. Mozilla Firefox add-on: Firebug -actually I didn't use it during my work-
Besides these tools (my suggestions):
7. Wireshark -another network analysis tool; I'll explain later why I need two network analysis programs-
8. Visual Studio 2005 or later -I'll primarily use C# and .NET 2.0 to build my crawlers, and I may later add a Java version of the crawlers I build-
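
To give you a taste of what's coming, here is a minimal C# (.NET 2.0 style) sketch of the kind of crawler skeleton the next posts will build on: fetch a page with HttpWebRequest while keeping cookies alive across requests, pose as Firefox via the User-Agent header, and extract data with a regular expression. The URL and the regex here are just placeholders, not taken from any real crawler of mine.

using System;
using System.IO;
using System.Net;
using System.Text.RegularExpressions;

// Minimal crawler skeleton. Real crawlers need per-site analysis with
// Fiddler/Wireshark to reproduce the exact requests a browser sends.
class MiniCrawler
{
    static void Main()
    {
        // One CookieContainer shared across requests keeps session cookies alive,
        // which is what makes logged-in crawling possible.
        CookieContainer cookies = new CookieContainer();

        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://www.example.com/");
        request.CookieContainer = cookies;
        request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1) Gecko/2008 Firefox/3.0"; // pose as Firefox
        request.AllowAutoRedirect = true;

        string html;
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        using (StreamReader reader = new StreamReader(response.GetResponseStream()))
        {
            html = reader.ReadToEnd();
        }

        // Extract every link href as a simple example of pulling data out of the page.
        foreach (Match m in Regex.Matches(html, "href=\"(?<url>[^\"]+)\"", RegexOptions.IgnoreCase))
        {
            Console.WriteLine(m.Groups["url"].Value);
        }
    }
}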

Finally

Please try to read all the resources and get all the tools listed above, in order to be ready for my later posts.

Comments are very welcome, on this post or any later one. I'll try to start writing my next posts as soon as I can.