Sunday, November 14, 2010

Ten iptables firewall rules to secure Linux box

Linux users have some inherent advantages over our fellow Windows users when it comes to security. Linux is both more secure and less common than Windows-based systems, with the consequence that attacks on Linux systems occur less frequently than on Windows systems. Having said that, it would be foolish to be complacent about securing any system, regardless of whether it runs Windows, Linux, or any other operating system.
How do we do this?
Use a firewall, the first line of defense. A firewall is a gateway that filters incoming and outgoing network packets according to predefined rules. In the IT world there are two types of firewalls: hardware and software. Cisco PIX/ASA and NetASQ appliances are examples of hardware firewalls; ISA Server and iptables are examples of software firewalls.

The iptables tool is a magnificent means of securing a Linux box. That said, even after you gain a solid understanding of the command structure and know what to lock down and how to lock it down, iptables can be confusing. The nice thing about iptables is that its protection is fairly universal, so having a few ready-made rules to assemble into a script can make the job much easier.
With that in mind, let’s take a look at 10 such commands. Some of these rules will be more server oriented, whereas some will be more desktop oriented. For the purpose of this article, I'm not going to explain all of the various arguments and flags for iptables. Instead, I’ll just give you the rule and explain what it does. For more information on the specifics of the rule, you can read the man page for iptables, which will outline the arguments and flags for you.

1: iptables -A INPUT -p tcp --syn -j DROP

This is a desktop-centric rule that does two things: it allows you to work normally on your desktop, since all outgoing traffic is allowed out, while all incoming TCP connection attempts (SYN packets) are simply dropped. This makes for a solid Linux desktop that does not need to accept any incoming traffic. What if you want to allow specific traffic in, for example ssh for remote management? To do this, you'll need to add an iptables rule for the service and make sure that service rule comes before the rule that drops all incoming traffic, because iptables evaluates rules in order.

2: iptables -A INPUT -p tcp --syn --destination-port 22 -j ACCEPT

Let's build on our first command. To allow traffic to reach port 22 (secure shell), you add this line. Understand that on its own it allows any incoming traffic to port 22, which is not the most secure setup. To tighten it, you'll want to limit which machines can actually connect to port 22. Fortunately, you can do this with iptables as well. If you know the IP address of the source machine, you can add the -s SOURCE_ADDRESS option (where SOURCE_ADDRESS is the address of the trusted source machine) before the --destination-port portion of the line.

3: /sbin/iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

This will allow all previously initiated and accepted exchanges to bypass rule checking. The ESTABLISHED and RELATED arguments belong to the --state switch. The ESTABLISHED argument says, "Any packet that belongs to an existing connection," and the RELATED argument says, "Any packet that does not belong to an already existing connection but is related to an existing connection." The "state machine" of iptables is a means for iptables to track connections with the help of the kernel level "conntrack" module. By tracking connections, iptables knows what connections can be allowed and what can’t. This reduces the amount of work the administrator has to do.
Here's how state works. When the local user initiates a connection, the first packet of that connection is marked NEW in the PREROUTING chain. When the local user gets a return packet, the connection's state is changed to ESTABLISHED in the PREROUTING chain. So once a connection's state is ESTABLISHED, its packets can be allowed through with the right iptables rule.
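To make the state machine concrete, here is a toy sketch of connection tracking in Python (the `Conntrack` class and packet tuples are invented for illustration; the real kernel conntrack module tracks far more, including protocol state, RELATED helpers, and timeouts):

```python
# Toy illustration of connection tracking: classify packets as NEW or
# ESTABLISHED the way iptables' state match (conceptually) does.
class Conntrack:
    def __init__(self):
        self.connections = set()   # known (src, dst, port) flows

    def classify(self, src, dst, port):
        flow = (src, dst, port)
        reply = (dst, src, port)   # return traffic for a tracked flow
        if flow in self.connections or reply in self.connections:
            return "ESTABLISHED"   # belongs to an existing connection
        self.connections.add(flow) # start tracking this new flow
        return "NEW"

tracker = Conntrack()
print(tracker.classify("10.0.0.2", "93.184.216.34", 80))  # NEW (outbound request)
print(tracker.classify("93.184.216.34", "10.0.0.2", 80))  # ESTABLISHED (the reply)
```

Rule 3 is, conceptually, "if the classification is ESTABLISHED or RELATED, accept without further rule checking."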

4: iptables -N LOGDROP

With this handy user-defined chain, iptables can log all dropped packets. Of course, this is only part of the setup. To complete it, you need to add the following two rules:

a) iptables -A LOGDROP -j LOG

b) iptables -A LOGDROP -j DROP

Now any rule that uses LOGDROP as its target will send matching packets to this chain, which will log them and then drop them.

5: iptables -t nat -A PREROUTING -i WLAN_INTERFACE -p tcp --dport PORTNUMBERS -j DNAT --to-destination DESTINATION_IP

When you need to route packets from external sources to specific ports on specific internal machines, this is what you have to do. This rule takes advantage of network address translation to route packets properly. To suit your needs, the WLAN_INTERFACE must be changed to the WLAN interface that bridges the external network to the internal network, the PORTNUMBERS must be changed, and DESTINATION_IP must be changed to match the IP address of the destination machine.

6: iptables -A INPUT -p tcp --syn --dport 25 -j ACCEPT

This is the beginning of a SYN flood protection rule set. This portion accepts new connections to a mail server port. (You can change the port to suit your needs.)
There are three more portions of this rule set. The first is to repeat the rule above for each additional port you have open. The next portion is iptables -A INPUT -p tcp --syn -m limit --limit 1/s --limit-burst 4 -j ACCEPT, which is the actual SYN flood protection: it rate-limits incoming SYN packets. Finally, iptables -A INPUT -p tcp --syn -j DROP will drop all SYN packets that exceed the limit.
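The --limit 1/s --limit-burst 4 match behaves like a token bucket: up to four SYN packets are accepted immediately, and after that only one per second. Here is a minimal Python sketch of that logic (the `TokenBucket` class is invented for illustration; it is not how netfilter implements the limit match internally):

```python
class TokenBucket:
    """Mimics iptables' limit match: 'rate' tokens/sec, 'burst' capacity."""
    def __init__(self, rate, burst):
        self.rate = rate           # tokens added per second (--limit)
        self.capacity = burst      # bucket size (--limit-burst)
        self.tokens = burst        # bucket starts full
        self.last = 0.0            # time of the last packet seen

    def allow(self, now):
        # Refill tokens for the elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1       # spend a token: packet is ACCEPTed
            return True
        return False               # bucket empty: falls through to DROP

bucket = TokenBucket(rate=1, burst=4)
# Five SYNs arriving at the same instant: the first four pass, the fifth is dropped.
print([bucket.allow(0.0) for _ in range(5)])  # [True, True, True, True, False]
```

A second later, one token has been refilled, so one more SYN would be accepted; a flood arriving faster than one per second keeps hitting the final DROP rule.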

7: iptables -A INPUT -p tcp -m tcp -s MALICIOUS_ADDRESS -j DROP

This is where you can take care of malicious source IP addresses. For this to work properly, you must make sure you know the offending source IP address and that, in fact, it's one you want to block. The biggest problem with this occurs when the offending address has been spoofed. If that's the case, you can wind up blocking legitimate traffic from reaching your network. Do your research on this address.

8: iptables -N port-scan

This is the beginning of a rule to block furtive port scanning. A furtive (stealth) port scan probes ports without completing a full connection, using the responses from closed ports to deduce which ports are open. Two more lines are needed to complete this rule:

iptables -A port-scan -p tcp --tcp-flags SYN,ACK,FIN,RST RST -m limit --limit 1/s -j RETURN

iptables -A port-scan -j DROP

Notice that the above rule set creates a new chain called "port-scan". You don't have to use that name; it just keeps things organized. You can then jump to this chain (and a similar syn-flood chain) from another rule set like so:

iptables -A specific-rule-set -p tcp --syn -j syn-flood

iptables -A specific-rule-set -p tcp --tcp-flags SYN,ACK,FIN,RST RST -j port-scan

9: iptables -A INPUT -i eth0 -p tcp -m state --state NEW -m multiport --dports ssh,smtp,http,https -j ACCEPT

What you see here is a rule making use of the multiport match, which allows you to specify multiple ports in a single rule. This one rule saves you from writing out four separate rules, one each for ssh, smtp, http, and https. Naturally, you can apply this to ACCEPT, DROP, or REJECT targets.

10: iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -m state --state NEW -m nth --counter 0 --every 4 --packet 0 -j DNAT --to-destination 192.168.1.10:80

If you're looking to load balance between multiple mirrored servers (in the example case, load balancing a Web server at 192.168.1.10), this rule is what you want. At the heart of this rule is the nth extension (a patch-o-matic add-on, not part of stock iptables), which tells iptables to act on every "nth" packet. In the example, iptables uses counter 0 and acts upon every 4th packet. You can extend this to balance out your mirrored sites. Say you have four mirrored servers up and you want to balance the load between them. You could have one line for each server, like so:

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -m state --state NEW -m nth --counter 0 --every 4 --packet 0 -j DNAT --to-destination 192.168.1.10:80

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -m state --state NEW -m nth --counter 0 --every 4 --packet 1 -j DNAT --to-destination 192.168.1.20:80

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -m state --state NEW -m nth --counter 0 --every 4 --packet 2 -j DNAT --to-destination 192.168.1.30:80

iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -m state --state NEW -m nth --counter 0 --every 4 --packet 3 -j DNAT --to-destination 192.168.1.40:80

As you can see, the server on .10 will receive packet 0 of every cycle of four, the server on .20 packet 1, the server on .30 packet 2, and the server on .40 packet 3.
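The nth match's behavior amounts to sending connection i to backend i mod 4. A small Python sketch of the same round-robin dispatch (the `pick_backend` helper is invented for illustration; the server list mirrors the example above):

```python
BACKENDS = ["192.168.1.10", "192.168.1.20", "192.168.1.30", "192.168.1.40"]

def pick_backend(packet_index, backends=BACKENDS):
    # Mirrors --every 4 --packet N: packet 0 -> .10, packet 1 -> .20, and so on.
    return backends[packet_index % len(backends)]

# The first eight new connections cycle through the four mirrors twice:
# .10, .20, .30, .40, then repeat.
print([pick_backend(i) for i in range(8)])
```

Note this is stateless balancing: it spreads new connections evenly but knows nothing about each server's actual load.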

These 10 iptables rules will help you secure your Linux server. Of course, with anything Linux, there are multiple possibilities to achieve the same results. But these rules should serve as an outstanding springboard for Linux server security -- as well as Linux security discussion.

Saturday, November 13, 2010

Five Reasons why Linux is not winning over Windows:

Linux has soared in recent years, both in servers and on desktops. Several factors are contributing to this surge, all happening at once. First, there is a shift in the trend from powerful desktops to smaller, less powerful notebooks, and now netbooks. In addition, more user- (and media-) friendly Linux distributions, such as Ubuntu, have hit the scene. Not to forget the surge in the embedded world, especially with the Linux-based Android giving it a boost in smartphone sales. Finally, here comes the OS everybody loves to hate: Windows Vista, significantly more resource hungry than XP and perhaps released a little too soon. All these factors probably made many users think of giving Linux a shot.

Most people have heard of Linux as something tedious they came across at college. Others think of it as something used by scientists and run on powerful but expensive workstations. The main concern with Linux is that it is not as easy as MacOS or Windows, and users simply miss their familiar Windows functions. Likewise, there are a number of reasons why Linux isn't winning over Windows. I'm going to look at 5 of these reasons, some that apply primarily to servers, some to desktops, and some to both.

1)     Misleading Cost comparisons:

Let’s get what may be the most controversial point out of the way early. First, in the server space especially, we should try to compare apples to apples. This means comparing Windows Server to paid Linux. By far the most dominant flavor is Red Hat Enterprise Linux (RHEL), with about a two-thirds share of the paid enterprise Linux market, so this seems the most logical comparison. While there are plenty of free options out there, such as CentOS, for a business running mission-critical workloads, an unsupported operating system is a hard pill to swallow.
There are a couple of ways we can look at cost, neither of which is nearly as flattering to Linux as one might expect. First, we can look at the costs directly related to the acquisition of the platform. RHEL is a subscription-based license, meaning that rather than pay for the software itself, you pay for support. This doesn't mean just phone tech support or troubleshooting (although that is included too, whether you want it or not) but also includes standard patches and bug fixes. Standard support for RHEL 5 Advanced Platform is $1,499 per year per server, or $4,047 for three years. Compare this with $3,999 for Windows Server 2008 Enterprise edition with free patching and bug fixes, and you can basically call it a wash unless you use a lot of phone support. And there are also features that aren't included and must be purchased separately, such as Red Hat Directory Server, which adds a few more bucks per annum. The other way of looking at cost is total cost of ownership (TCO) of the platform, and this explains our next reason.

2)     Windows Experts are more readily available

When looking at TCO, we’re not just looking at the software costs but also at staffing and administration costs, costs due to downtime, hardware costs, etc. Of these, staffing is the largest, accounting for more than half of the TCO. Here, Windows wins out because IT pros experienced with Windows are much more plentiful and generally cheaper to hire than Linux experts and because they can often be more productive.
With Linux, efficient management over many machines usually means going to the command line and pounding out a script to automate a process -- which is cool. However, with Windows Server 2008, PowerShell is now built in, which means the Windows guys can do that too, arguably better. Add that to the System Center family of tools, where virtually all management tasks are available at the click of a button (and which really have no peer on the Linux side), and Windows is simply easier to manage.

3)     Linux, not competing Head to Head

The last reason Linux isn't winning over Windows on the server side is that it's not really the primary focus. Right now, both Linux and Windows are gaining in server market share. How is that possible? Old granddaddy UNIX is being thrown under the bus to make it happen. Today, companies are dumping their old mainframe or proprietary UNIX servers for cheaper x86-based commodity hardware. It's easy for a Linux sales guy to come in and make the value proposition: "It's essentially the UNIX you know and love, but it runs on hardware a fraction of the cost."
Unfortunately, the market for UNIX conversions and mainframe modernization is drying up. When those deals are gone, Linux will have to compete head-to-head with Windows to continue its growth, and this is a much harder proposition to make. Why should an organization already using Windows change platforms and have to build whole new skill sets around Linux?

4)     Advancement in Hardware

While Windows 7 is significantly faster than Vista, I don’t claim that it will be as friendly to the lowest end hardware as Linux. As time marches on, hardware improves. We can now get a quad-core processor and 8 gigs of RAM in our lappies. Intel has a dual-core Atom processor out, and even if it is made for nettops rather than netbooks, it's a safe bet that a dual-core Atom with netbook-friendly power consumption levels is right around the corner. In any case, as hardware continues to advance, that aspect of the Linux argument will become more and more irrelevant.
Also, while we’re on the topic of netbooks, let’s not forget that while these may seemingly be the perfect candidates for conversion from Windows to Linux, according to a Laptop Magazine interview of MSI’s director of U.S. sales, Andy Tung, the return rate of netbooks running Linux is much greater than the rate of those running Windows.
“They start playing around with Linux and start realizing that it’s not what they are used to. They don’t want to spend time to learn it so they bring it back to the store. The return rate is at least four times higher for Linux netbooks than Windows XP netbooks”

5)     The much-acclaimed open source model doesn't stand up to scrutiny

Much of the hype about Linux is really about open source development in general. The buzzwords all sound good: open source is all about sharing, collaboration, the proliferation of knowledge, etc. To a certain extent, there is nothing wrong with the open source model, and it surely helps in the advancement of software development. As a business model and a model for end-user products, though, it's less reasonable. Here, it causes a lack of standardization. Egos of different developers collide, and the final product suffers. Let's not forget the old saying: "Too many cooks spoil the broth."
Another claim is that Linux and open source software are more secure than Windows and Microsoft software. This is largely based on problems with legacy versions of Windows. Back in the NT and Windows 2000 days, there were valid points to be made, but this is far less true today. The last several years have seen a massive emphasis on security across the industry. And now, with Windows Server 2008, Windows Vista, and the whole Forefront line of products, Microsoft is running a pretty tight ship -- enough so that major competitors such as Red Hat are not really bringing up the security argument against Windows anymore.

Sunday, November 7, 2010

Top IT Certifications! Are they worth it?


IT certifications boast numerous benefits. They bolster resumes, encourage higher salaries, and assist in job retention. IT managers need security certifications that will enhance their standing as generalists who will be prudent in any situation. The time and effort is also a worthwhile investment that can lead to better pay.

Which IT certifications are best?

Technology professionals debate over just this question. Many claim vendor-specific programs best measure a candidate’s skills, while others propose vendor-neutral certifications are the only worthy way of measuring real-world expertise.

Here is the answer:
A recent survey by Certification Magazine suggests that high-level security certifications such as CISSP are paying off handsomely. According to a survey by InfoSecurity magazine in 2002, IT professionals' average salaries overall decreased by 5.5%, while those in IT security increased by 3.1%. While this statistic is independent of certification, it does show that experience in security is a valuable skill. And it should also be evident that in most, if not all, cases, certifications should be vendor-neutral. This is because IT managers need a broad view of security that transcends the specific technical platforms that their department manages. Vendor-neutral certifications go beyond the specific technologies and deal with how the technologies are used. 
Once a career road map is in place, selecting a potential certification path becomes much easier. And that's where this list of the industry's best IT certifications comes into play. While this list may not include the best accreditations for you, it does catalog the top IT certifications that possess significant value for a wide range of technology professionals.

Here are some of the best and most widely known certifications available:

MCITP

The new-generation Microsoft Certified IT Professional credential, or MCITP for short, is likely to become the next big Microsoft certification. Available for a variety of fields of expertise—including database developer, database administrator, enterprise messaging administrator, and server administrator—an MCITP validates a professional’s proven job-role capabilities. Candidates must pass several Microsoft exams that track directly to their job role before earning the new designation.
As with Microsoft's other new-generation accreditations, the MCITP certification will retire when Microsoft suspends mainstream support for the platforms targeted within the MCITP exams. By matching the new certification to popular job roles, as has been done to some extent with CompTIA's Server+ (server administrator), Project+ (project manager), and A+ (desktop support) certifications, Microsoft has created a new certification that's certain to prove timely, relevant, and valuable.

MCTS

The new-generation Microsoft Certified Technology Specialist (MCTS) helps IT staff validate skills in installing, maintaining, and troubleshooting a specific Microsoft technology. The MCTS certifications are designed to communicate the skills and expertise a holder possesses on a specific platform. For example, candidates won’t earn an MCTS on SQL Server 2008. Instead, they'll earn an MCTS covering SQL Server business intelligence (MCTS: SQL Server 2008 Business Intelligence), database creation (MCTS: SQL Server 2008, Database Development), or SQL server administration (MCTS: SQL Server 2008, Implementation and Maintenance).

These new certifications require passing multiple, tightly targeted exams that focus on specific responsibilities on specific platforms. MCTS designations will expire when Microsoft suspends mainstream support for the corresponding platform. These changes, as with other new-generation Microsoft certifications, add value to the accreditation.

Security+

Security continues to be a critical topic. That's not going to change. In fact, its importance is only going to grow. One of the quickest ways to lose shareholder value, client confidence, and sales is to suffer a data breach. And no self-respecting technology professional wants to be responsible for such a breach.
CompTIA's Security+ accreditation provides a respected, vendor-neutral foundation for industry staff (with at least two years of experience) seeking to demonstrate proficiency with security fundamentals. While the Security+ accreditation consists of just a single exam, it could be argued that any IT employee charged with managing client data or other sensitive information should, at a minimum, possess this accreditation. The importance of ensuring staff are properly educated as to systems security, network infrastructure, access control, auditing, and organizational security principles is simply too important to take for granted.

MCPD

There’s more to information technology than just administration, support, and networking. Someone must create and maintain the applications and programs that power organizations. That’s where the new-generation Microsoft Certified Professional Developer (MCPD) credential comes into play. The MCPD accreditation measures a developer’s ability to build and maintain software solutions using Visual Studio 2008 and Microsoft .NET Framework 3.5. Split into three certification paths (Windows Developer 3.5, ASP.NET Developer 3.5, and Enterprise Applications Developer 3.5), the credential targets IT professionals tasked with designing, optimizing, and operating those Microsoft technologies to fulfill business needs.
A redesigned certification aimed at better-measuring real-world skills and expertise, the MCPD will prove important for developers and programmers. Besides requiring candidates to pass several exams, the MCPD certification will retire when Microsoft suspends mainstream support for the corresponding platform. The change is designed to ensure the MCPD certification remains relevant, which is certain to further increase its value.

CCNA

The Cisco Certified Internetwork Expert (CCIE) accreditation captures most of the networking company’s certification glory. But the Cisco Certified Network Associate (CCNA) might prove more realistic within many organizations.

In a world in which Microsoft and Linux administrators are also often expected to be networking experts, many companies don’t have the budgets necessary to train (or employ) a CCIE. But even small and midsize corporations can benefit from having their technology professionals earn basic proficiency administering Cisco equipment, as demonstrated by earning a CCNA accreditation.

As smaller companies become increasingly dependent upon remote access technologies, basic Cisco systems skills are bound to become more important. Although many smaller organizations will never have the complexity or workload necessary to keep a CCIE busy, Cisco’s CCNA is a strong accreditation for technology professionals with a few years’ experience seeking to grow and improve their networking skills.

A+

Technology professionals with solid hardware and support skills are becoming tougher to find. There’s not much glory in digging elbow-deep into a desktop box or troubleshooting Windows boot errors. But those skills are essential to keeping companies running.
Adding CompTIA’s A+ certification to a resume tells hiring managers and department heads that you have proven support expertise. Whether an organization requires desktop installation, problem diagnosis, preventive maintenance, or computer or network error troubleshooting, many organizations have found A+-certified technicians to be more productive than their non-certified counterparts.
Changes to the A+ certification, which requires passing multiple exams, are aimed at keeping the popular credential relevant. Basic requirements are now followed by testing that covers specific fields of expertise (such as IT, remote support, or depot technician). The accreditation is aimed at those working in desktop support, on help desks, and in the field, and while many such staff are new to the industry, the importance of an A+ certification should not be overlooked.

PMP

Some accreditations gain value by targeting specific skills and expertise. The Project Management Professional (PMP) certification is a great example. The Project Management Institute (PMI), a nonprofit organization that serves as a leading membership association for project management practitioners, maintains the PMP exam. The certification measures a candidate’s project management expertise by validating skills and knowledge required to plan, execute, budget, and lead a technology project. Eligible candidates must have five years of project management experience or three years of project management experience and 35 hours of related education.
As organizations battle tough economic conditions, having proven project scheduling, budgeting, and management skills will only grow in importance. The PMI’s PMP credential is a perfect conduit for demonstrating that expertise on a resume.

MCSE/MCSA

Even years after their introduction, Microsoft Certified Systems Engineer (MCSE) and Microsoft Certified Systems Administrator (MCSA) credentials remain valuable. But it's important to avoid interpreting these accreditations as meaning the holders are all-knowing gurus, as that's usually untrue. In my mind, the MCSE and MCSA hold value because they demonstrate the holder's capacity to complete a long and comprehensive education, training, and certification program requiring intensive study. Further, these certifications validate a wide range of relevant expertise (from client and server administration to security issues) on specific, widely used platforms.
Also important is the fact that these certifications tend to indicate holders have been working within the technology field for a long time. There’s no substitute for actual hands-on experience. Many MCSEs and MCSAs hold their certifications on Windows 2000 or Windows Server 2003 platforms, meaning they’ve been working within the industry for many years. While these certifications will be replaced by Microsoft’s new-generation credentials, they remain an important measure of foundational skills on Windows platforms.

High-level and recognizable: CISSP

The most comprehensive, prestigious and recognized security certification is the CISSP, or Certified Information Systems Security Professional.

As mentioned with the Security+ accreditation earlier, security is only going to grow in importance. Whatever an organization's mission, product, or service, security is paramount. (ISC)², which administers the Certified Information Systems Security Professional (CISSP) accreditation, has built a respected, vendor-neutral security certification. Designed for industry pros with at least five years of full-time experience, and accredited by the American National Standards Institute (ANSI), the CISSP is internationally recognized for validating candidates' expertise with operations and network and physical security, as well as their ability to manage risk and understand legal compliance responsibilities and other security-related elements.
The CISSP certification has been around since 1989, long before security was considered cool. The International Information Systems Security Certification Consortium administers the certification. In mid-2002, the 10,000th CISSP was certified. 
Exams are offered frequently in most parts of the world. More information is available at www.isc2.org.



CISA: Focusing on verifiability 

The first runner-up certification is the CISA, or Certified Information Systems Auditor. Once the exclusive domain of IT auditors, the CISA is quickly becoming a sought-after certification for senior-level personnel and management. The CISA's subject areas have moderate overlap with the CISSP, but it focuses more on business procedures than technology. And as you might expect, the CISA places an emphasis on auditing, which is glossed over by the CISSP. 

Linux+

The open source alternative is an important platform. Professionals who have Linux expertise and want to formalize that skill set will do well adding CompTIA's Linux+ certification to their resumes. The vendor-neutral exam, which validates basic Linux client and server skills, is designed for professionals with at least six to 12 months of hands-on Linux experience. In addition to being vendor-neutral, the exam is also distribution-neutral (meaning the skills it covers work well whether a candidate is administering Red Hat, SUSE, or Ubuntu systems).

Possession of one or more certifications, even CISSP or CISA, doesn't necessarily indicate the existence of good security intuition. However, these certifications are probably as good an objective measure as you can get on paper. Aim high and get the best certification you can within the next three to 12 months. The CISSP certification should be the long-term goal.

Friday, November 5, 2010

A Serious Little-known Threat: The Zombie Networks (aka Botnets)

A common axiom in the world of network security is that "the only way to truly protect your computer is to disconnect it from the Internet." In today's society, this is absolutely impractical.
What is a "Zombie?"
For those unfamiliar with the term "zombie", it is named after the monsters made famous in George Romero's horror films. A zombie is a computer that has been infected by malware, often a trojan, allowing an attacker to take control of the compromised PC and use it for malicious purposes like sending spam, hosting illegal websites, or launching DDoS attacks. A zombie computer is also known as a bot.
*DDoS ???*
DDoS (Distributed Denial of Service) attacks are launched with the aim of crashing a server by flooding it with packets of data. DDoS attacks are effective and dangerous because the traffic can come from hundreds of thousands of zombies. Users are often not even aware that their machines are participating in an attack. Imagine your server being bombarded by that many zombies!

Those who are not worried should reconsider their position: experts estimate that zombie networks comprise between one million and two million infected machines! And each day, this number increases dramatically. Your machine could really be one of them!

*Another side effect of zombie networks: SPAM!*
Who has never received spam in their inbox? Did you ever wonder why spammers almost never get arrested? The reasons are numerous; however, remember that they often use stolen resources, such as zombie computers, or are protected by corrupt Internet service providers (ISPs) in some foreign countries. That, along with many other tricks used to obfuscate their traces, makes it a challenge to catch them.

*Could my personal privacy be at risk?*
In fact, not really. Contrary to popular myth, most attackers are not interested in your latest love affairs; what they want is your Internet bandwidth and sensitive data like your passwords to commit their crimes!

Maybe you ask yourself: "I have no important accounts and I don't care about my bandwidth being stolen. So what's the deal?" That's a selfish question: on the Internet, what you do affects everyone. And, as you can see, botnets are a major venom poisoning the entire network. Would you let a complete stranger "borrow" your car for a hold-up? No. The same goes for your computer.

*But my computer is fine; it’s not slow at all!*
Attackers do not want you to know that your machine has fallen under their control. Still, if you are severely infected, you may notice symptoms like unwanted pop-ups, browser hijacking, slow-downs, etc.

In any case, be sure to scan your computer every week with an anti-virus and an anti-spyware tool. And do not forget a firewall: an unprotected machine exposed to the Internet can be compromised within minutes.

*Is it possible to avoid being infected?*

Sure thing! Spend a bit of time learning how to use a computer; scan emails, web pages, downloads and your hard drive for nasty stuff; resist the temptation to revise your human anatomy on doubtful sites; think before
clicking, and you should be in business!

If you do not put in the effort, you will keep getting infected. Happy surfing!

Thursday, November 4, 2010

Revealing Passwords behind asterisks, dots and hashes


This post is a continuation of my previous post on breaking into computers by changing the admin/root passwords and gaining admin privileges from a guest account (find here). Here we look at different tools to find out:

  • what lies behind the asterisks or dots in a password field,
  • how to extract encrypted password hashes,
  • how to crack the SAM file,
  • how to view the passwords saved in your browsers,
  • more on sniffing passwords.

Tools and their purpose:

SNADBOY’S REVELATION:

This application reveals the passwords hidden behind asterisks or dots.
Install and run the application; then, at a login page with username and password fields, drag the circled cursor from the application over the masked password field to reveal what is behind the asterisks or dots. Check the screen shot below:



Once the circled cursor is dragged over the asterisks or dots hiding the typed password, the application reads the underlying text and displays it in clear text form.


IEPASS VIEW:

When this application is run, it searches Internet Explorer’s Protected Storage for all stored user names and passwords and displays them alongside the websites they were used on. The application shows each website visited and the user name and password used to log on to that particular website.

LCP tool:

This is used to crack password hashes from the SAM file. As you may or may not know, the passwords on a Windows computer are stored, as hashes, in the SAM file.
When you type your password into a Windows NT, 2000, or XP login, Windows runs it through a one-way hashing scheme that turns your password into something that looks like this:

7524248b4d2c9a9eadd3b435c51404ee

This is a password hash, and it is what is actually checked against when you type your password in: Windows hashes what you typed and compares the result against what is stored in the registry and/or the SAM file.
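The check just described, hash what the user typed and compare it with the stored hash, can be sketched in a few lines of Python. Note the hedge: Windows actually uses the LM/NTLM schemes (DES- and MD4-based); SHA-256 is used here only as a stand-in, and the function names are mine, purely for illustration.

```python
import hashlib

def hash_password(password):
    # Stand-in for the LM/NTLM hashing Windows actually performs
    return hashlib.sha256(password.encode("utf-8")).hexdigest()

# What the SAM file conceptually holds: a hash, never the password itself
stored_hash = hash_password("secret123")

def login(attempt):
    # Hash the attempt and compare it against the stored hash
    return hash_password(attempt) == stored_hash

print(login("secret123"))  # True
print(login("guess"))      # False
```

The point of the one-way hash is that the system never needs the plain-text password again; recovering it is exactly the cracking problem discussed below.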

SAM file - Holds the user names and password hashes for every account on the local machine, or the domain if it is a domain controller. Simple enough, isn’t it?
This file is located on your computer’s hard drive in the directory
“C:\WINDOWS\System32\Config”; the file’s name is SAM. You may be thinking, “Wow, this is incredibly easy; you can just copy the file anywhere and read the contents.” Unfortunately, it’s not so simple. When you attempt to copy this file, you get an error along the lines of “Access is denied. The file is in use.” The SAM file is held open by the system, so you cannot just go to Task Manager and end the process. You need to find an alternate way of starting the computer that does not lock the SAM file. As far as I know, this can be done several ways; a fairly easy one is to use an MS-DOS boot disk.

Method of Obtaining SAM file:
Just insert a floppy, right-click on it in My Computer, and click Format. When the menu appears, check the box for “Create an MS-DOS startup disk”, and then click Start.
After you have made your disk, restart your computer with the disk still in the drive. Make sure your BIOS is set to boot from the floppy drive before the hard drive. When the computer boots, you should see something similar to the command prompt; “A:\>” is most likely the prompt you will see. First you need to switch to the C drive, which in DOS is done by typing the drive letter: “C:”.

Next you will need to use the copy command to copy the SAM and SYSTEM files to another area of the hard drive. The syntax for the copy command, without the quotes, is “copy (file to be copied) (destination)”, so the correct command will look like this: “copy C:\WINDOWS\System32\Config\SAM C:\”. This copies the SAM file to the root of the C drive. Replace “SAM” with “SYSTEM” to get the SYSTEM file as well.

Next you might want to rename these files. The syntax for the rename command is “ren (file to be renamed) (new file name)”, so “ren C:\SAM xyz” will rename the SAM file you just copied to xyz. Now restart your computer without the boot disk, start Windows, and copy the files onto a floppy.

These methods work because the SAM file is not in use when Windows is not running, and because the renamed copy is not touched by Windows when you log back on. Now on to the easy part: cracking the SAM file.

Run the LCP tool and import the SAM file; the tool lists all the user accounts with the corresponding NTLM and LM hashes of their passwords. If the assigned password is a dictionary word, it is cracked in a matter of seconds, but if the password is a strong one, cracking the hash may take some time.
This tool uses methods like brute-force, dictionary and hybrid attacks to crack the hash.
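The dictionary attack these tools perform is simple to sketch: hash each candidate word with the same algorithm and compare it against the target hash. MD5 stands in here for the real LM/NTLM algorithms, and the function name and word list are made up for illustration.

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Hash each candidate word and compare; return the match, or None."""
    for word in wordlist:
        if hashlib.md5(word.encode("utf-8")).hexdigest() == target_hash:
            return word
    return None

# Build a target hash to attack (in practice it would come from the SAM dump)
target = hashlib.md5(b"sunshine").hexdigest()

print(dictionary_attack(target, ["password", "letmein", "sunshine"]))  # sunshine
```

A brute-force attack is the same loop run over every possible character combination instead of a word list, which is why strong, non-dictionary passwords take so much longer to crack.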
Other tools used for cracking hashes are tomas and brutusA2. There are also web-based tools that can crack passwords online if you input your hash. Some of the sites are:
www.exploit-db.com (previously milw0rm.com)

and so on…

We can also use packet sniffers to capture all of the packets of data that pass through a given network interface. If the captured data contains a password hash, we can use the above tools to recover the plain text that produced that particular cipher text. More on packet sniffers in my next post! Till then, happy cracking!

Monday, November 1, 2010

Ethical Hacking in a Nutshell

This is the content of a seminar on hacking I attended back in my school days. I have tried to recollect the points and summarize them here. This post gives a brief introduction to hacking, and my forthcoming posts address the different modules of hacking and their countermeasures, all in a nutshell, for people who are bored of reading volumes of books on this subject yet interested in the material. I have done the homework of reading and experimenting so as to present the content here in the most concise way possible. As this is addressed to an audience with varying levels of knowledge, I will start everything from level zero. However, I assume readers have basic knowledge of computers and their terminology, and also a bit of networking and security knowledge.

When you hear the terms hacking and hackers, most people straight away associate the term hackers with computer criminals, people who cause harm to systems, release viruses and so on. And I do not blame them for holding such a negative opinion; unfortunately, one tends to blindly accept what is fed to them by popular media.

Types of hackers:

1) Black hat hackers: Masters of hacking!! Bad guys
A computer criminal sitting in the corner of a dark room, committing crimes: the one who breaks into a computer system or network with malicious intent.

2) White hat hackers: Good guys!
These are actually good, pleasant and extremely intelligent people, who by using their knowledge in a constructive manner help organizations to secure documents and company secrets, help the government to protect national documents of strategic importance and even sometimes help justice to meet its ends by ferreting out electronic evidence.

3) Grey hat hackers: Dangerous guys!!
A grey hat hacker is a combination of a black hat and a white hat hacker. A grey hat will surf the Internet and hack into a computer system for the sole purpose of notifying the administrator that their system has been compromised, and then suggest ways to secure it, charging a small fee.

Footprinting is the first step in hacking: collecting information about the target system/network to be hacked. This can be done using applications such as
Sam Spade, NeoTrace, Visual Route, eMailTrackerPro, etc. The same can be achieved on the web from the following sites:
and many more. These websites are useful in gathering info about the target system.
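Much of what these footprinting tools do begins with ordinary DNS lookups, which Python's standard library can also perform. A minimal sketch follows; it resolves localhost only so the example works offline, whereas against a real target you would resolve the target's domain name.

```python
import socket

def resolve(hostname):
    """Forward-resolve a hostname to an IPv4 address -- a basic footprinting step."""
    return socket.gethostbyname(hostname)

# Resolving localhost keeps this demo offline; a real lookup would
# take a target domain and return its public address.
print(resolve("localhost"))  # 127.0.0.1
```

Reverse lookups, WHOIS queries and traceroutes extend the same idea: each query adds another piece to the map of the target network.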

The next step after gathering information is to find ways to enter the network, which is called scanning. This is the first action taken against the target network; in this phase we use the info obtained from footprinting and move deeper into the network.
Scanning basically involves IP scanning, port and service scanning, and vulnerability scanning. Some of the tools used here are Angry IP Scanner, SuperScan, Retina and Shadow Security Scanner. These identify the open ports, the services behind them and the applications hosting those services, which in turn gives us an idea of what functions the server is performing; they can also determine the OS running on the machine, and so on. Once the OS and the services are determined, exploitation of those services can begin. Taking advantage of the vulnerabilities is called exploitation, which I discussed in my previous post Breaking in to computers - Newbie Series-2 here.
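At its core, the port scanning these tools perform boils down to attempting TCP connections and noting which succeed. A minimal sketch (my own, not how any of the named tools is implemented): the demo opens its own local listener so there is a known open port to find; remember that scanning hosts you do not own may be illegal.

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of ports on host that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Demo: listen on a free local port so the scanner has an open port to report
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
listener.listen(1)
demo_port = listener.getsockname()[1]

print(demo_port in scan_ports("127.0.0.1", [demo_port]))  # True
listener.close()
```

Real scanners add service banner grabbing and OS fingerprinting on top of this basic connect loop, which is how they report what application and operating system sit behind each open port.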


The next step is, obviously, the exploitation. Called “exploits and enumeration”, it involves checking the severity of the vulnerability or weakness. Based on this, we develop a piece of software or a script, or use already existing tools, to exploit it. Some of the common vulnerabilities can be found on the websites below:
Securityfocus.com
Packetstorm.com
Milw0rm.com
These sites give a brief description of each vulnerability, code/scripts to exploit it, and also security measures / ways to patch it. This is how we proceed further with tasks like sniffing, phishing, hacking email accounts, session hijacking, hacking websites, wifi hacking, etc. In my future posts I will discuss each of these activities in detail, and how you can identify and block unwanted intrusions into your network or system. I will stop here. Happy hacking! :)