
Tuesday, June 22, 2010

Tips To Make Your Computer Fast

The first thing you can do to speed up your PC is to speed up the boot sequence. After that, follow these tips:

1. Let your PC boot up completely before opening any applications.

2. Refresh the desktop after closing any application. This will remove any unused files from the RAM.

3. Do not set very large images as your wallpaper. Do not keep a wallpaper at all if your PC is low on RAM (less than 64 MB).

4. Do not clutter your desktop with a lot of shortcuts. Each shortcut on the desktop uses up to 500 bytes of RAM.

5. Empty the Recycle Bin regularly. Files are not really deleted from your hard drive until you empty it.

6. Delete your temporary internet files regularly (see the sketch after this list).

7. Defragment your hard drive once every two months. This will free up a lot of space on your hard drive and rearrange the files so that your applications run faster.

8. Always make two or more partitions on your hard drive, and install all large software (like PSP, Photoshop, 3DS Max, etc.) on the second partition. Windows uses all the available empty space in the C drive as virtual memory when your RAM is full, so keep the C drive as empty as possible.

9. When installing new software, disable the option of having a tray icon. Tray icons use up available RAM and also slow down the booting of your PC. Also disable the option of starting the application automatically when the PC boots; you can re-enable these options later from the Tools or Preferences menu of the application.

10. Protect your PC from dust. Dust causes the CPU cooling fan to jam and slow down, gradually heating your CPU and affecting its processing speed. Use compressed air to blow dust out of the case; never use a vacuum cleaner.
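Tip 6 above can even be scripted. Here is a minimal Python sketch (my addition, not part of the original tips) that empties the current user's temp folder, skipping any files that are locked or in use. Comment out the os.remove() line first if you want a dry run.

```python
import os
import tempfile

# Locate the current user's temp directory,
# e.g. C:\Users\<name>\AppData\Local\Temp on Windows.
temp_dir = tempfile.gettempdir()

freed_bytes = 0
for name in os.listdir(temp_dir):
    path = os.path.join(temp_dir, name)
    try:
        if os.path.isfile(path):
            size = os.path.getsize(path)
            os.remove(path)  # comment this out for a dry run
            freed_bytes += size
    except OSError:
        # The file is locked or in use; leave it alone.
        pass

print("Freed about %.1f MB" % (freed_bytes / (1024.0 * 1024.0)))
```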

# If you are looking to boost your internet connection speed, then read this post:
http://answersbyrahul.blogspot.com/2008/10/increase-speed-of-your-internet.html

# If you use Firefox, then to increase its speed see this post:
http://answersbyrahul.blogspot.com/2008/10/trick-to-make-your-firefox-fast.html

# If you use Internet Explorer, then to increase its speed see this post:
http://answersbyrahul.blogspot.com/2008/10/boosting-internet-explorer-by-10-times.html

Thursday, June 17, 2010

All About Ubuntu

Ubuntu is a computer operating system based on the Debian GNU/Linux distribution and is distributed as free and open source software, with additional proprietary software available.
It is named after the Southern African ethical ideology Ubuntu ("humanity towards others").

Ubuntu is composed of many software packages, of which the vast majority are distributed under a free software license (also known as open source). The main license used is the GNU General Public License (GNU GPL) which, along with the GNU Lesser General Public License (GNU LGPL), explicitly declares that users are free to run, copy, distribute, study, change, develop and improve the software. Ubuntu is sponsored by the UK-based company Canonical Ltd., owned by South African entrepreneur Mark Shuttleworth. By keeping Ubuntu free and open source, Canonical is able to utilize the talents of community developers in Ubuntu's constituent components. Instead of selling Ubuntu for profit, Canonical creates revenue by selling technical support and from creating several services tied to Ubuntu.

Features 
Ubuntu focuses on usability. The Ubiquity installer allows Ubuntu to be installed to the hard disk from within the Live CD environment, without the need to restart the computer prior to installation. Ubuntu comes installed with a wide range of software that includes OpenOffice.org, Firefox, Empathy (Pidgin in versions before 9.10), Transmission, GIMP (in versions prior to 10.04), and several lightweight games (such as Sudoku and chess). Additional software that is not installed by default can be downloaded using the package manager. Ubuntu allows networking ports to be closed using its firewall, with customized port selection available. End-users can install Gufw and keep it enabled.[30] GNOME (the current default desktop) offers support for more than 46 languages.[31] Ubuntu can also run many programs designed for Microsoft Windows (such as Microsoft Office) through Wine, or in a virtual machine (such as VMware Workstation or VirtualBox).

System requirements
The desktop version of Ubuntu currently supports the Intel x86, AMD64, and ARM[62] architectures. Some server releases also support the SPARC architecture.[63][64] Unofficial support is available for the PowerPC,[65] IA-64 (Itanium) and PlayStation 3 architectures (note however that Sony officially removed support for OtherOS on the PS3 with firmware 3.21, released on April 1, 2010[66]).
Recommended minimum system requirements:

Requirement                 CLI (Server)    GUI (Desktop)*
Processor (x86)             300 MHz         1 GHz
Memory                      128 MB          512 MB
Hard drive (free space)     1 GB            5 GB
Monitor resolution          640×480         1024×768

*Note: If visual effects are desired, a supported GPU is required.


Installation
(Ubuntu 10.04 LTS Live Mode)

Installation of Ubuntu is generally performed with the Live CD. The Ubuntu OS can be run directly from the CD (albeit with a significant performance loss), allowing a user to "test-drive" the OS for hardware compatibility and driver support. The CD also contains the Ubiquity installer,[32] which then can guide the user through the permanent installation process. CD images of all current and past versions are available for download at the Ubuntu web site.[33] Installing from the CD requires a minimum of 256 MB RAM.

Users can download a disk image (.iso) of the CD, which can then either be written to a physical medium (CD or DVD) or run directly from a hard drive (via UNetbootin or GRUB). Ubuntu is also available on ARM, PowerPC, SPARC, and IA-64 platforms, although of these only ARM is officially supported.[34]
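Before burning or booting the image, it is worth verifying that it downloaded intact. The sketch below (my addition, not from the article) compares the image's MD5 hash against the checksum published on the Ubuntu release page; the filename and expected sum are placeholders you would fill in yourself.

```python
import hashlib

iso_path = "ubuntu-10.04-desktop-i386.iso"          # placeholder filename
expected_md5 = "paste-the-published-md5-sum-here"   # from the release page

# Hash the image in 1 MB chunks so large files don't exhaust memory.
md5 = hashlib.md5()
with open(iso_path, "rb") as f:
    for chunk in iter(lambda: f.read(1024 * 1024), b""):
        md5.update(chunk)

if md5.hexdigest() == expected_md5.lower():
    print("Checksum matches: the image downloaded correctly.")
else:
    print("Checksum mismatch: re-download the image before using it.")
```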

Canonical offers Ubuntu[35] and Kubuntu[36] installation CDs at no cost, including paid postage for destinations in most countries around the world (via a service called ShipIt).

A Microsoft Windows migration tool, called Migration Assistant (introduced in April 2007),[37] can be used to import bookmarks, desktop background (wallpaper), and various settings from an existing MS Windows installation into a new Ubuntu installation.[38]

Ubuntu and Kubuntu can be booted and run from a USB Flash drive[39] (as long as the BIOS supports booting from USB), with the option of saving settings to the flashdrive. This allows a portable installation that can be run on any PC which is capable of booting from a USB drive.[40] In newer versions of Ubuntu, the USB creator program is available to install Ubuntu on a USB drive (with or without a LiveCD disc).

Wubi, which is included as an option on the Live CD,[41] allows Ubuntu to be installed and run from within a virtual Windows loop device (as a large image file that is managed like any other Windows program via the Windows Control Panel). This method requires no partitioning of a Windows user's hard drive. Wubi also makes use of the Migration Assistant to import users' settings. It is only useful for Windows users; it is not meant for permanent Ubuntu installations and it also incurs a slight performance loss.

Download Ubuntu for free from here:
www.ubuntu.com/desktop/get-ubuntu/download
                       or
Request a free Ubuntu CD here:
https://shipit.ubuntu.com/

Monday, April 20, 2009

EVOLUTION & HISTORY OF COMPUTER VIRUSES

Part 1


Like any other field in computer science, viruses have evolved, a great deal indeed, over the years. In this series of articles, we will look at the origins and evolution of malicious code, from the time it first appeared up to the present.

Going back to the origin of viruses, it was in 1949 that mathematician John von Neumann described self-replicating programs which could resemble computer viruses as they are known today. However, it was not until the 60s that we find the predecessor of today's viruses. In that decade, a group of programmers developed a game called Core Wars, which could reproduce every time it was run and even saturate the memory of other players' computers. The creators of this peculiar game also created the first antivirus, an application named Reaper, which could destroy copies created by Core Wars.

However, it was only in 1983 that one of these programmers announced the existence of Core Wars, which was described the following year in a prestigious scientific magazine: this was actually the starting point of what we call computer viruses today.

At that time, a still-young MS-DOS was starting to become the preeminent operating system worldwide. It was a system with great prospects but still many deficiencies, which arose from software developments and the lack of many hardware elements known today. Even so, this new operating system became the target of a virus in 1986: Brain, a malicious code created in Pakistan which infected the boot sectors of disks so that their contents could not be accessed. That year also saw the birth of the first Trojan: an application called PC-Write.

Shortly after, virus writers realized that infecting files could be even more harmful to systems. In 1987 a virus called Suriv-02 appeared, which infected COM files and opened the door to the infamous Jerusalem and Viernes 13 viruses. However, the worst was still to come: in 1988 the "Morris worm" appeared, infecting 6,000 computers.

From that date up to 1995, the types of malicious code known today started being developed: the first macro viruses appeared, then polymorphic viruses, and so on. Some of these even triggered epidemics, such as Michelangelo. However, one event changed the virus scenario worldwide: the massive adoption of the Internet and e-mail. Little by little, viruses adapted to this new situation until the appearance, in 1999, of Melissa, the first malicious code to cause a worldwide epidemic, opening a new era for computer viruses.


Part 2


This second installment of 'The evolution of viruses' will look at how malicious code used to spread before the use of the Internet and e-mail became as commonplace as it is today, and at the main objectives of the creators of those earlier viruses.

Until the worldwide web and e-mail were adopted as a standard means of communication the world over, the main media through which viruses spread were floppy disks, removable drives, CDs, etc., containing files that were already infected or with the virus code in an executable boot sector.

When a virus entered a system it could go memory resident, infecting other files as they were opened, or it could start to reproduce immediately, also infecting other files on the system. The virus code could also be triggered by a certain event, for example when the system clock reached a certain date or time. In this case, the virus creator would calculate the time necessary for the virus to spread and then set a date, often one with some particular significance, for the virus to activate. In this way, the virus would have an incubation period during which it didn't visibly affect computers, but just spread from one system to another waiting for 'D-day' to launch its payload. This incubation period was vital to the virus successfully infecting as many computers as possible.

One classic example of a destructive virus that lay low before releasing its payload was CIH, also known as Chernobyl. The most damaging version of this malicious code activated on April 26, when it would try to overwrite the flash-BIOS, the memory which includes the code needed to control PC devices. This virus, which first appeared in June 1998, had a serious impact for over two years and still continues to infect computers today.

Because of the way in which they propagate, these viruses spread very slowly, especially in comparison to the speed of today’s malicious code. Towards the end of the Eighties, for example, the Friday 13th (or Jerusalem) virus needed a long time to actually spread and continued to infect computers for some years. In contrast, experts reckon that in January 2003, SQLSlammer took just ten minutes to cause global communication problems across the Internet.

Notoriety versus stealth

For the most part, in the past, the activation of a malicious code triggered a series of on-screen messages or images, or caused sounds to be emitted, to catch the user's attention. Such was the case with the Ping Pong virus, which displayed a ball bouncing from one side of the screen to the other. This kind of elaborate display was used by the creator of the virus to gain as much notoriety as possible. Nowadays, however, the opposite is the norm, with virus authors trying to make malicious code as discreet as possible, infecting users' systems without them noticing that anything is amiss.



Part 3


This third installment of ‘The evolution of viruses’ will look at how the Internet and e-mail changed the propagation techniques used by computer viruses.

Internet and e-mail revolutionized communications. However, as expected, virus creators didn’t take long to realize that along with this new means of communication, an excellent way of spreading their creations far and wide had also dawned. Therefore, they quickly changed their aim from infecting a few computers while drawing as much attention to themselves as possible, to damaging as many computers as possible, as quickly as possible. This change in strategy resulted in the first global virus epidemic, which was caused by the Melissa worm.

With the appearance of Melissa, the economic impact of a virus started to become an issue. As a result, users, above all companies, started to become seriously concerned about the consequences of viruses for the security of their computers. This is how users discovered antivirus programs, which started to be installed widely. However, this also brought about a new challenge for virus writers: how to slip past this protection, and how to persuade users to run infected files.

The answer came in the form of a new worm: Love Letter, which used a simple but effective ruse that could be considered an early type of social engineering. This strategy involves inserting false messages that trick users into thinking the message contains anything but a virus. This worm's bait was simple: it led users to believe that they had received a love letter.

This technique is still the most widely used. However, it is closely followed by another tactic that has been the center of attention lately: exploiting vulnerabilities in commonly used software. This strategy offers a range of possibilities depending on the security hole exploited. The first malicious code to use this method, and quite successfully, were the BubbleBoy and Kakworm worms. These worms exploited a vulnerability in Internet Explorer by inserting HTML code in the body of the e-mail message, which allowed them to run automatically, without the user needing to do a thing.

Vulnerabilities allow many different types of actions to be carried out. For example, they allow viruses to be dropped onto computers directly from the Internet, as with the Blaster worm. In fact, the effects of the virus depend on the vulnerability that the virus author tries to exploit.



Part 4


In the early days of computers, there were relatively few PCs likely to contain “sensitive” information, such as credit card numbers or other financial data, and these were generally limited to large companies that had already incorporated computers into working processes.

In any event, information stored in computers was not likely to be compromised, unless the computer was connected to a network through which the information could be transmitted. Of course, there were exceptions to this and there were cases in which hackers perpetrated frauds using data stored in IT systems. However, this was achieved through typical hacking activities, with no viruses involved.

The advent of the Internet, however, caused virus creators to change their objectives: from that moment on, they tried to infect as many computers as possible in the shortest time. Also, the introduction of Internet services like e-banking and online shopping brought another change. Some virus creators started writing malicious code not to infect computers, but to steal confidential data associated with those services. Evidently, to achieve this they needed viruses that could infect many computers silently.

Their malicious labor was finally rewarded with the appearance, in 1986, of a new breed of malicious code generically called the "Trojan Horse", or simply "Trojan". This first Trojan was called PC-Write and tried to pass itself off as the shareware version of a text processor. When run, the Trojan displayed a functional text processor on screen. The problem was that, while the user wrote, PC-Write deleted and corrupted files on the computer's hard disk.

After PC-Write, this type of malicious code evolved very quickly to reach the stage of present-day Trojans. Today, many of the people who design Trojans to steal data cannot be considered virus writers but simply thieves who, instead of using blowtorches or dynamite, have turned to viruses to commit their crimes. Ldpinch.W and the Bancos and Tolger families of Trojans are examples of this.


Part 5


Even though no field can be left aside, some particular fields of computer science have played a more decisive role than others in the evolution of viruses. One of the most influential has been the development of programming languages.

These languages are basically a means of communication with computers in order to tell them what to do. Even though each of them has its own specific development and formulation rules, computers in fact understand only one language called "machine code".

Programming languages act as an interpreter between the programmer and the computer. Obviously, the more directly you can communicate with the computer, the better it will understand you, and the more complex the actions you can ask it to perform.

On this basis, programming languages can be divided into "low-level" and "high-level" languages, depending on whether their syntax is more understandable for programmers or for computers. A "high-level" language uses expressions that are easily understandable for most programmers, but not so much for computers. Visual Basic and C are good examples of this type of language.

On the contrary, expressions used by "low-level" languages are closer to machine code, but are very difficult to understand for someone who has not been involved in the programming process. One of the most powerful and most widely used examples of this type of language is assembler.

In order to explain the use of programming languages through virus history, it is necessary to refer to hardware evolution. It is not difficult to understand that an old 8-bit processor does not have the power of modern 64-bit processors, and this of course, has had an impact on the programming languages used.

In this and the next installments of this series, we will look at the different programming languages used by virus creators through computer history:

- Virus predecessors: Core Wars

As explained in the first chapter of this series, a group of programs called Core Wars, developed by engineers at an important telecommunications company, is considered the predecessor of current-day viruses. Computer science was still in its early stages and programming languages had hardly developed. For this reason, the authors of these proto-viruses used a language that was almost equal to machine code to program them.

Curiously enough, it seems that one of the Core Wars programmers was Robert Thomas Morris, whose son programmed, years later, the "Morris worm". This malicious code became extraordinarily famous since it managed to infect 6,000 computers, an impressive figure for 1988.

- The new gurus of the 8-bit era and assembler language

The names Altair, IMSAI and Apple in the USA, and Sinclair, Atari and Commodore in Europe, bring back memories of times gone by, when a new generation of computer enthusiasts "fought" to establish their place in the programming world. To be the best, programmers needed a profound knowledge of machine code and assembler, as interpreters of high-level languages used too much run time. BASIC, for example, was a relatively easy-to-learn language which allowed users to develop programs simply and quickly. It had, however, many limitations.

This caused the appearance of two groups of programmers: those who used assembler and those who turned to high-level languages (BASIC and PASCAL, mainly).

Computer aficionados of the time enjoyed themselves more by programming useful software than malware. However, 1981 saw the birth of what can be considered the first 8-bit virus. Its name was "Elk Cloner", and it was programmed in machine code. This virus could infect Apple II systems and displayed a message when it infected a computer.



Part 6


Computer viruses evolve in much the same way as in other areas of IT. Two of the most important factors in understanding how viruses have reached their current level are the development of programming languages and the appearance of increasingly powerful hardware.

In 1981, almost at the same time as Elk Cloner (the first virus for 8-bit processors) made its appearance, a new operating system was growing in popularity. Its full name was Microsoft Disk Operating System, although computer buffs throughout the world would soon refer to it simply as DOS.

DOS viruses

The development of MS-DOS systems occurred in parallel with the appearance of new, more powerful hardware. Personal computers were gradually establishing themselves as tools that people could use in their everyday lives, and the result was that the number of PC users grew substantially. Perhaps inevitably, more users also started creating viruses. Gradually, we witnessed the appearance of the first viruses and Trojans for DOS, written in assembler language and demonstrating a degree of skill on the part of their authors.

Far fewer programmers know assembler language than are familiar with high-level languages, which are far easier to learn. Malicious code written in Fortran, Basic, Cobol, C and Pascal soon began to appear. The last two languages, which are well established and very powerful, were the most widely used, particularly in their Turbo C and Turbo Pascal versions. This ultimately led to the appearance of "virus families": that is, viruses followed by a vast number of related viruses which are slightly modified forms of the original code.

Other users took the less ‘artistic’ approach of creating destructive viruses that did not require any great knowledge of programming. As a result, batch processing file viruses or BAT viruses began to appear.

Win16 viruses

The development of 16-bit processors led to a new era in computing. The first consequence was the birth of Windows, which, at the time, was just an application to make it easier to handle DOS using a graphic interface.

The structure of Windows 3.xx files is rather difficult to understand, and the assembler language code involved is very complicated, as a result of which few programmers initially attempted to develop viruses for this platform. But this problem was soon solved thanks to the development of programming tools for high-level languages, above all Visual Basic. This application was so effective that many virus creators adopted it as their 'daily working tool'. This meant that writing a virus had become a very straightforward task, and viruses soon appeared in their hundreds. This development was accompanied by the appearance of the first Trojans able to steal passwords. As a result, more than 500 variants of the AOL Trojan family, designed to steal personal information from infected computers, were identified.


Part 7

This seventh installment in the history of computer viruses will look at how the development of Windows and Visual Basic influenced the evolution of viruses; as these developed, worldwide epidemics evolved with them, such as the first one, caused by Melissa in 1999.

While Windows changed from being an application designed to make DOS easier to manage to a 32-bit platform and operating system in its own right, virus creators went back to using assembler as the main language for programming viruses.

Versions 5 and 6 of Visual Basic (VB) were developed, making it the preferred tool, along with Borland Delphi (the Pascal development for the Windows environment), for Trojan and worm writers. Then, Visual C, a powerful environment developed in C for Windows, was adopted for creating viruses, Trojans and worms. This last type of malware gained unusual strength, taking over almost all other types of viruses. Even though the characteristics of worms have changed over time, they all have the same objective: to spread to as many computers as possible, as quickly as possible.

With time, Visual Basic became extremely popular and Microsoft implemented part of the functionality of this language as an interpreter capable of running script files with a similar syntax.

At the same time as the Win32 platform was implemented, the first script viruses also appeared: malware inside a simple text file. These demonstrated that not only executable files (.EXE and .COM files) could carry viruses. As already seen with BAT viruses, there are other means of propagation too, proving the saying that "anything that can be executed directly or through an interpreter can contain malware." To be specific, the first viruses that infected the macros included in Microsoft Office emerged. As a result, Word, Excel, Access and PowerPoint became ways of spreading 'lethal weapons' which destroyed information when the user simply opened a document.

Melissa and self-executing worms

The powerful script interpreters in Microsoft Office allowed virus authors to arm their creations with the characteristics of worms. A clear example is Melissa, a Word macro virus with the characteristics of a worm that infects Word 97 and 2000 documents. This worm automatically sends itself out as an attachment to an e-mail message to the first 50 contacts in the Outlook address book on the affected computer. This technique, which has unfortunately become very popular nowadays, was first used in this virus which, in 1999, caused one of the largest epidemics in computer history in just a few days. In fact, companies like Microsoft, Intel or Lucent Technologies had to block their connections to the Internet due to the actions of Melissa.

The technique started by Melissa was developed further in 1999 by viruses like VBS/Freelink, which, unlike its predecessor, sent itself out to all the contacts in the address book on the infected PC. This started a new wave of worms capable of sending themselves out to all the contacts in the Outlook address book on the infected computer. Of these, the worm that most stands out is VBS/LoveLetter, more commonly known as 'I Love You', which emerged in May 2000 and caused an epidemic with damage estimated at 10,000 million euros. In order to get the user's attention and help it spread, this worm sent itself out in an e-mail message with the subject 'ILOVEYOU' and an attached file called 'LOVE-LETTER-FOR-YOU.TXT.VBS'. When the user opened this attachment, the computer was infected.

As well as Melissa, another type of virus emerged in 1999 that also marked a milestone in virus history. In November of that year, VBS/BubbleBoy appeared, a new type of Internet worm written in VBScript. VBS/BubbleBoy ran automatically without the user needing to click on an attached file, as it exploited a vulnerability in Internet Explorer 5 to run when the message was opened or viewed. This worm was followed in 2000 by JS/Kak.Worm, which spread by hiding behind JavaScript in the auto-signature of Microsoft Outlook Express, allowing it to infect computers without the user needing to run an attached file. These were the first samples of a series of worms, which were joined later on by worms capable of attacking computers while the user is browsing the Internet.

Sunday, November 30, 2008

BandWidth Explained

This is a well-written explanation of bandwidth and very useful info. I found this article on the net and liked it; hope you like it too.



BandWidth Explained

Most hosting companies offer a variety of bandwidth options in their plans. So exactly what is bandwidth as it relates to web hosting? Put simply, bandwidth is the amount of traffic that is allowed to occur between your web site and the rest of the internet. The amount of bandwidth a hosting company can provide is determined by their network connections, both internal to their data center and external to the public internet.


Network Connectivity

The internet, in the simplest of terms, is a group of millions of computers connected by networks. These connections within the internet can be large or small depending upon the cabling and equipment that is used at a particular internet location. It is the size of each network connection that determines how much bandwidth is available. For example, if you use a DSL connection to connect to the internet, you have 1.54 megabits per second (Mbps) of bandwidth. Bandwidth is therefore measured in bits per second (a bit is a single 0 or 1). Bits are grouped into bytes which form words, text, and other information that is transferred between your computer and the internet.

If you have a DSL connection to the internet, you have dedicated bandwidth between your computer and your internet provider. But your internet provider may have thousands of DSL connections to their location. All of these connections aggregate at your internet provider, who then has their own dedicated connection to the internet (or multiple connections) which is much larger than your single connection. They must have enough bandwidth to serve your computing needs as well as all of their other customers. So while you have a 1.54 Mbps connection to your internet provider, your internet provider may have a 255 Mbps connection to the internet so it can accommodate about 165 connections like yours (255 / 1.54 ≈ 165).


Traffic

A very simple analogy to use to understand bandwidth and traffic is to think of highways and cars. Bandwidth is the number of lanes on the highway and traffic is the number of cars on the highway. If you are the only car on a highway, you can travel very quickly. If you are stuck in the middle of rush hour, you may travel very slowly since all of the lanes are being used up.

Traffic is simply the number of bits that are transferred over network connections. It is easiest to understand traffic using examples. One gigabyte is 2 to the 30th power (1,073,741,824) bytes. One gigabyte is equal to 1,024 megabytes. To put this in perspective, it takes one byte to store one character. Imagine 100 file cabinets in a building, each of which holds 1,000 folders. Each folder has 100 papers, and each paper contains 100 characters: a GB is roughly all the characters in the building (100 × 1,000 × 100 × 100 = 1,000,000,000 characters). An MP3 song is about 4 MB, the same song in WAV format is about 40 MB, and a full-length movie can be 800 MB to 1,000 MB (1,000 MB ≈ 1 GB).

If you were to transfer this MP3 song from a web site to your computer, you would create 4 MB of traffic between the web site you are downloading from and your computer. Depending upon the network connection between the web site and the internet, the transfer may occur very quickly, or it could take time if other people are also downloading files at the same time. If, for example, the web site you download from has a 10 Mbps connection to the internet, and you are the only person accessing that web site to download your MP3, your 4 MB file will be the only traffic on that web site. However, if three people are all downloading that same MP3 at the same time, 12 MB (3 x 4 MB) of traffic has been created. Because the host's connection can only move so many bits per second, the network equipment at the hosting company will cycle through each person downloading the file and transfer a small portion at a time, so each person's file transfer can take place, but the transfer for everyone downloading the file will be slower. If 100 people all came to the site and downloaded the MP3 at the same time, the transfers would be extremely slow. If the host wanted to decrease the time it takes to download files simultaneously, it could increase the bandwidth of its internet connection (at a cost due to upgrading equipment).
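To make the arithmetic in that example concrete, here is a quick Python sketch (my addition, not from the article). Note the units: file sizes are usually quoted in megabytes (MB), link speeds in megabits per second (Mbps), and there are 8 bits in a byte.

```python
# One 4 MB MP3 over the web site's 10 Mbps link, at full speed:
file_size_mb = 4.0
link_speed_mbps = 10.0

seconds_alone = (file_size_mb * 8) / link_speed_mbps
print("One download alone: about %.1f seconds" % seconds_alone)

# Three simultaneous downloads share the same 10 Mbps link,
# so each transfer gets roughly a third of the bandwidth:
print("Three at once: about %.1f seconds each" % (seconds_alone * 3))
```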


Hosting Bandwidth

In the example above, we discussed traffic in terms of downloading an MP3 file. However, each time you visit a web site you are creating traffic, because in order to view a web page on your computer, the page is first downloaded to your computer (between the web site and you) and then displayed using your browser software (Internet Explorer, Netscape, etc.). The page itself is simply a file that creates traffic just like the MP3 file in the example above (however, a web page is usually much smaller than a music file).

A web page may be very small or large depending upon the amount of text and the number and quality of images integrated within it. For example, the home page for CNN.com is about 200 KB (200 kilobytes = 200,000 bytes = 1,600,000 bits). This is fairly large for a web page. In comparison, Yahoo's home page is about 70 KB.


How Much Bandwidth Is Enough?

It depends (don't you hate that answer?). But in truth, it does. Since bandwidth is a significant determinant of hosting plan prices, you should take time to determine just how much is right for you. Almost all hosting plans measure bandwidth allowances per month, so you need to estimate the amount of bandwidth that will be required by your site on a monthly basis.

If you do not intend to provide file download capability from your site, the formula for calculating bandwidth is fairly straightforward:

Average Daily Visitors x Average Page Views x Average Page Size x 31 x Fudge Factor

If you intend to allow people to download files from your site, your bandwidth calculation should be:

[(Average Daily Visitors x Average Page Views x Average Page Size) +
(Average Daily File Downloads x Average File Size)] x 31 x Fudge Factor

Let us examine each item in the formula:

Average Daily Visitors - The number of people you expect to visit your site, on average, each day. Depending upon how you market your site, this number could be from 1 to 1,000,000.

Average Page Views - On average, the number of web pages you expect a person to view. If you have 50 web pages in your web site, an average person may only view 5 of those pages each time they visit.

Average Page Size - The average size of your web pages, in Kilobytes (KB). If you have already designed your site, you can calculate this directly.

Average Daily File Downloads - The number of downloads you expect to occur on your site. This is a function of the numbers of visitors and how many times a visitor downloads a file, on average, each day.

Average File Size - Average file size of files that are downloadable from your site. Similar to your web pages, if you already know which files can be downloaded, you can calculate this directly.

Fudge Factor - A number greater than 1. Using 1.5 would be safe, which assumes that your estimate is off by 50%. However, if you were very unsure, you could use 2 or 3 to ensure that your bandwidth requirements are more than met.

Usually, hosting plans offer bandwidth in terms of Gigabytes (GB) per month. This is why our formula takes daily averages and multiplies them by 31.
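Here is the formula above turned into a small Python sketch (my addition; the input numbers are made up for a hypothetical site, not real statistics).

```python
# Inputs: all placeholder estimates for an imaginary site.
avg_daily_visitors = 500
avg_page_views = 5            # pages viewed per visit
avg_page_size_kb = 70         # about the size of Yahoo's home page above
avg_daily_downloads = 50
avg_file_size_kb = 4000       # a 4 MB MP3
fudge_factor = 1.5            # assume the estimate is off by 50%

monthly_kb = ((avg_daily_visitors * avg_page_views * avg_page_size_kb)
              + (avg_daily_downloads * avg_file_size_kb)) * 31 * fudge_factor

# 1 GB = 1,024 MB = 1,024 * 1,024 KB
print("Estimated need: %.1f GB per month" % (monthly_kb / (1024.0 ** 2)))
```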


Summary

Most personal or small business sites will not need more than 1GB of bandwidth per month. If you have a web site that is composed of static web pages and you expect little traffic to your site on a daily basis, go with a low bandwidth plan. If you go over the amount of bandwidth allocated in your plan, your hosting company could charge you over usage fees, so if you think the traffic to your site will be significant, you may want to go through the calculations above to estimate the amount of bandwidth required in a hosting plan.

Saturday, October 25, 2008

TOP 10 HACKS OF HISTORY

Here is a list of the top 10 hacks of all time. I found it on another site and thought many of you would be interested in it.

Early 1990s
Kevin Mitnick, often called the god of hackers, broke into the computer systems of the world's top technology and telecommunications companies, including Nokia, Fujitsu, Motorola, and Sun Microsystems. He was arrested by the FBI in 1995 and later released on parole in 2000. He never termed his activity hacking; instead he called it social engineering.

November 2002
Englishman Gary McKinnon was arrested in November 2002 following accusations that he had hacked, from the UK, into more than 90 US military computer systems. He is currently fighting in a British court against a "fast-track extradition" to the US, where he is a wanted man. The next hearing in the case is slated for today.

1995
Russian computer geek Vladimir Levin effected what can easily be called The Italian Job online: he was the first person to hack into a bank to extract money. In early 1995, he hacked into Citibank and stole $10 million. Interpol arrested him in the UK in 1995, after he had transferred money to his accounts in the US, Finland, Holland, Germany and Israel.

1990
When a Los Angeles area radio station announced a contest awarding a Porsche 944 S2 to the 102nd caller, Kevin Poulsen took control of the entire city's telephone network, ensured he was the 102nd caller, and drove away with the Porsche. He was arrested later that year and sentenced to three years in prison. He is currently a senior editor at Wired News.

1983
Kevin Poulsen again. In a little-known incident, Poulsen, then just a student, hacked into Arpanet, the precursor to the Internet. Arpanet was a global network of computers, and Poulsen took advantage of a loophole in its architecture to gain temporary control of the US-wide network.

1996
US hacker Timothy Lloyd planted six lines of malicious code in the computer network of Omega Engineering, a prime supplier of components for NASA and the US Navy. The code was a "logic bomb" that, when it went off, deleted the software running Omega's manufacturing operations. Omega lost $10 million due to the attack.

1988
Twenty-three-year-old Cornell University graduate student Robert Morris unleashed the first Internet worm on the world. Morris released 99 lines of code to the internet as an experiment, then realised that his program was infecting machines as it went along. Computers crashed across the US and elsewhere. He was arrested and sentenced in 1990.

1999
The Melissa virus was the first of its kind to wreak damage on a global scale. Written by David Smith (then 30), Melissa spread to more than 300 companies across the world, crippling their computer networks. Damages reported amounted to nearly $400 million. Smith was arrested and sentenced to five years in prison.

2000
MafiaBoy, whose real identity has been kept under wraps because he is a minor, hacked into some of the largest sites in the world, including eBay, Amazon and Yahoo, between February 6 and Valentine's Day 2000. He gained access to 75 computers in 52 networks and launched a Denial of Service attack on them. He was arrested in 2000.

1993
They called themselves the Masters of Deception, and they targeted US phone systems. The group hacked into the National Security Agency, AT&T, and Bank of America. It created a system that let them bypass long-distance phone call systems and gain access to the PBXs of major carriers.

Friday, October 24, 2008

INTRODUCTION TO A SERVER

A server is a computer that hosts some sort of data and makes it available for access across a LAN or the Internet.

There is a major misunderstanding about servers among people, namely that "a server is very difficult and costly to set up." Actually, that is true only to an extent: costly servers are needed only when a large amount of internet traffic is expected, among other reasons. This means you can set up a small server in your home too. Game servers are an ideal example: you can host multiplayer games on your own server. Now let's get to the main job and set up a server.
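To show just how little it takes, here is a minimal sketch of a home file server (my addition, not from the original post): Python 3's built-in http.server module serves the files in the current directory over HTTP. Port 8000 is an arbitrary choice; any free port works, and it is the port you would forward on your router (see the section on forwarding ports below).

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

PORT = 8000  # arbitrary; forward this port on your router

# Serve the current directory to anyone who can reach this machine.
server = HTTPServer(("0.0.0.0", PORT), SimpleHTTPRequestHandler)
print("Serving files on port %d; press Ctrl+C to stop." % PORT)
server.serve_forever()
```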

Things To Be Considered:

NETWORK REQUIREMENTS
- A high speed connection (Cable/DSL or faster).
- Something to split your internet connection, such as a router. You can get a router at http://newegg.com for a few bucks if you don't already have one.
NOTE: You need to split your internet connection so that you can share it between your server and the computer you surf from while the server is running.

MINIMUM HARDWARE REQUIREMENTS (according to me):
- A CPU running at 350 MHz will do fine
- 256 MB of RAM
- A network card (NIC)
- A 10 GB hard drive (HDD)
- A 315 watt PSU
- An AGP video card (not really needed nowadays, as most motherboards have onboard graphics)

SOFTWARE REQUIREMENTS
You have basically two choices for an Operating system:

- Windows: If you are not familiar with Linux, then you should buy a copy of a Windows Server Edition operating system. Use at least Win2k Server Edition or higher.

- Linux: If you ask me, Linux is the better choice for an OS. The reasons are that it is easy to administer and "totally free". I suggest you use Ubuntu Server Edition, as it is a current, well-maintained Linux distribution. You can get a copy of it from the Ubuntu website as a free .iso image file, or order a CD for a nominal charge.


HOSTING & NETWORK SPEED EVALUATION:
Your network speed depends on what you host; e.g. hosting video files will consume more bandwidth than plain file hosting. So what you can host depends on your bandwidth.

Now you must be wondering what you can host on your newly set-up server. The answer is here:

- Game server > You can use your server for hosting multiplayer games and enjoy them with your friends on the network.
- Internet radio station > Your server can also be used for an internet radio station by using SHOUTcast. (http://shoutcast.com/ : This is the home page for the SHOUTcast audio stream program.)
- File server > Using your server as a file server is a good choice if you have low bandwidth.
- Audio and video content > Hosting audio and video content should be considered only if you have good bandwidth, because this stuff soaks up more bandwidth than the others (game servers also need more bandwidth depending on the game type).
- Web page hosting > This is also a good choice if you are ready to spend some extra bucks for more space and faster processors. More space is needed because you have to provide it to your clients in return for some cash.

BASICS OF FORWARDING PORTS:
The most important thing you need to do after you've set up your server is to forward the proper ports to it. As I have found, the best place to learn about forwarding ports is http://www.portforward.com/routers.htm. Find your router there and follow its guide.

UNDERSTANDING IP ADDRESSES

An IP ADDRESS is the address of a computer on a network, which means a computer is recognized through its IP ADDRESS.

What is IP?
An INTERNET PROTOCOL (IP) address is a unique number assigned to each computer on a network. It is this unique address that represents the system on the network. Generally, the IP of a particular system changes each time you log on to the network by dialing in to your ISP (Internet Service Provider), and it is assigned to you by your ISP. The IP of a system that is always on the network generally remains the same.

Let's take the example of the following IP address: 209.144.49.110. The leading part of the address is the Network Number or Network Prefix, which identifies the network the host is on, and the remaining part is the Host Number, which identifies the host within that network. Hosts on the same network share the same network number. How many of the four fields belong to the network prefix depends on the address class; for this particular address (Class C, as shown below) the network prefix is 209.144.49 and the host number is 110. In order to provide flexibility in the size of networks, there are different classes of IP addresses:



Address Class Dotted Decimal Notation Ranges:

Class A ( /8 Prefixes) 1.xxx.xxx.xxx through 126.xxx.xxx.xxx

Class B ( /16 Prefixes) 128.0.xxx.xxx through 191.255.xxx.xxx

Class C ( /24 Prefixes) 192.0.0.xxx through 223.255.255.xxx

These classes will be clearer after reading the next few lines.

Each Class A network address contains an 8-bit network prefix followed by a 24-bit host number. They are considered to be primitive, and are referred to as "/8s" or just "8s" as they have an 8-bit network prefix.

In a Class B network address there is a 16-bit network prefix followed by a 16-bit host number. These are referred to as "/16s".

A Class C network address contains a 24-bit network prefix and an 8-bit host number. These are referred to as "/24s" and are commonly used by most ISPs.


Due to the growing size of the Internet, network administrators faced many problems. The Internet routing tables were beginning to grow, and administrators had to request another network number from the Internet authorities before a new network could be installed at their site. This is where sub-netting came in.

Now if your ISP is a big one and provides you with dynamic IP addresses, then you will most probably see that whenever you log on to the net, your IP address has the same first 24 bits and only the last 8 bits keep changing. This is due to the fact that when sub-netting comes in, the IP address structure becomes:

xxx.xxx.zzz.yyy

where the first two parts are the network prefix, zzz is the subnet number, and yyy is the host number. So you are always connected to the same subnet within the same network. As a result, the first three parts will remain the same and only the last part, i.e. yyy, is variable.
For example, if an ISP xyz is given the network address 203.98.12.xx, then you can be awarded any IP whose first three fields are 203.98.12.

So, basically this means that each ISP has a particular range in which to allocate all its subscribers. Or in other words, all subscribers or all people connected to the internet using the same ISP, will have to be in this range. This in effect would mean that all people using the same ISP are likely to have the same first three fields of their IP Addresses.

This means that if you have done a lot of research, you could figure out which ISP a person is using simply by looking at their IP.
For example, say in a country there are three main ISPs:

ISP Name - Network Address Allotted
ISP I - 203.94.47.xx
ISP II - 202.92.12.xx
ISP III - 203.91.35.xx

Now if anyone sends me a mail from 203.94.47.23, I can easily tell that he or she belongs to ISP I, and roughly where he or she lives. I can tell that because I have done research from various sources; you would have to do the same.