How to Build a Computer and Component Selection

I get asked the same question very often: how do you build a computer? Glancing into an open computer can be intimidating to the average person. There are different “pieces” to be found and quite a few wires running from one part to the next. All of this can be quite confusing and discouraging. The best advice I can give is that it looks more complicated than it actually is. Don’t be afraid to do some trial and error to figure things out, within reason. Of course you have to be careful not to force components into spots where they do not belong, but it is difficult to mix things up, as most parts can only go one way and will only fit in the correct location. Trial and error is how I learned most of what I do today. When something goes wrong, there is always a way to fix it; sometimes it just takes some patience and research to figure out the problem.

I started building computers around 1996, when I was ten years old, and it became a hobby of mine which led into a business about 16 months ago. Over the years there has been a lot to learn and pick up on, and that is the thing with technology and computers. Computers are always changing and updating for more performance, reliability, smaller size, and ease of use, and in more recent years for lower energy consumption to be “greener” for the environment.

But that is enough with the history, it is now time to move onto the actual computer building process from start to finish. There is quite a bit to cover and there are many ways to go about the process, but I will share my personal views and opinions along the way.

To start things off, you have to ask yourself what you want the computer for. It could be a basic machine for simple web browsing, such as Facebook and e-mail. Another need could be a media center: a computer hooked up to an entertainment center full time for watching movies, playing music, recording, and internet television. The machine may be used primarily for gaming. A gaming computer can be a touchy subject, as everyone’s views are different. Some may be happy playing a game on lower settings, while others want everything turned up to the max with room to spare for future game titles. The final use I will touch on is photo and video editing. A high-end gaming computer and a photo/video editing machine will often have many similarities. You do not necessarily need an extremely powerful system for videos and photos, but it will certainly cut down on the time required; if someone is looking to produce lengthy videos, it could take ages to accomplish on a less powerful computer. One thing I will say: no matter what you are building the computer for, figure out a budget of available funds and go from there. There is no sense in looking at very costly premium components when there is no budget for them. Many would be quite surprised at how inexpensive a fairly powerful system can be nowadays. I never recommend buying the latest and greatest, as it will cost a premium and will be replaced by something better in around six months’ time. That is just how the computer world works.

Once the purpose of the computer and a budget are planned out, there are a handful of main components required to assemble a fully functional computer. These core components include:

The chassis which houses and protects all of the components,

The power supply (PSU) which supplies power to the computer from the wall,

Motherboard which is the central location for all the components to communicate with each other,

Graphics card, which is responsible for putting an image on your computer screen that you can see and interact with,

Processor (CPU) which functions as the brains of the operation, calculating millions of operations every second,

Memory (RAM) which stores temporary information calculated by the processor for fast access,

Hard drive, or hard disk which is the permanent storage device, holding all of the user’s data and programs,

Removable storage such as CD/DVD/Blu-Ray readers and burners, USB drives and other storage devices.

The first component we will start with is the motherboard. This is the central location where everything plugs in. The motherboard is full of “highways” which pass data between all the components. There are many kinds of motherboards available; most commonly there are AMD- and Intel-based boards. An AMD-based motherboard must be matched with an AMD processor, and likewise an Intel-based motherboard with an Intel processor. There are also different form factors, or sizes, available, which must be matched with a proper case for a proper fit. Most common are Micro ATX, ATX, Extended ATX, and more recently Mini ITX, which is very small for when space is extremely limited, such as in an entertainment center. There are other sizes available, but these are the most common platforms I deal with. The computer case chosen must be compatible with the size of the motherboard: if the motherboard is an ATX form factor, the case has to support an ATX size. Another main factor to take into consideration is the type of socket. The processor (CPU) mounts onto the motherboard, and the sockets must match. Both AMD and Intel have their own sockets and naming schemes. For example, a modern Intel machine may use socket 1155, and an AMD machine may use an AM3 socket. So if the motherboard is an 1155 board, the CPU must also be an 1155 part. Just be sure to do your research and make sure that the chosen CPU and motherboard are compatible with each other. It is worth noting that a CPU usually cannot be returned after purchase unless it is defective, so pairing up an incorrect CPU with a motherboard would not be a good mistake to make.

The processor, or CPU, is a small chip built on silicon wafers that performs millions of calculations extremely fast. In the computer world, everything is binary: it is all made up of ones and zeros, and different combinations mean different things. The CPU is usually referred to as the brains of the computer. On modern Intel processors, the contact side contains a large number of pads which make contact with pins in the motherboard socket. AMD is the opposite, having the pins on the CPU itself and the pads on the motherboard. Quite a few years ago, Intel also had the pins on the CPU, but that changed over the years. I remember having bent pins and straightening them out with a tiny pointed object. That would be very hard to do nowadays, as there can be in excess of 1,000 pins in close proximity to each other.
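To make the “ones and zeros” idea concrete, here is a tiny Python sketch (the numbers are arbitrary examples) showing how ordinary decimal values look as 8-bit binary patterns, the only form a CPU ever really works with:

```python
# Every value a CPU handles is ultimately a pattern of bits (ones and zeros).
for n in [5, 42, 255]:
    print(n, "in binary is", format(n, "08b"))
# 5 in binary is 00000101
# 42 in binary is 00101010
# 255 in binary is 11111111
```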

A processor makes a significant amount of heat and needs something to keep its temperature under control. Otherwise, under heavy work a CPU would “burn” itself up in a matter of seconds and become a paperweight. In most cases a heat sink and a cooling fan accomplish this task. A thin layer of thermal compound is applied between the surface of the processor and the heat sink to transfer heat efficiently. A heat sink is usually made up of several cooling fins which are then cooled by a fan blowing air across them. Many retail boxed AMD and Intel processors come with a stock cooling solution which is sufficient for the average user; the manufacturer will not bundle a cooler with its product that is not adequate. An aftermarket cooler may be chosen for less fan noise, or by those who like to push their computers past the manufacturer’s settings, which creates more heat and calls for a higher-performing cooling unit.

In more extreme cases, there may be liquid cooling, sometimes called water cooling. Technically, plain water should not be used, as it is conductive and will cause corrosion over time without proper additives. In a water cooling setup, there is still a block attached to the top of the processor, which allows liquid to pass through its inside and keep the CPU cool. There is then a radiator, just like in an automobile, with one or more fans to cool the liquid as a pump circulates it through the radiator.

Next, we will move on to the memory (RAM). RAM stands for random access memory. It is used to temporarily store data, and it loses everything it holds when power is lost. Memory is very fast and works closely with the processor to quickly pass data back and forth to be calculated. Memory comes in different forms as well. On most modern computers, the type of memory required is DDR3. Memory also comes in different speeds, which must be matched up with the chosen motherboard, just like the processor. Memory comes in various capacities, and different memory kits have varying numbers of modules and capacities. I would recommend either 8GB or 16GB of memory, as it is at a very good price point this day and age. For a memory-hungry video editing/photo editing machine, 16GB would come in very handy, or possibly even 32GB if going all out!

For the graphics card there are several possibilities. Many times the chosen motherboard/CPU combination will have integrated on-board video, which means the graphics hardware is already built into the computer and no additional hardware is needed. In some cases, a discrete graphics card can be used in conjunction with the on-board video to further performance. In other situations there may be no integrated video at all, and a graphics card is required for the computer to function. Current graphics cards occupy a PCI Express (PCIe) slot. AMD and their A-series processors are currently a great solution for integrated graphics. They perform very well and are plenty for a general use computer and light-duty gaming. For higher end gaming a dedicated graphics card will be needed, and if you want some bragging rights, many modern graphics cards can be paired up to work with each other: sometimes two, three, or even four graphics cards for extreme gaming performance. Most likely, though, if you are reading this you would not be looking into that, as it is fairly advanced and requires a fair amount of knowledge to master.

Next, we will move on to the power supply, which is responsible for taking the AC (alternating current) power from the outlet in the wall and converting it into DC (direct current). The power supply, or PSU (power supply unit), is another crucial component; without it, there is no juice to get the computer running. Power supplies come in different wattages and efficiency ratings. Recently the power requirements of computers have gone way down from what they were a couple of years ago. What once required a 1200 watt power supply can be accomplished with an 800 watt power supply on new hardware. Of course that is just an estimate I threw out there, but you get the idea. Many general use computers will be just fine somewhere along the lines of a 500 watt unit. It is always good to go a little bigger just to allow for expansion in the future. Depending on the chosen components, various amounts of power will be required. It is always best practice to get a decent, quality unit and not the cheapest one available. A cheap unit can actually cause problems down the road with “unclean” power and may not last very long. A quality power supply should last for many years and may be reused in a future build as well. In most cases a standard ATX power supply with a 24-pin main power connector will do the job. There are other cables to account for as well, such as SATA, Molex, and 4/8-pin EPS connectors, which supply modern motherboards with extra power that the 24-pin connector cannot provide.
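As a rough illustration of sizing a unit with headroom, the sketch below adds up some hypothetical per-component wattage figures (placeholders only; the real numbers come from each component’s specifications) and pads the total by about 30% for future expansion:

```python
# Hypothetical wattage figures for illustration only; check the actual
# specifications of your chosen parts.
component_draw = {
    "CPU": 95,                 # watts under load
    "Graphics card": 150,
    "Motherboard and RAM": 50,
    "Drives and fans": 40,
}

total = sum(component_draw.values())   # estimated peak draw
recommended = int(total * 1.3)         # ~30% headroom for expansion

print(f"Estimated draw: {total} W; suggested PSU: at least {recommended} W")
# Estimated draw: 335 W; suggested PSU: at least 435 W
```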

Moving on to the chassis that houses the computer, there are many possibilities to choose from, with many designs and sizes to take into consideration. Some may have an acrylic window on the side to show off the components once they are inside. As noted above with motherboards, the case has to support the correct motherboard form factor, be it ATX, Extended ATX, or whatever else it may be. Some cases may be plain and simple for a nice clean look, while others may be futuristic in design with flashy lights. It all comes down to personal preference and what the case has to offer. Good airflow is key to keeping all of the components cool and quiet. Cases can be customized with fans of many different sizes featuring different airflows and noise levels, again depending on personal preference. A case will last a very long time and can be reused across multiple computer builds. I prefer a full tower chassis for my personal computers, as it allows plenty of space inside and upgradeability to last for years to come.

Moving on to the hard drives, there are a couple of different possibilities. This is the device that stores all the data and programs even when the power is turned off, unlike memory (RAM). Many people confuse memory and hard drives when looking into computer purchases; they are not the same thing and come in completely different capacities. There are the traditional mechanical hard drives, which are very cheap nowadays. Their cost per gigabyte is extremely low, and they work great for providing large amounts of storage at a very good price. More recently we have affordable SSDs (solid state drives), which have no mechanical moving parts and are much faster and more responsive than a traditional mechanical drive. An SSD makes for a much snappier system and is one of the best upgrades many modern computers can receive. Computers are now so fast that traditional mechanical drives act as a bottleneck in many cases: the system has to pause while it waits for the drive to gather its data and send it out. With an SSD this process is considerably faster, resulting in a much faster overall system. I tell many first-time SSD users that they will be greatly amazed at the difference in the responsiveness of their computer after swapping out a mechanical drive for an SSD. The drawback to an SSD is that it comes in much smaller capacities and the price per gigabyte is much higher, even though it continues to drop. To get the best of both worlds, an SSD can be used to hold the operating system, such as Windows, as well as frequently used programs. A mechanical drive can then be used for less-used programs, backups, and large files that would otherwise take up far too much space on an SSD.
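The cost-per-gigabyte point is easy to check with a little arithmetic; the prices below are hypothetical and only meant to show the calculation:

```python
# Hypothetical drive prices, purely to illustrate cost per gigabyte.
drives = {
    "1 TB mechanical drive": (1000, 60.00),  # (capacity in GB, price in $)
    "250 GB SSD": (250, 120.00),
}

for name, (capacity_gb, price) in drives.items():
    print(f"{name}: ${price / capacity_gb:.2f} per GB")
# 1 TB mechanical drive: $0.06 per GB
# 250 GB SSD: $0.48 per GB
```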

In my opinion, CD and DVD drives are starting to become a thing of the past. They like to fail, developing read and write errors after a while, and can be unreliable at times. A perfectly functional computer can be built nowadays without an optical drive installed at all. Just about anything you could possibly need can be downloaded from the internet or installed from a USB device such as a thumb drive. These devices are much faster and more reliable, not to mention they can be erased and rewritten over and over again with different data and applications. Sure, there are rewritable CD/DVD/Blu-Ray drives, but in my opinion they are just not nearly as practical or convenient as non-mechanical solutions. Personally, I have a computer with all my programs and data on it, which can be accessed over the network, where I can then pull all that information onto another computer and install and run those programs; no CDs or thumb drives are even needed. There are also a lot of external storage options to choose from: external USB/eSATA hard drives can be a great solution for performing backups or moving data from one computer to another.

Well, there you have it: a rundown of the components inside a computer and the role each one plays. This is in no way an all-inclusive list, but it does cover the main components needed to get a fully functional system that will satisfy most of the population. Purchasing your own computer parts and putting everything together gives a certain sense of pride and ownership, knowing that the system was assembled with your own hands. It is always nice to know what each component is and what its job is in the computing cycle. One big bonus of a custom-built computer, versus one from the big companies, is that you don’t have to deal with all the so-called “junk” they load them with, such as a whole bunch of trial versions and other software which is really not needed and hogs resources. Taking the DIY route, you decide what goes on the computer and what does not, which is a big plus in my opinion!

Article Source: http://EzineArticles.com/expert/Scott_A_H/1599528

Article Source: http://EzineArticles.com/7752633


Troubleshooting a Broken Computer

When computers fail we are helpless. Sometimes the failure is due to a virus attack. However, other hardware failures could be the source of the problem. With new computers there are some simple hardware component replacements that fix ailing computers. These are simple replacements that literally anyone can do with a screwdriver and patience. Most often the failed component to replace is the power supply, the hard disk drive, or a failed CD or DVD drive. This article helps you determine the source of the computer failure so that if it is a simple problem, you could repair your computer. Alternatively, you would know when to have a professional help you repair your computer after the problem is identified.

Let us start from scratch. In this case, we have our malfunctioning computer powered off. The first step is to power it on and observe. When the power button is pressed, do lights come on in the computer? On both tower computers and laptop computers there are lights that come on when power is applied. A good idea is to take a picture of your computer when it is running properly, so it helps you remember how it looks, which lights are lit, and what color the lights are. Good power to the computer is often indicated by a green or blue light. Orange lights indicate a malfunction, though they do show that the computer has power somewhere.

When there are no lights, the power supply is likely the source of the failure. A laptop’s external transformer can be tested and replaced; replacements run under $100, and there are aftermarket options. The next test, for both a laptop and a desktop computer, is to use a new external laptop transformer or a new power supply in place of the suspected failed one.

Power supplies for desktop computers are available online and from local stores. They also cost around $100. A new supply does not need to be installed in the computer to test it. Just lay the computer on its side with the chassis open, place the new power supply on top of the old power supply, disconnect the old power supply one connector at a time, and connect the equivalent connector from the new supply into the computer one connector at a time. Once the new power supply is connected, try powering on the computer. This tests the power supply.

A new power supply that provides more watts is fine. This means a 300 W power supply can be replaced by a 500 W power supply. It is best not to reduce power supply wattage (replacing a 500 W power supply with a 300 W one), but it is OK to increase it. Power connections to floppy disk drives can easily short out the power supply when the connector is not plugged in precisely right. If the computer does not turn on, disconnect the floppy drive power and try again with it disconnected. Finally, all power supply connections are keyed and have a connector clip. When connecting the power connectors, make sure the connector clip is lined up correctly.

If the new transformer does not fix the laptop’s power, then you can return the laptop to the manufacturer for repairs or buy a new laptop. If you buy a new laptop, the data can be removed from the old laptop’s hard disk drive and moved to the new laptop’s hard disk drive. With a desktop PC, just unbolt the old power supply and bolt in the new one. Replacing the power supply in a computer is cleaner and easier than replacing the spark plugs in a car.

Now let us return to our troubleshooting. The next possible error indication is that the computer does not boot and there is no display. Monitors have a light on the bottom right. When this light is green, the computer is sending a video signal to the monitor. If the light is orange or amber, then the computer is not sending a video signal to the monitor.

When there is no video signal sent to the monitor, it indicates that the problem resides in the computer hardware. This suggests that we look inside the computer itself and check the Main Logic Board (MLB) capacitors (these are round, tower-like components that stick up from the MLB). Main Logic Boards often fail when they are five years old or more due to the capacitors on the MLB failing. The capacitors burst, causing a complete failure of the MLB. When this occurs, the solution is purchasing a new computer.

When a computer is first powered on, it typically displays the BIOS setup information prior to attempting to start Windows. This information should flash briefly on the monitor as the computer starts. When it does, it shows that the computer itself and the display are operating properly. After this display, Windows may boot to a black or blue screen; the blue screen is sometimes referred to as the “blue screen of death”. In either case, Windows does not start. This error points to a failed hard disk drive. It typically means that the computer is working fine, but the disk drive has failed to provide the information the computer needs to start Windows. Because you cannot boot into Windows, there is no way to test the disk drive in place. In this event, the disk drive must be removed from the computer and attached to another Windows computer for testing. The disk drive test command in Windows is CHKDSK /R. When this command is run in Windows, it tests the disk drive, corrects any data corruption on the disk drive, and determines whether there are physical errors on the drive.

Physical drive errors are indicated by any number greater than zero in the bad sectors line of the test results report. When physical errors or bad sectors are reported, it means that your disk drive has cancer. While the drive is not dead, it should be replaced immediately and the data copied from it to a new drive. Continued use of a drive with bad sectors risks losing all the data on the drive. The difficult part of replacing a hard disk drive is copying all the data from the old drive to the new drive. There are programs that permit imaging the entire hard disk drive and then writing that image onto a new hard disk drive. If an image is successfully created and then copied to a new disk drive, the Windows computer often returns to normal operation as though nothing had happened.

In some cases, when the Windows data is placed on a new disk drive, Windows still fails to start. In this event, installing Windows over Windows (for Windows XP) or installing a fresh copy of Windows (for Windows 7) typically fixes the problem. To install a completely fresh copy of Windows in either case (Windows XP or Windows 7) while preserving the data on the disk drive, it is only necessary to delete the Windows folder from the drive on which the fresh copy of Windows is to be installed. It is also a good idea to rename the Documents and Settings or Users folders so that the data contained in them is preserved.

This completes the basic PC hardware troubleshooting procedure. When a computer boots into Windows and still has problems, it is likely a software issue. Software issues are commonly resolved by reinstalling Windows or by removing viruses and little-used software from the computer. A complete procedure for removing viruses and spyware is beyond what I can present in this article; a detailed virus and spyware removal procedure is covered in my “Pete The Nerd’s Do It Yourself Virus Removal” book. The goal of this article was to get you started in troubleshooting your PC and to give you some idea of the effective next steps to pursue so that you may return your PC to normal operation.

Thank you for your time.

Pete the Nerd
“Your Friend on a Technically Challenged Planet©”

©P D Moulton

Pete is the original Dial-A-Nerd (http://www.DialANerd.com). Advertised in the 1990 USA Today classifieds, the Dial-A-Nerd concept was created in the late 1980s to provide computer help over the telephone. Dial-A-Nerd became the Dial-A-Nerd radio show on WJFK Radio and then the Technically Correct TV show on WMAR ABC Channel 2 in Baltimore.

Pete has worked on computers since before the earliest days of personal computers. In his early years working in data communications he personally met some pioneers of the Internet, but he never met Al Gore.

Pete wrote several books for Prentice-Hall Publishers including “A+ Certification and PC Repair Guide”, “The Telecommunications Survival Guide”, and “SOHO Networking.”

Pete’s PC support and troubleshooting experience comes from building and supporting PCs, and training non-technical users to maintain and troubleshoot PCs over the last 30 years. His work continues today and has led to writing and publishing “Pete the Nerd’s Do It Yourself Virus Removal” at Amazon.com.



Beginner’s Guide to Computer Forensics

Computer forensics is the practice of collecting, analysing and reporting on digital information in a way that is legally admissible. It can be used in the detection and prevention of crime and in any dispute where evidence is stored digitally. Computer forensics has comparable examination stages to other forensic disciplines and faces similar issues.

About this guide
This guide discusses computer forensics from a neutral perspective. It is not linked to particular legislation or intended to promote a particular company or product, and it is not written with a bias toward either law enforcement or commercial computer forensics. It is aimed at a non-technical audience and provides a high-level view of computer forensics. This guide uses the term “computer”, but the concepts apply to any device capable of storing digital information. Where methodologies have been mentioned, they are provided as examples only and do not constitute recommendations or advice. Copying and publishing the whole or part of this article is licensed solely under the terms of the Creative Commons Attribution Non-Commercial 3.0 license.

Uses of computer forensics
There are few areas of crime or dispute where computer forensics cannot be applied. Law enforcement agencies have been among the earliest and heaviest users of computer forensics and consequently have often been at the forefront of developments in the field. Computers may constitute a ‘scene of a crime’, for example with hacking [1] or denial of service attacks [2], or they may hold evidence in the form of emails, internet history, documents or other files relevant to crimes such as murder, kidnap, fraud and drug trafficking. It is not just the content of emails, documents and other files which may be of interest to investigators but also the ‘meta-data’ [3] associated with those files. A computer forensic examination may reveal when a document first appeared on a computer, when it was last edited, when it was last saved or printed and which user carried out these actions.
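As a small taste of what such metadata looks like, the Python sketch below reads the size and file-system timestamps of a file using only the standard library. (This shows only operating-system metadata; forensic tools also examine metadata embedded inside documents themselves, and an examiner would never run a script like this directly on original evidence.)

```python
import os
import time

# Create a small sample file, then read back its file-system metadata.
path = "example.txt"
with open(path, "w") as f:
    f.write("hello")

info = os.stat(path)
print("Size in bytes:", info.st_size)              # 5
print("Last modified:", time.ctime(info.st_mtime))

os.remove(path)  # clean up the sample file
```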

More recently, commercial organisations have used computer forensics to their benefit in a variety of cases, such as:

Intellectual Property theft
Industrial espionage
Employment disputes
Fraud investigations
Matrimonial issues
Bankruptcy investigations
Inappropriate email and internet use in the work place
Regulatory compliance
For evidence to be admissible it must be reliable and not prejudicial, meaning that at all stages of this process admissibility should be at the forefront of a computer forensic examiner’s mind. One set of guidelines which has been widely accepted to assist in this is the Association of Chief Police Officers Good Practice Guide for Computer Based Electronic Evidence or ACPO Guide for short. Although the ACPO Guide is aimed at United Kingdom law enforcement its main principles are applicable to all computer forensics in whatever legislature. The four main principles from this guide have been reproduced below (with references to law enforcement removed):

No action should change data held on a computer or storage media which may be subsequently relied upon in court.

In circumstances where a person finds it necessary to access original data held on a computer or storage media, that person must be competent to do so and be able to give evidence explaining the relevance and the implications of their actions.

An audit trail or other record of all processes applied to computer-based electronic evidence should be created and preserved. An independent third-party should be able to examine those processes and achieve the same result.

The person in charge of the investigation has overall responsibility for ensuring that the law and these principles are adhered to.
In summary, no changes should be made to the original; however, if access or changes are necessary, the examiner must know what they are doing and must record their actions.

Live acquisition
Principle 2 above may raise the question: in what situation would changes to a suspect’s computer by a computer forensic examiner be necessary? Traditionally, the computer forensic examiner would make a copy of (or acquire) information from a device which is turned off. A write-blocker [4] would be used to make an exact bit-for-bit copy [5] of the original storage medium. The examiner would then work from this copy, leaving the original demonstrably unchanged.
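The usual way to demonstrate that a copy really is bit-for-bit identical is to compute a cryptographic hash of both the original and the copy and show that they match. Here is a minimal Python sketch of that idea; the two small stand-in files are created just for the demonstration, whereas real acquisitions use dedicated imaging tools and hardware write-blockers:

```python
import hashlib

def file_hash(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash a file in chunks so large disk images need not fit in memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in "images" created only for this demonstration.
with open("original.img", "wb") as f:
    f.write(b"\x00\x01\x02" * 1000)
with open("copy.img", "wb") as f:
    f.write(b"\x00\x01\x02" * 1000)

print("Copy verified:", file_hash("original.img") == file_hash("copy.img"))
# Copy verified: True
```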

However, sometimes it is not possible or desirable to switch a computer off. It may not be possible to switch a computer off if doing so would result in considerable financial or other loss for the owner. It may not be desirable to switch a computer off if doing so would mean that potentially valuable evidence may be lost. In both these circumstances the computer forensic examiner would need to carry out a ‘live acquisition’ which would involve running a small program on the suspect computer in order to copy (or acquire) the data to the examiner’s hard drive.

By running such a program and attaching a destination drive to the suspect computer, the examiner will make changes and/or additions to the state of the computer which were not present before his actions. Such actions would remain admissible as long as the examiner recorded their actions, was aware of their impact and was able to explain their actions.

Stages of an examination
For the purposes of this article the computer forensic examination process has been divided into six stages. Although they are presented in their usual chronological order, it is necessary during an examination to be flexible. For example, during the analysis stage the examiner may find a new lead which would warrant further computers being examined and would mean a return to the evaluation stage.

Forensic readiness is an important and occasionally overlooked stage in the examination process. In commercial computer forensics it can include educating clients about system preparedness; for example, forensic examinations will provide stronger evidence if a server or computer’s built-in auditing and logging systems are all switched on. For examiners there are many areas where prior organisation can help, including training, regular testing and verification of software and equipment, familiarity with legislation, dealing with unexpected issues (e.g., what to do if child pornography is present during a commercial job) and ensuring that your on-site acquisition kit is complete and in working order.

The evaluation stage includes the receiving of clear instructions, risk analysis and allocation of roles and resources. Risk analysis for law enforcement may include an assessment on the likelihood of physical threat on entering a suspect’s property and how best to deal with it. Commercial organisations also need to be aware of health and safety issues, while their evaluation would also cover reputational and financial risks on accepting a particular project.

The main part of the collection stage, acquisition, has been introduced above. If acquisition is to be carried out on-site rather than in a computer forensic laboratory then this stage would include identifying, securing and documenting the scene. Interviews or meetings with personnel who may hold information which could be relevant to the examination (which could include the end users of the computer, and the manager and person responsible for providing computer services) would usually be carried out at this stage. The ‘bagging and tagging’ audit trail would start here by sealing any materials in unique tamper-evident bags. Consideration also needs to be given to securely and safely transporting the material to the examiner’s laboratory.

Analysis depends on the specifics of each job. The examiner usually provides feedback to the client during analysis and from this dialogue the analysis may take a different path or be narrowed to specific areas. Analysis must be accurate, thorough, impartial, recorded, repeatable and completed within the time-scales available and resources allocated. There are myriad tools available for computer forensics analysis. It is our opinion that the examiner should use any tool they feel comfortable with as long as they can justify their choice. The main requirement of a computer forensic tool is that it does what it is meant to do, and the only way for examiners to be sure of this is for them to regularly test and calibrate the tools they use before analysis takes place. Dual-tool verification can confirm result integrity during analysis (if with tool ‘A’ the examiner finds artefact ‘X’ at location ‘Y’, then tool ‘B’ should replicate these results).
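The dual-tool check described above can be sketched in a few lines. The artefact names and locations below are invented purely for illustration; real forensic tools produce far richer output than a simple name-to-location mapping:

```python
def cross_verify(results_tool_a: dict, results_tool_b: dict) -> list:
    """Return the artefacts whose findings differ between two tools."""
    discrepancies = []
    # Artefacts tool A found: tool B must report the same location.
    for artefact, location in results_tool_a.items():
        if results_tool_b.get(artefact) != location:
            discrepancies.append(artefact)
    # Artefacts only tool B found also need investigating.
    for artefact in results_tool_b:
        if artefact not in results_tool_a:
            discrepancies.append(artefact)
    return discrepancies

# Hypothetical results from two independently run tools:
tool_a = {"deleted_email.eml": "sector 90210", "chat_log.db": "sector 1234"}
tool_b = {"deleted_email.eml": "sector 90210", "chat_log.db": "sector 1234"}
print(cross_verify(tool_a, tool_b))  # prints [] - the tools agree
```

Any non-empty result flags an artefact the examiner needs to re-examine before relying on it as evidence.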

The presentation stage usually involves the examiner producing a structured report on their findings, addressing the points in the initial instructions along with any subsequent instructions. It would also cover any other information which the examiner deems relevant to the investigation. The report must be written with the end reader in mind; in many cases the reader of the report will be non-technical, so the terminology should acknowledge this. The examiner should also be prepared to participate in meetings or telephone conferences to discuss and elaborate on the report.

Along with the readiness stage, the review stage is often overlooked or disregarded. This may be due to the perceived costs of doing work that is not billable, or the need ‘to get on with the next job’. However, a review stage incorporated into each examination can help save money and raise the level of quality by making future examinations more efficient and time effective. A review of an examination can be simple, quick and can begin during any of the above stages. It may include a basic ‘what went wrong and how can this be improved’ and a ‘what went well and how can it be incorporated into future examinations’. Feedback from the instructing party should also be sought. Any lessons learnt from this stage should be applied to the next examination and fed into the readiness stage.

Issues facing computer forensics
The issues facing computer forensics examiners can be broken down into three broad categories: technical, legal and administrative.

Technical issues

Encryption – Encrypted files or hard drives can be impossible for investigators to view without the correct key or password. Examiners should consider that the key or password may be stored elsewhere on the computer or on another computer which the suspect has had access to. It could also reside in the volatile memory of a computer (known as RAM [6]), which is usually lost on computer shut-down; another reason to consider using live acquisition techniques as outlined above.

Increasing storage space – Storage media holds ever greater amounts of data which for the examiner means that their analysis computers need to have sufficient processing power and available storage to efficiently deal with searching and analysing enormous amounts of data.

New technologies – Computing is an ever-changing area, with new hardware, software and operating systems being constantly produced. No single computer forensic examiner can be an expert on all areas, though they may frequently be expected to analyse something which they haven’t dealt with before. In order to deal with this situation, the examiner should be prepared and able to test and experiment with the behaviour of new technologies. Networking and sharing knowledge with other computer forensic examiners is also very useful in this respect as it’s likely someone else may have already encountered the same issue.

Anti-forensics – Anti-forensics is the practice of attempting to thwart computer forensic analysis. This may include encryption, the over-writing of data to make it unrecoverable, the modification of files’ meta-data and file obfuscation (disguising files). As with encryption above, the evidence that such methods have been used may be stored elsewhere on the computer or on another computer which the suspect has had access to. In our experience, it is very rare to see anti-forensics tools used correctly and frequently enough to totally obscure either their presence or the presence of the evidence they were used to hide.

Legal issues
Legal arguments may confuse or distract from a computer examiner’s findings. An example here would be the ‘Trojan Defence’. A Trojan is a piece of computer code disguised as something benign but which has a hidden and malicious purpose. Trojans have many uses, including key-logging [7], uploading and downloading of files and installation of viruses. A lawyer may be able to argue that actions on a computer were not carried out by a user but were automated by a Trojan without the user’s knowledge; such a Trojan Defence has been successfully used even when no trace of a Trojan or other malicious code was found on the suspect’s computer. In such cases, a competent opposing lawyer, supplied with evidence from a competent computer forensic analyst, should be able to dismiss such an argument.

Administrative issues

Accepted standards – There are a plethora of standards and guidelines in computer forensics, few of which appear to be universally accepted. This is due to a number of reasons, including standard-setting bodies being tied to particular legislations, standards being aimed either at law enforcement or commercial forensics but not at both, the authors of such standards not being accepted by their peers, or high joining fees dissuading practitioners from participating.

Fitness to practice – In many jurisdictions there is no qualifying body to check the competence and integrity of computer forensics professionals. In such cases anyone may present themselves as a computer forensic expert, which may result in computer forensic examinations of questionable quality and a negative view of the profession as a whole.

Resources and further reading
There does not appear to be a great amount of material covering computer forensics which is aimed at a non-technical readership. However, the following links at the bottom of this page may prove to be of interest:

1. Hacking: modifying a computer in way which was not originally intended in order to benefit the hacker’s goals.
2. Denial of Service attack: an attempt to prevent legitimate users of a computer system from having access to that system’s information or services.
3. Meta-data: at a basic level meta-data is data about data. It can be embedded within files or stored externally in a separate file and may contain information about the file’s author, format, creation date and so on.
4. Write blocker: a hardware device or software application which prevents any data from being modified or added to the storage medium being examined.
5. Bit copy: bit is a contraction of the term ‘binary digit’ and is the fundamental unit of computing. A bit copy refers to a sequential copy of every bit on a storage medium, which includes areas of the medium ‘invisible’ to the user.
6. RAM: Random Access Memory. RAM is a computer’s temporary workspace and is volatile, which means its contents are lost when the computer is powered off.
7. Key-logging: the recording of keyboard input giving the ability to read a user’s typed passwords, emails and other confidential information.

Jonathan Krause has over eleven years’ experience in IT security and seven years’ experience in commercial and law enforcement digital forensics. He worked for the Metropolitan Police as a computer forensic analyst at the Hi-Tech Crime Unit at New Scotland Yard, and latterly, as an independent consultant, has conducted a very wide range of investigations on behalf of commercial organisations involving fraud, deception, IP theft, murder, drug trafficking and child protection cases. Jonathan set up http://forensiccontrol.com/ in 2008.



Choosing the Best Desktop Computer For You

Like a lot of people in the world today, you probably have a specific budget in mind when you buy a desktop computer. You may wonder, though, how to pick out the right computer: what size and shape you need and, with all the new technology out there, which features matter. We are here to help you with this difficult decision. Read on for some great insight into what to look for when purchasing a desktop computer.

There are four different types of PC user. Read below to find out which one you are. This will help you choose the best computer for you and your family.

General purpose user: a general purpose desktop computer is perfect for those who like to create and edit pictures, play games and surf the net. Depending on what you need, a general purpose computer can range in price from $500 to $1,500.

Power User: a power user computer is one that can be used to make and edit movies and videos. These types of computers also allow you to create digital designs and play demanding games. With these computers you will typically need two or more hard drives and a great graphics card. These computers typically run higher in price due to the power behind them and the components they include. A power user computer can range in price from $2,500 to $3,500 depending on what you will need to perform the types of work that you want to do on this computer.

Home Theater Enthusiast: Do you love movies and television? Why not get a computer that can handle all of your home theater needs? This type of computer is great for people who love to watch movies and television. Windows Media Center is included with most editions of Windows, including the new Windows 7. When considering this type of computer, always keep in mind what type of media you will be playing; this will help you to decide on the video card and how much memory and output you will need. You can also add surround sound to your home theater computer, which will make your home theater even more special. When purchasing this computer, make sure that it has the proper DVD drive, or, if you desire, you can get a computer with a Blu-ray player, allowing you maximum high definition display. This type of computer can range in price from $500 to $1,500 depending on what you need included with this entertainment model.

Home Office Worker: this is a great computer for those who work from home. With this type of computer you do not need as much graphics power as with other computers, unless you design graphics for your home office work, in which case you would want a computer with a more powerful graphics card. You will want a system with a dual-core processor so that you can multitask and get the work done that you need done. Windows 7 is a great operating system for multitasking: it allows you to have different windows open at one time and to place them side by side, which will save you time. Mac also has a great operating system for the home office. A home office computer can run in price from $500 to $2,000 depending on what you need and how you plan on using it.

There are many different features to consider when purchasing any of the aforementioned computers. The main features that you will want to consider are:

* Processor: The two most common processors are the AMD Athlon 64 X2 and the Intel Core 2 Duo. You will want a dual-core processor if you will be doing a lot of work or something that calls for a fast processor, such as burning discs or making videos and DVDs. The processor is, in its most basic form, the brains of your computer: the faster your processor, the better the performance you will receive out of your computer.

* Memory: The memory in a desktop computer varies as the technology develops. Depending on what you need, you can find a desktop that has 1 GB to 4 GB of memory. When considering the memory, take into account what you will be using your computer for: demanding tasks such as photo and video editing benefit from more memory. Having plenty of memory gives the programs that matter to you the room they need to run smoothly.

* Hard Drive: Depending on what you will need to store on your computer, you can find the perfect hard drive. A hard drive for this type of desktop can range from 250 GB to 500 GB. When choosing the hard drive, keep in mind what kind of programs you want to install. If you want to install games or photo editing software, it is best to have a higher gigabyte count. This allows you to have plenty of space for these programs as well as for other things, including music and videos.

* Operating system: When choosing your computer you will want to pick out an operating system that works well for you. You can choose Windows or Mac; both are very reliable and will give you a great computer experience. The latest version of Windows is Windows 7 and the latest version of the Mac operating system is Mac OS X Leopard. The feedback on both of these systems is great. You can also ask friends and family who have these operating systems how they feel about them, and this will help you pick out the perfect operating system for you.

* Video Card: the video card controls graphics and video output. The most typical video card has 128 MB of memory and comes from NVIDIA or ATI. With ever-changing technology, video cards keep increasing in memory size and quality. Most computers come with this feature already installed for quality graphics.

* Keyboard: Your computer would be basically useless without a keyboard. There are several different choices of keyboard. If you have trouble with your wrists, you can find an ergonomic keyboard with a wrist rest built in. If you do not want the headache of wires all over the place, you can purchase a wireless keyboard, which gives you access to your computer without the clutter of wires.

* Mouse: The mouse is a key component in using your desktop computer: it lets you click on links and scroll through pages. A wireless mouse lets you move around without the constraint of a wire. A computer can function without a mouse, but it would be very difficult to use.

* Monitor: The monitor is one of the most important parts of the computer: it allows you to see images and whatever is on your computer. You can find average size monitors, wide screen monitors and flat screen monitors. You can even find HD monitors that deliver an even clearer picture. The size of the monitor depends on what you will be doing with your desktop computer; with a larger screen you will have the ability to watch movies at television size.

* Web Camera: a web camera can be hooked up to your computer, allowing you to take images and videos and save them directly to your computer. You can also video chat with people online using Yahoo, Skype or even Facebook. It is a great way to stay in contact with friends and family. When considering a web cam purchase, check its resolution: the higher the resolution, the sharper and less pixelated the image will be.

* Size: Desktop computers have changed in size drastically over the years, and each year it seems like a smaller, more compact model is released. That is great for those of you who do not have a lot of space for a large computer; you can find a desktop in the size you need, including compact models. When looking at size, take into consideration where you will set up your computer and how much room you will have. It would not hurt to take measurements and have them with you when you purchase your computer.

* DVD/Blu-ray: If you like to watch movies, you can purchase a desktop computer that has a DVD player. This is a great way to watch your movies. When purchasing the computer, check which regions the DVD player can handle; this way you can enjoy movies from different regions around the world. Region 1 covers the United States, while Region 2 covers Europe, including the United Kingdom, so if you want a movie from across the pond you can purchase it and watch it on your desktop. If you desire high quality and high definition, you can purchase a computer that is equipped with a Blu-ray player, which is great for watching high definition movies.

* Warranty: When purchasing a computer, you may consider purchasing an extended warranty. This will give you peace of mind in case something happens to your computer. Always remember to register your warranty.

Have this guide with you when you go to purchase your new desktop. It will help you make a choice on what you need for your new desktop computer. Remember to take into consideration all of these features and what you will be using the computer for.

Article Source: http://EzineArticles.com/expert/Beth_Mccall/518255

Article Source: http://EzineArticles.com/4750015


How to Keep Your Computer Cool

When your computer is on, nearly all of its components become hot. Constant exposure to high temperature can cause serious damage to your computer.

Here is a list of ways to keep your PC cool.

Check if your fans are running.

This is the first step when you find your computer overheating. Open the case, and then check if all fans are still working. If at least one is not working anymore, consider doing repairs or getting a replacement.

Regularly clean your computer.

It is essential to regularly clean your computer, especially the cooling fans. The fans attached inside the computer case are used for active cooling of the computer. Over time, dust and dirt can accumulate in these fans. The accumulated dirt can slow down or, in the worst case, stop the fans from working. If the fans fail to expel the hot air fast enough, some internal parts will eventually overheat.

To clean your cooling fan:

1. Shut down your PC.

2. Open the computer case.

3. If there is excessive dirt inside the computer case, take out the computer fan.

4. You can use compressed air, a small electronics vacuum or duster, or a damp cloth to clean the fan.

5. If you use a damp cloth, make sure that the cooling fan is completely dry, with no remaining moisture, before connecting it again.

Clean other computer parts as well such as the monitor, mouse, and keyboard.

Before cleaning any hardware component, make sure that your machine is turned off. Otherwise, you risk electric shock, and the components are susceptible to electrostatic discharge that can damage them; ground yourself before touching any internal parts.

Before applying any cleaning procedures to hardware, make sure to check its manufacturer’s manual if they have provided you with the recommended instructions in cleaning or maintaining it.

Do not spray or spill any liquid directly on computer parts.

Do not limit the air flow around your computer.

Place your computer in a room that can provide sufficient air flow. Make sure that it is not sitting right next to objects that prevent air circulation, like walls or other computers. There should be at least two to three inches of space on both sides. Since most of the hot air comes out of the air vent at the back end of the computer case, this part should be completely clear and open.

Move your computer to a cooler and cleaner environment.

Move your PC to a place with proper ventilation. It is important that the physical location does not contribute further heat to the computer. Make sure that your PC is not placed near a furnace, refrigerator, cooking appliances, or other things that can blow hot air or transfer heat into your computer system.

To prevent your PC from overheating, it is advised to place it in an air-conditioned room.

Note: be careful when moving your computer in order to avoid damage to sensitive components inside it, like the CPU, graphics card, hard drive, and motherboard.

Use your computer with the case closed.

It may seem logical to leave the case open while the computer is running to keep it cooler, and in the short term it is. However, dirt and dust accumulate and clog the computer fans faster when the case is open. This can cause the fans to slow down or fail at cooling your computer.

Upgrade your CPU fan.

The CPU is the most important component inside the computer. When you are running demanding applications, the CPU and graphics card generate more heat, and the CPU can get hot enough to damage itself.

Consider purchasing a high-quality, larger CPU fan that can keep the CPU temperature lower than the stock CPU fan in your computer could.

Consider installing a component-specific fan.

If you have observed that the other components are overheating, install a component-specific fan to cool them down.

Consider installing a case fan.

This small fan can be attached to either the front or back of the computer case. There are two types of case fan: one that draws cooler air into the case, and one that expels warm air from the case. Installing both is a great way to cool your computer.

Turn off your computer when not in use.

A computer continues to produce heat as long as it is running, even if you are not using it. If you will be away for a while, at least set your computer to hibernate: this effectively turns the computer off while saving your open files and programs to the hard disk.

Also, unplug external hardware that is no longer in use, like printers and scanners.

Overheating can destroy and shorten the lifespan of components inside your computer. The major upside of keeping your computer cool is that it can help you avoid expensive repairs or unnecessary upgrades.




Racing Games on Your Computer

Top-quality HD graphics can really lift your spirits when playing a game. Racing games are exciting and great fun, and the best part is that players of all age groups find them comfortable to play.

The concept of bike racing is in fact an old one, but racing on a notebook or computer screen is newer and becoming more and more common. People have always found ways to satisfy their wishes and desires, and racing games are simply a byproduct of those human needs.

Cars, bikes and what-not:

Racing games are not confined to the category of bike racing; other games feature heavyweight vehicles as well. If your dream of real bike racing has not been fulfilled, you can at least have fun with online games or video games.

Call your friends and family and get started with these games, which are a worthwhile way to pass the time. Not every generation was lucky enough to have computer games, so if you are part of such a fortunate generation, passing up this chance would be ridiculous. And who says gaming has no benefits? It can definitely boost your powers of concentration and focus, as you can never let your attentiveness slip while playing.

Free trials on offer:

Many games are available as free trials, and various websites offer free online games as well. Free trials are a respectable way to learn about any game; if you have to buy a game, you can always try it first before purchasing.

Playing games online requires no special technique. Browsing the different gaming websites that lead you to different categories of games can be a bit wearisome, so simply pick one that offers a good selection of games, including ATV games, choose one and get started with it.

As for racing and racing games: simply boost up your interest and enjoy the games!


Computing Crunch Power And The Simulation Hypothesis

It has been postulated that our reality might in fact be a virtual reality. That is, some unknown agency, “The Others”, have created a computer simulation and we ‘exist’ as part of that overall simulation. One objection to that scenario is that in order to exactly simulate our Cosmos (including ourselves) we would require a computer the size of our Cosmos with the sort of crunch power that could duplicate our Cosmos on a one-to-one basis, which is absurd. The flaw in that objection is that realistic simulations can be made without resorting to a one-to-one correlation.


Here’s another thought on the Simulation Hypothesis which postulates that we ‘exist’ as a configuration of bits and bytes, not as quarks and electrons. We are virtual reality – simulated beings. Here is the “why” of things.

Really real worlds (which we presume ours to be) are simulating virtual reality worlds – lots and lots and lots of them – so the ratio of virtual reality worlds to really real worlds is lots, and lots and lots to one. That’s the main reason why we shouldn’t presume that ours is a really real world! If one postulates “The Other”, where “The Other” might be technologically advanced extraterrestrials creating their version of video games, or even the human species, the real human species from what we’d call the far future doing ancestor simulations, the odds are that our seemingly real world is actually a virtual reality world inhabited by simulated earthlings (like us).

Now an interesting aside is that we tend to assume that “The Other” are biological entities (human or extraterrestrial) who like to play “what if” games using computer hardware and software. Of course “The Other” could actually be highly advanced A.I. (artificial intelligence) with consciousness playing “what if” scenarios.


Anyway, each individual simulated world requires just so many units of crunch power. We humans have thousands of video games, each ONE requiring a certain amount of computing crunch power. In total there may be an awful lot of computing crunch power in use when it comes to these video games collectively, but what counts is the number of video games divided by the number of computers playing them. Not all video games are being played on just one computer at the same time. If you have a ten-fold increase in video games, and a ten-fold increase in the number of computers they are played on, there’s no need for ever increasing crunch power unless the nature of the game itself demands it. Video games today probably demand more crunch power than video games from twenty years ago, but to date we’ve met that requirement.

Now if a really real world created thousands of video games, and the characters in each and every one of those video games created thousands of video games and the characters in those video games created thousands of their video games, okay, then ever increasing crunch power within that original really real world is in demand. That’s not to say that that ever increasing need for crunch can’t be met however. But that’s NOT the general scenario that’s being advocated. For the immediate here and now, let’s just stick with one really real world creating thousands of uniquely individual simulated virtual reality worlds (i.e. – video games). Ockham’s Razor suggests that one not overly complicate things unnecessarily.

That said, a variation on Murphy’s Law might be: the ways and means to use computing crunch power expand to meet the crunch power available.

Sceptics seem to be assuming here that if you can simulate something, then ultimately you will pour more and more and more and more crunch power (as it becomes available) into that which you are simulating. I fail to see how that follows of necessity. If you want to create and sell a video game, if you put X crunch power into it you will get Y returns in sales, etc. If you put 10X crunch power into it, you might only get 2Y returns in sales. There is a counterbalance – the law of diminishing returns.

Video gamers may always want more, but when the crunch power of the computer and the software it can carry and process exceeds the crunch power of the human gamer (chess programs / software anyone), then there’s no point in wanting even more. A human gamer might be able to photon-torpedo a Klingon Battlecruiser going at One-Quarter Impulse Power, but a massive fleet of them at Warp Ten might be a different starship scenario entirely. Gamers play to win, not to be universally frustrated and always out performed by their game.

It makes no economic sense at all to buy and get a monthly bill for 1000 computer crunch units and only need and use 10.

But the bottom line is that computer crunch power is available for simulation exercises, as we ourselves have demonstrated. Anything else is just a matter of degree. If it is true of us, it is true of them; them of course being “The Other” or The Simulators.


Are there limits to crunch power? Well, before I get to agreeing to that, which I ultimately do, are opponents assuming that crunch power won’t take quantum leaps, perhaps even undreamed of quantum leaps, in the generations to come? I assume for starters that we in the early 21st Century don’t have enough computing power to simulate the Cosmos at a one-to-one scale. Would quantum computers alter this analysis? I’m no expert in quantum computers – I’ve just heard the hype. Still, are the crunch-power sceptics game to predict what might or might not be possible in a 100 years; in a 1000 years? The ability to increase computing crunch power could go on for a while yet. Isn’t the next innovation going from a 2-D chip to a 3-D chip?

Still, Moore’s Law (computing crunch power doubles every 18 to 24 months) can’t go on indefinitely and I wasn’t aware that I.T. people have postulated that Moore’s Law could go on “forever”. That’s a bit of a stretch.
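The arithmetic behind that doubling is easy to make concrete. A quick sketch (the 18-to-24-month doubling periods are the commonly quoted range for Moore's Law, and the time horizons chosen here are arbitrary):

```python
def moores_law_factor(years: float, doubling_months: float) -> float:
    """Crunch-power multiplier after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

for years in (10, 30):
    fast = moores_law_factor(years, 18)  # optimistic: doubling every 18 months
    slow = moores_law_factor(years, 24)  # conservative: every 24 months
    print(f"{years} years: x{slow:,.0f} to x{fast:,.0f}")
```

Even the conservative rate gives roughly a 32-fold increase per decade, which is exactly why exponential growth of this kind cannot continue "forever": the multipliers quickly become astronomical.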

Okay, even if we accept the fact that we’re all greedy and want more, more, more and even more crunch power – and ditto by implication our simulators – then there will ultimately be limits. There might be engineering limits, like dealing with heat production. There may be resolution limits. There may be technological limits, as in maybe quantum computing isn’t really feasible or even possible. There will be economic limits, as in you may want to upgrade your PC but your budget doesn’t allow for it; you ask for a new research grant to buy a new supercomputer and get turned down, and so on.

Perhaps our highly advanced simulators have hit the ultimate computer crunch power wall and that’s all she wrote; she could write no more. There’s probably a ‘speed of light’ barrier equivalent limiting computer crunch power. Then too, our simulators have competing priorities and have to divide the economic / research pie.

I’ve never read or heard about any argument that the Simulation Hypothesis assumes ever and ever and ever increasing crunch power. It assumes that the computer / software programmer has sufficient crunch power to achieve their objective, no more, no less.

In other words, the computer / software simulator is going to be as economical with the bits and bytes as possible while still remaining compatible with the degree of realism desired. That makes sense.

The bottom line is that our simulated reality just has to be good enough to fool us. In fact, if we ‘exist’ as a simulation, then from the get-go you have experienced nothing but a simulated ‘reality’ and thus you wouldn’t be able to recognize really real reality even if it clobbered you over the head!


There’s one obvious objection to those who propose that there’s not enough computer power to create 100% realistic simulations. Here realistic means a one-to-one relationship. But such a degree of realism isn’t necessary, and we might not even be able to conceive of our simulator’s really real reality, since we’ve known no other reality than the one we exist in right now. We have no other reality to compare ours to other than the realities (i.e. simulations of our reality) that we ourselves create, which of course includes our dreams and, say, films.

The degree of realism now possible with CGI is close to the actual degree of realism we experience in our everyday world, with everyday experiences. I’m sure you must have seen, over the last five years, movies that had loads of CGI embedded in them, and even while knowing that what you were seeing was CGI, you couldn’t actually tell apart the simulation (say the dinosaurs in “Jurassic World”) from what was actually real (like the actors). Still, you have little trouble telling the difference between film action, even 3-D film action, and live action.

Maybe in this reality you can tell the difference between a film and live action, but what if that live action was as simulated as the film? If you have spent your entire existence as live action virtual reality (without knowing it of course), now and again watching virtual reality film which you can distinguish from your live action virtual reality, then you can have absolutely no idea of the nature of the really real reality where our simulators reside, of the simulators themselves (although it might be a best guess to speculate that there will be a lot of similarities), or of how much crunch power they have devoted to their hobby / gaming / research (we could be a grand “what if” sociological experiment). Maybe their Moore’s Law gives them in theory 1000 units of crunch power, but they only need, or can only afford, 100 units. Just because you might be able to afford a fleet of sports cars, several yachts, a 28-bedroom mansion, a half-dozen holiday homes, a half-yearly round-the-world holiday and anything else you might want doesn’t of necessity mean you will spend that money.

Anyway, my objection to the one-on-one objection is that in a simulation, not everything has to be simulated to an exacting standard. The computing power required to make our immediate environment seem really real is vastly different from what is required to make the Universe outside of our immediate environment seem really real. I mean a planetarium does a great job of simulating all the sorts of things a planetarium simulates, but you wouldn’t claim that a planetarium requires the same amount of bits and bytes as would be required for the really real objects it is simulating. Two really real galaxies in collision would be composed of way more bits and bytes than are required by astronomers simulating two galaxies in collision on their PC. The astronomers don’t need that extra crunch power. So, perhaps 90% of our simulator’s computer power is devoted to making our immediate neighbourhood (i.e. the solar system) seem really realistic, and the other 10% simulates everything external to our immediate neighbourhood. Further, even within our solar system you don’t have to simulate each and every particle, atom and molecule that would – in a really real solar system – reside inside, say, the Sun or Jupiter or even the Earth. Things that you may think need to be computed may in fact not need to be computed in order to achieve the goal of making things seem really real to us.

In our ‘reality’, when any scientist postulates some theory or hypothesis or other, they ignore many possible variables. A biologist doing “what if” evolution scenarios probably doesn’t concern himself with each and every possible astronomical scenario that may impact on evolution at each and every possible moment. You gotta draw the line somewhere.

The only one-on-one simulation I can think of that we do would be in the realm of particle and quantum physics. Simulating two protons smashing together is about as one-on-one as you can get.


To date, when talking about our virtual reality, the Simulation Hypothesis, I’ve pretty much had in mind the idea that our programmers, The Others otherwise known as The Simulators, were monitoring us pretty much like we monitor our simulations – from a distance on a monitor. But what if The Simulators actually walk among us? That is, their simulation is more akin to a Star Trek holodeck than a standard video game.

We have always tended to immerse ourselves in virtual reality, sometimes involuntarily as in our dreams and dream-worlds, but more often than not voluntarily, from telling ghost stories around the camp-fire; to reading novels; to watching soap, horse or space operas; even just by daydreaming. In more recent times that immersion has extended to video and computer games, but usually from the outside looking in at a monitor while fiddling with a mouse or a joystick or other controls. You can sometimes quasi-immerse yourself inside virtual reality by creating an avatar, hence a virtual copy of yourself (or a make-believe copy of yourself), and interacting with other virtual people via their avatars on-line, as in “Second Life”. But what we really desire, truth be known, is to actually immerse our real selves into virtual reality scenarios.


A training simulation needs to be only as realistic as is required to train the trainee into perfecting whatever skills are required. Take a driver training simulation package. Apart from the fact that the simulation can be of merely average animation standard, the images constantly shift – the turnpike software retreats into the background as one turns off onto a country road and new software comes to the fore. The image constantly changes, and so does the software required for that image. The computer only has to crunch a fraction of the overall software at any one time.

Taking Planet Earth, the number of particles, atoms, molecules, etc. requiring simulation hasn’t changed very much over geological time. For example, there’s no need any more to simulate dinosaurs or trilobites so those bits and bytes are now freed up for other and newer species. If you have simulated Planet Earth, you haven’t needed to pour more and more and more crunch power resources into the simulation since you’re dealing with a finite object that is ever recycling those particles, atoms and molecules.

The simulators do not have to simulate each and every elementary particle in their simulation just in case one day their virtual beings (that’s us) decide to interact with elementary particles that should be there but aren’t. Their simulation software could be tweaked / upgraded as necessary as their simulation virtual reality scenario unfolds. Take Mars. Our simulators could for the longest time just use software that simulated a moving reddish dot in the sky that made strange retrograde motions (loop-the-loops) from time to time. Then the telescope scenario came to pass and the software was upgraded to show features – polar caps, areas of apparent ‘vegetation’, two moons, dust storms and of course ‘canals’. Then came Mariners 4, 6 and 7, and the simulators’ software had to be upgraded again to show close-up features from those fly-by Mariners, and again for Mariner 9, which went into orbit. Then of course came the landers, like Viking and kin, and another tweak was required. It’s all too easy.
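The Mars story above is, in programming terms, a level-of-detail scheme: serve only as much detail as the observer’s instruments can resolve, and never compute the rest. A toy sketch (the era names and detail strings are my own illustrative inventions, not any real simulation API):

```python
# A toy level-of-detail scheme in the spirit of the Mars example: the
# simulation serves only as much detail as current instruments can
# resolve, upgrading the model as observation improves.
MARS_MODELS = [
    ("naked eye",   "reddish dot with retrograde loops"),
    ("telescope",   "polar caps, dark markings, two moons, dust storms"),
    ("flyby probe", "cratered close-up imagery"),
    ("lander",      "surface-level terrain and soil detail"),
]

def render_mars(instrument_level):
    """Return the least detailed model sufficient for the instrument.

    instrument_level indexes how far observation has advanced
    (0 = naked eye). Detail beyond that level is never computed.
    """
    level = min(instrument_level, len(MARS_MODELS) - 1)
    era, detail = MARS_MODELS[level]
    return f"{era}: {detail}"

print(render_mars(0))  # naked eye: reddish dot with retrograde loops
```

For most of human history only the first entry ever needed to run; each “upgrade” is just swapping in a richer model when, and only when, the scenario demands it.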

Software past its use-by date can just be deleted – no memory required. If it is ever needed again, well that’s just another tweak or upgrade. Your memory has deleted lots of events in your life, but coming across an old letter, photograph, diary, etc. can restore what your brain didn’t feel it needed to store any more.


If I put a character, let’s call him Rob, into a video game and Rob gets zapped, no guts will appear, because I didn’t program them in. If, on the other hand, we are the simulation – characters in a video game not of our making – our guts are there in the software, but will appear if and only if the unfolding scenario requires it. The bottom line remains that not all software is front-and-centre at the same time. Further, software can be tweaked as the simulation scenario unfolds, just like we get upgrades to the software on our PCs.

As for having to simulate each and every thing that is required, like Rob’s heart, lungs, liver, etc., in any simulation only a part of the whole is active and ‘in your face’ at any one time. When the scenario demands that something else now has to be ‘in your face’ instead, well that software is available, but other software now retires to the background until and if it is needed again. In other words, not 100% of the software that comprises the entire simulation is actually front-and-centre at any one time so the computer’s ability to cope isn’t taxed beyond its means.
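That “retires to the background until and if it is needed again” is what programmers call lazy evaluation: a value is computed on first demand and not before. A minimal sketch, with a hypothetical character and organs of my own invention:

```python
# A sketch of the lazy-instantiation idea: Rob's organs are defined but
# never computed until the unfolding scenario actually demands them.
# functools.cached_property computes a value on first access and then
# stores it, which mirrors software "retiring to the background".
from functools import cached_property

class SimulatedCharacter:
    def __init__(self, name):
        self.name = name
        self.organs_computed = 0  # track how much work was actually done

    @cached_property
    def heart(self):
        self.organs_computed += 1
        return "beating heart model"

    @cached_property
    def liver(self):
        self.organs_computed += 1
        return "liver model"

rob = SimulatedCharacter("Rob")
print(rob.organs_computed)  # 0 -- nothing internal simulated yet
_ = rob.heart               # the scenario now "zaps" Rob
print(rob.organs_computed)  # 1 -- only the organ the scene needed
```

The liver never gets built unless some later scene demands it, and a second look at the heart costs nothing: the computer’s ability to cope is never taxed by the organs nobody is examining.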

I’ve said above that you do NOT have to do a one-on-one correlation between what is being simulated and the simulation. If I simulate Rob as a character in a video game I don’t have to also simulate his heart, lungs, liver, and all of his other internals. That’s a big savings in bits and bytes. So the simulated Rob is indeed simpler than any really real Rob, but the simulated Rob does the job as far as video gamers are concerned.


It’s been oft noted that if one is going to simulate one’s entire Cosmos in exacting one-on-one detail, then one would need a computer that’s as large as the Cosmos one is trying to simulate in the first place, which is ridiculous. The fallacy lies in the phrase “in exacting one-on-one detail”. A simulation doesn’t require that amount of exacting detail in order to be realistic. There’s many a sleight-of-hand short-cut that can be entered into when simulating an entire Cosmos, as in a planetarium for instance. No matter how you slice and dice things, planetariums do an excellent job of simulating the Cosmos.

Still, a Doubting Thomas keeps assuming that to simulate the Cosmos you need a one-to-one correlation, that each and every last fundamental particle in the Cosmos has to be accounted for and simulated in order to have a simulation of the Cosmos. That’s not the purpose of simulations. When cosmologists simulate the Cosmos, they are interested in the broad-brush picture. They don’t need to know about each and every fundamental particle within the Cosmos in order to understand the broad-brush picture. A simulation is NOT trying to recreate 100% of reality but only those bits and pieces that are of interest. Thus, the bits and bytes required to simulate the Cosmos as required by cosmologists need only be a tiny, tiny fraction of the bits and bytes needed to simulate 100% of the entirety of the Cosmos.

Despite any sceptical position to the contrary, our cosmologists have done simulations of our Cosmos without having to resort to simulating the Cosmos down to dotting the very last ‘I’ and crossing the very last ‘T’.

If scientists want to simulate two galaxies colliding but their research grant doesn’t give them unlimited funds for crunch power, then they make do with what their budget allows. In the case of our simulators, maybe they have maxed out their bits and bytes; maybe their expenditure has been minimal – on a shoestring budget. We don’t know. We can’t know.

I would argue that astronomers / cosmologists have not only simulated possible planetary worlds and whole virtual solar systems but the entire Universe from the Big Bang event on up the line. Of course those simulations are vastly simpler than what they are simulating but they do the job that requires doing.

Extrapolating one level up, if some agency is simulating our Cosmos, or what we perceive as our Cosmos, then that simulation is NOT meant to be a one-on-one replica of their Cosmos. To those entities, that agency, what they have simulated (our Cosmos) is easily achievable because it is NOT a one-to-one representation of their Cosmos, any more than our cosmologists try to simulate one-on-one what they believe is our Cosmos. We think our virtual reality Cosmos is the be-all-and-end-all of all there is when it’s just a tiny fraction of really real reality – our simulator’s Cosmos.

Of course in one sense we, even as simulations, are a part of The Simulators’ Cosmos, in the same way as our simulations, our virtual realities, are part of our Cosmos. We might be the same ‘stuff’, in that we are a part of The Simulators’ Cosmos too, which let us say is the Full Monty of all things A to Z. But when The Simulators simulated or built or crafted us (yes, you too), they simplified things and, say, left out all of the vowels. So yes, we ‘exist’ in their Cosmos, but in a simplified virtual reality simulation of their Cosmos. In other words, there’s no one-on-one correlation.


Now to my mind the only valid objection against the Simulation Hypothesis is that one has absolute free will; that argument would absolutely undermine the Simulation Hypothesis. The fly in the ointment is that all anyone needs to do is prove, to the satisfaction of the rest of the world, that they actually have free will, and therefore by extension that all humans have free will. Then various web sites and publishing houses could delete free will from their inventory and thus free up a massive amount of data storage space for other topics. Meantime, I could put my time, efforts and energy to better use than pondering over our possible virtual reality.


In conclusion: once upon a time, in a galaxy far, far away – well, let’s just say there existed a technologically advanced civilization whom I shall call The Simulators! Let’s also say that for The Simulators to simulate their own Big Cosmos one-on-one would require 100,000 units of computing crunch power. Alas, The Simulators only have 100 units of computing crunch power on tap, so obviously they don’t try to simulate their own Big Cosmos on a one-to-one basis – in its entirety. However, they do simulate a 100-unit computing crunch power mini-Cosmos. That’s us; that’s our mini-Cosmos, by the way. So we ‘exist’ in a simulated mini-Cosmos of 100 units of computer crunch power. We can in turn maybe manage 1 (one) unit of computing crunch power for simulations of our own (within the simulation that we already ‘exist’ in). We can no more simulate our simulated mini-Cosmos one-on-one than The Simulators can simulate their Big Cosmos one-on-one. And that’s where it all ends, at least for now. Our mini-Cosmos is a simulated mini-Cosmos, simulated by The Simulators in their Big Cosmos. There’s no one-on-one identity correlation anywhere to be had, in any Cosmos. Is everything crystal clear now?
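The arithmetic of that closing scenario, using the article’s own toy units:

```python
# The closing thought-experiment as arithmetic: each level of simulation
# runs on a small fraction of the crunch power a one-to-one copy of its
# host Cosmos would need. The figures are the article's toy numbers.
FULL_COSMOS_COST = 100_000   # units to simulate the Big Cosmos one-on-one
SIMULATORS_BUDGET = 100      # what The Simulators actually have on tap
OUR_BUDGET = 1               # what we can manage inside their simulation

# Neither level can afford a one-to-one copy of the level above it:
print(SIMULATORS_BUDGET / FULL_COSMOS_COST)  # 0.001 -- 0.1% of a full copy
print(OUR_BUDGET / SIMULATORS_BUDGET)        # 0.01  -- 1% of our mini-Cosmos
```

At every rung the same pattern repeats: the simulation is a small, affordable fraction of the reality it stands in for, never a one-on-one replica.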

Science librarian; retired.

Article Source: http://EzineArticles.com/expert/John_Prytz/784091

Article Source: http://EzineArticles.com/9343734


Since the heralding in of the World Wide Web (initially a military concept), the world as people knew it has been changed forever. Now, the internet stands for freedom: freedom of expression on an open forum, freedom to connect with people, freedom of choice from a nearly infinite list, and by no means least the freedom from delays and borders. The latter is perhaps the greatest effect the internet has had on the lives of people around the globe. Chatroulette is a video chatting service that is the epitome of these values of the Web.

Video chats are among the great successes of the Web. The ability to broadcast a live video feed from one computer to another was a breakthrough that brought about a huge change in the way people connected, surpassing the familiarity and human feel of the IM (Instant Messenger) services. Though it has the drawback of being a bandwidth hog, video chat is without a doubt the best way to interact with people. Although the Net represents freedom, most social sites require users to complete a social profile before signing onto their sites, for their own and the users’ safety. However, services like Omegle are exceptions to this general practice. These websites belong to the category of sites called social chatting sites, in which complete anonymity is maintained for the people logging on. Needless to say, Camzap and similar websites too have their own restrictions, terms and conditions.
The popularity of such websites can be attributed to the following elements:

=> Anonymity: The most significant draw of websites such as Chatroulette. People who visit social networking sites are often required to fill out very elaborate and troublesome profiles, with personal data and so on, but most folks don’t want that much paperwork to handle. In these days of blind dates and fast-tracked relationships, anonymous video chatting is the new trend among teens.

=> Privacy control: Although these are anonymous websites, the privacy of the person who is logged on, and of the person at the other end of the line, can be fully customized according to the preferences of the users. They have the option of displaying just a photo, or simply an object, or a live video stream. All these are safety precautions for the users. Most of these websites come equipped with “report abuse” buttons, which alert the admins about illegal activity or bad behaviour by any of the users. Thus, at places like Omegle a person’s privacy can be completely and conveniently controlled by means of these measures.

Fair Use – An Exception to Copyright

The term ‘Fair Use’ in the context of copyright law refers to the use of material subject to copyright protection without permission or authorization from the copyright owner, in a way that would not infringe the copyright of the owner. The rationale behind the provision of the ‘Fair Use’ concept is to allow the general public to reap the benefits of literary or artistic work in a way that would not be prejudicial to the moral and financial rights of the author. It allows the public to analyze, comment on and criticize the works, and helps in creating and sustaining a healthy environment for the growth of art and literature in society.

The Berne Convention, through its Article 10, permits the making of quotations from a work, with appropriate details as to the source from which the quotation is taken and the author of the said quotation, provided that the following conditions are met:
1. The work should have been made lawfully available to the public. This rule clarifies that only a work that has been published (made available to the public through any of the modes of communication) lawfully can be subject to fair use. The word ‘lawfully’ included in the above rule stresses the fact that fair use shall not arise from unauthorized publications.
2. The making of the quotations must be compatible with fair practice. Every country has the right to regulate the extent and purpose of the fair use of a work that is permissible, and the use of a work should be compatible with the said regulations.
3. The extent of the quotations must not exceed the extent that is justified by the purpose. There may be various reasons that justify fair use, such as criticism, comment, parody, education etc., and for each the extent of use varies. A parody may require much more use than a critique, and therefore it is essential to limit the use only to the extent of the requirement.
The UAE provides for ‘Fair Use’ through articles 22, 23 and 24 of Federal Law No. 7 of 2002 regarding Copyrights and Related Rights (hereinafter referred to as ‘the law’).
Article 22 of the law permits the use of a copyrighted work in the following ways and circumstances, provided that two conditions are satisfied, namely 1) the moral rights of the author are not prejudiced, and 2) the work has been lawfully published:
1. The reproduction of a single copy of the work for personal, non-profit and non-professional use. Works of fine or applied arts are an exception to this rule; they are subject to fair use only when they are exposed in a public place with the consent of the right owner or his successor. Architectural works are also an exception, and would be subject to Fair Use only when they permanently exist in public places.
2. The Fair Use of computer programs, applications or databases is allowed only for making a single copy, with the knowledge of the legitimate possessor, for the following purposes:

a) For a purpose that falls within the licensed purpose.

b) For the purpose of saving or substitution in case the original copy is lost, damaged or becomes unfit for use, on condition that the spare or extracted copy be destroyed when no longer necessary.

3. Reproduction of protected works for use in judicial proceedings, or their equivalent, within the limits prescribed by such procedures, with mention of the source and the name of the author.
4. Making a single copy of the work through non-profit archives, libraries or authentication offices, either directly or indirectly, in one of the two following instances:

a) Reproduction is made for the purpose of preserving the original copy, or of substituting a lost or damaged copy or one unfit for use, if it has become impossible to obtain a substitute under reasonable conditions.

b) The purpose of reproduction is the satisfaction of a request made by a physical person, to use it in a study or research, provided it is done only once and at different intervals, in case it was impossible to obtain a license for reproduction pursuant to the provisions of the present law.

5. Citations of short paragraphs, excerpts, or analyses, within the customary limits of the work, for the purpose of criticism, discussion or information, with mention of the source and the name of the author.
6. Performance of a work in meetings with family members or by pupils in an educational institution, so long as such performance has not been made for direct or indirect consideration.
7. Exhibition of works of fine, applied, plastic or architectural arts in broadcasts, if such works permanently exist in public places.
8. Reproduction, within justified and reasonable limits, of short abstracts of a work in the form of manuscripts or audio, visual, or audiovisual recordings, for the purposes of cultural or religious education, or vocational training. The name of the author and the title of the work are to be mentioned whenever possible. The reproduction should not be made for the purpose of direct or indirect profit, and is allowed only where a license for the same cannot be obtained.

Further, article 23 of the law brings under the scope of ‘Fair Use’ the reproduction of a work, in such a way and to such a limit as justifies the objective behind the reproduction, when done through newspapers, periodicals or broadcasting organizations. This provision of the law applies to the following works:
1. Extracts of works regarding current incidents that have been lawfully made available to the public. The source and author of the work must be mentioned.
2. Published articles relating to discussions of issues which have preoccupied public opinion at a certain time. Here too, the source and the name of the author of the article used must be mentioned. Articles whose publication has been prohibited cannot be used.
3. Speeches, lectures, and addresses delivered in the course of public sessions of the Parliament, judicial councils and public meetings, so long as such speeches, lectures and addresses are addressed to the public and are reproduced within the framework of reporting current news.
The law also provides that ‘Fair Use’ is allowed in using works that are protected under the title of ‘related rights’ of copyright. Thus even the owners of such related rights cannot complain of infringement if the use of the work falls within the scope of ‘Fair Use’ as provided for by the law.
Author: Mrs Jyothi Shyam, Esq.
Source : Fair Use – An Exception to Copyright