Sunday, December 21, 2008

PC - Introduction

With this post I start a series whose main aim is to explain how a Personal Computer (PC) works and which features are the most important to consider when buying a PC that suits our needs.


Introduction

A computer is composed of several physical components: the hardware. The most important ones are the microprocessor, the memory and the motherboard (which controls the input/output devices and the memory and links them to the processor). The input/output devices are called peripherals. Examples of peripherals are the hard disk, the DVD drive, the graphics card, the sound card, the Ethernet card, the modem and so on.

Elements of a current Personal Computer: Case, Screen, Keyboard and mouse


The microprocessor is the brain of the system: it executes the instructions loaded in the memory, the software. This memory, called RAM (Random Access Memory), is very fast but loses its contents when the computer is switched off. The hard disk, on the other hand, is a mass storage device (much larger than the RAM but much slower) which can store information indefinitely.

Computers only understand binary information. This language has only two symbols, 0 and 1, called bits. 0 means no electrical signal at all; 1 means some electrical signal. With eight bits we form a "word" called a byte. Each byte can represent a letter, a symbol or a number. So, when we type an 'a' on the keyboard, eight bits are generated, stored in the memory and sent to the processor. The amount of information a memory can store is measured in bytes or Megabytes (a megabyte is roughly a million bytes).
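
As a quick illustration, this short Python snippet shows how the letter 'a' becomes a byte of eight bits (assuming the classic ASCII encoding):

```python
# A minimal sketch of how a character becomes a byte (eight bits),
# assuming ASCII encoding.
char = 'a'
code = ord(char)            # numeric code of 'a' -> 97
bits = format(code, '08b')  # the same number written as eight bits
print(code)   # 97
print(bits)   # 01100001
# One megabyte is roughly a million of these bytes:
print(f"1 MB = {10**6} bytes")
```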

One kind of software is especially important: the Operating System (OS). The OS is loaded into the RAM when the computer is turned on. It manages the peripherals, tells the processor what to do at every moment and tells it where the data it needs is located.

The data needed by the processor is introduced through an input device, such as a keyboard, a mouse, a hard disk or a DVD drive. The results are shown on an output device, such as a screen or a speaker.

The microprocessor, the RAM and the peripherals are all connected to the motherboard, and all of them are linked to each other through a bus. Through the data bus, information travels from one device to another. The address bus is used by the processor to tell the memory where the next instruction or piece of data it needs is stored.
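
The following toy Python model (a deliberate simplification, not how real hardware is built) illustrates the roles of the two buses: the processor puts an address on the address bus, and the memory answers with the stored value on the data bus:

```python
# A toy model of the address bus / data bus interaction.
memory = [0x48, 0x65, 0x6C, 0x6C, 0x6F]  # five bytes stored in "RAM"

def read_from_memory(address):
    """Simulate a memory read: address bus in, data bus out."""
    return memory[address]

for addr in range(len(memory)):
    data = read_from_memory(addr)
    print(f"address bus: {addr:#04x} -> data bus: {data:#04x} ({chr(data)})")
```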


An exploded view of a modern personal computer and peripherals:




1. Scanner, 2. CPU (Microprocessor), 3. Primary storage (RAM), 4. Expansion cards (graphics cards, etc), 5. Power supply, 6. Optical disc drive, 7. Secondary storage (Hard disk), 8. Motherboard, 9. Speakers, 10. Monitor, 11. System software, 12. Application software, 13. Keyboard, 14. Mouse, 15. External hard disk, 16. Printer



The components of a computer work synchronized by a clock. The clock tells each component when it has to send information, when it has to receive information, when it has to process it, when it has to show a result and so on. In short, the clock lets the components talk to and understand each other. A clock generates a large number of electrical impulses per second, always the same number, so it has a frequency. Frequency is measured in Hertz (Hz) or Megahertz (MHz). For example, if a clock generates a million electrical impulses per second, it has a frequency of 1 MHz. The higher the frequency, the faster the computer can work.
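
A small Python sketch of the relationship between frequency and the duration of one clock cycle (the 3 GHz figure is just a modern example for comparison, not from the post itself):

```python
# Frequency and clock period: 1 MHz = one million impulses per second,
# so each impulse (clock cycle) lasts one microsecond.
frequency_hz = 1_000_000      # 1 MHz
period_s = 1 / frequency_hz   # duration of one clock cycle in seconds
print(f"{frequency_hz} Hz -> {period_s * 1e6:.1f} microseconds per cycle")

# A hypothetical 3 GHz processor for comparison:
print(f"{3e9:.0f} Hz -> {1 / 3e9 * 1e9:.2f} nanoseconds per cycle")
```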

In the next post I will try to explain in more depth the most important parts of a microprocessor and the features we have to consider in order to buy the most suitable processor for our needs.

Sunday, October 5, 2008

AEROSPACE TECHNOLOGY (III): SPACE AND COMMUNICATIONS



Communications Satellites



In 500 years, when humankind looks back at the dawn of space travel, Apollo's landing on the Moon in 1969 may be the only event remembered. At the same time, however, Lyndon B. Johnson, himself an avid promoter of the space program, felt that reconnaissance satellites alone justified every penny spent on space. Weather forecasting has undergone a revolution because of the availability of pictures from geostationary meteorological satellites--pictures we see every day on television. All of these are important aspects of the space age, but satellite communications has probably had more effect than any of the rest on the average person. Satellite communications is also the only truly commercial space technology--generating billions of dollars annually in sales of products and services.


The Billion Dollar Technology


Perhaps the first person to carefully evaluate the various technical options in satellite communications and evaluate the financial prospects was John R. Pierce of AT&T's Bell Telephone Laboratories who, in a 1954 speech and 1955 article, elaborated the utility of a communications "mirror" in space, a medium-orbit "repeater" and a 24-hour-orbit "repeater." In comparing the communications capacity of a satellite, which he estimated at 1,000 simultaneous telephone calls, and the communications capacity of the first trans-atlantic telephone cable (TAT-1), which could carry 36 simultaneous telephone calls at a cost of 30-50 million dollars, Pierce wondered if a satellite would be worth a billion dollars.
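
Pierce's back-of-envelope reasoning can be reproduced in a few lines of Python (taking $40 million, the midpoint of the quoted $30-50 million range, as an assumption):

```python
# A sketch of Pierce's comparison; the $40M figure is the midpoint
# of the $30-50M range quoted in the text.
tat1_cost = 40e6           # TAT-1 cable cost (dollars, midpoint assumption)
tat1_circuits = 36         # simultaneous calls carried by TAT-1
satellite_circuits = 1000  # Pierce's estimate for a satellite

cost_per_circuit = tat1_cost / tat1_circuits
satellite_value = cost_per_circuit * satellite_circuits
print(f"TAT-1 cost per circuit: ${cost_per_circuit:,.0f}")
print(f"Implied value of a 1,000-circuit satellite: ${satellite_value:,.0f}")
# roughly $1.1 billion -- hence "worth a billion dollars"
```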

After the 1957 launch of Sputnik I, many considered the benefits, profits, and prestige associated with satellite communications. By the middle of 1961, NASA had awarded a competitive contract to RCA to build a medium-orbit (4,000 miles high) active communication satellite (RELAY); AT&T was building its own medium-orbit satellite (TELSTAR) which NASA would launch on a cost-reimbursable basis; and NASA had awarded a sole-source contract to Hughes Aircraft Company to build a 24-hour (20,000 mile high) satellite (SYNCOM).

The Sputnik satellite, the first satellite launched into space


By 1964, two TELSTARs, two RELAYs, and two SYNCOMs had operated successfully in space. This timing was fortunate because the Communications Satellite Corporation (COMSAT), formed as a result of the Communications Satellite Act of 1962, was in the process of contracting for its first satellite. On April 6, 1965 COMSAT's first satellite, EARLY BIRD, was launched from Cape Canaveral. Global satellite communications had begun.


The Global Village: International Communications


Some glimpses of the Global Village had already been provided during experiments with TELSTAR, RELAY, and SYNCOM. These had included televising parts of the 1964 Tokyo Olympics. Although COMSAT and the initial launch vehicles and satellites were American, other countries had been involved from the beginning. AT&T had initially negotiated with its European telephone cable "partners" to build earth stations for TELSTAR experimentation. Further negotiations in 1963 and 1964 resulted in a new international organization, which would ultimately assume ownership of the satellites and responsibility for management of the global system. On August 20, 1964, agreements were signed which created the International Telecommunications Satellite Organization (INTELSAT).


Three Crew Members Capture Intelsat VI


By the end of 1965, EARLY BIRD had provided 150 telephone "half-circuits" and 80 hours of television service. The INTELSAT II series was a slightly more capable and longer-lived version of EARLY BIRD. The INTELSAT III series was the first to provide Indian Ocean coverage to complete the global network. This coverage was completed just days before one half billion people watched APOLLO 11 land on the moon on July 20, 1969.

From a few hundred telephone circuits and a handful of members in 1965, INTELSAT has grown to a present-day system with more members than the United Nations and the capability of providing hundreds of thousands of telephone circuits. Cost to carriers per circuit has gone from almost $100,000 to a few thousand dollars. Cost to consumers has gone from over $10 per minute to less than $1 per minute. If the effects of inflation are included, this is a tremendous decrease! INTELSAT provides services to the entire globe, not just the industrialized nations.


New Technology


In the early 1960s, converted intercontinental ballistic missiles (ICBMs) and intermediate range ballistic missiles (IRBMs) were used as launch vehicles. These all had a common problem: they were designed to deliver an object to the earth's surface, not to place an object in orbit. Upper stages had to be designed to provide a delta-Vee (velocity change) at apogee to circularize the orbit. The DELTA launch vehicles, which placed all of the early communications satellites in orbit, were THOR IRBMs that used the VANGUARD upper stage to provide this delta-Vee. It was recognized that the DELTA was relatively small and a project to develop CENTAUR, a high-energy upper stage for the ATLAS ICBM, was begun. ATLAS-CENTAUR became reliable in 1968 and the fourth generation of INTELSAT satellites used this launch vehicle.


Atlas Centaur rocket launch. A rocket like this one launched the satellite Intelsat IV


The fifth generation used ATLAS-CENTAUR and a new launch vehicle, the European ARIANE. Since that time other entries, including the Russian PROTON launch vehicle and the Chinese LONG MARCH, have entered the market. All are capable of launching satellites almost thirty times the weight of EARLY BIRD.



Ariane V lift-off


In the mid-1970s several satellites were built using three-axis stabilization. They were more complex than the spinners, but they provided more despun surface to mount antennas and they made it possible to deploy very large solar arrays. The greater the mass and power, the greater the advantage of three-axis stabilization appears to be. Perhaps the surest indication of the success of this form of stabilization was the switch of Hughes, closely identified with spinning satellites, to this form of stabilization in the early 1990s.

Much of the technology for communications satellites existed in 1960, but would be improved with time. The basic communications component of the satellite was the traveling-wave tube (TWT). These early tubes had power outputs as low as 1 watt. Higher-power (50-300 watts) TWTs are available today for standard satellite services and for direct-broadcast applications. An even more important improvement was the use of high-gain antennas. Focusing the energy from a 1-watt transmitter on the surface of the earth is equivalent to having a 100-watt transmitter radiating in all directions. Focusing this energy on the Eastern U.S. is like having a 1000-watt transmitter radiating in all directions. The principal effect of this increase in actual and effective power is that earth stations are no longer 100-foot dish reflectors with cryogenically-cooled maser amplifiers costing as much as $10 million (1960 dollars) to build. Antennas for normal satellite services are typically 15-foot dish reflectors costing $30,000 (1990 dollars). Direct-broadcast antennas will be only a foot in diameter and cost a few hundred dollars.
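
The "focusing" arithmetic above is usually expressed in decibels of antenna gain. A minimal Python sketch, assuming gains of 20 dB and 30 dB to match the 100x and 1000x factors quoted in the text:

```python
# Effective isotropic radiated power (EIRP): transmitter power
# multiplied by the antenna gain (here given in decibels).
def eirp_watts(tx_power_w, antenna_gain_db):
    """Return the effective radiated power for a given antenna gain."""
    return tx_power_w * 10 ** (antenna_gain_db / 10)

print(eirp_watts(1, 20))  # 1 W + 20 dB -> 100 W effective (whole earth)
print(eirp_watts(1, 30))  # 1 W + 30 dB -> 1000 W effective (Eastern U.S.)
```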


Mobile Services


In February of 1976 COMSAT launched a new kind of satellite, MARISAT, to provide mobile services to the United States Navy and other maritime customers. In the early 1980s the Europeans launched the MARECS series to provide the same services. In 1979 the UN International Maritime Organization sponsored the establishment of the International Maritime Satellite Organization (INMARSAT) in a manner similar to INTELSAT. INMARSAT initially leased the MARISAT and MARECS satellite transponders, but in October of 1990 it launched the first of its own satellites, INMARSAT II F-1. The third generation, INMARSAT III, has already been launched.

An aeronautical satellite was proposed in the mid-1970s. A contract was awarded to General Electric to build the satellite, but it was canceled--INMARSAT now provides this service. Although INMARSAT was initially conceived as a method of providing telephone service and traffic-monitoring services on ships at sea, it has provided much more. The journalist with a briefcase phone has been ubiquitous for some time, but the Gulf War brought this technology to the public eye.

The United States and Canada discussed a North American Mobile Satellite for some time. Within the next year the first MSAT satellite, on which AMSC (U.S.) and TMI (Canada) are cooperating, will be launched, providing mobile telephone service via satellite to all of North America.


Competition


In 1965, when EARLY BIRD was launched, the satellite provided almost 10 times the capacity of the submarine telephone cables for almost 1/10th the price. This price-differential was maintained until the laying of TAT-8 in the late 1980s. TAT-8 was the first fiber-optic cable laid across the Atlantic. Satellites are still competitive with cable for point-to-point communications, but the future advantage may lie with fiber-optic cable. Satellites still maintain two advantages over cable: they are more reliable and they can be used point-to-multi-point (broadcasting).

Cellular telephone systems have risen as challenges to all other types of telephony. It is possible to place a cellular system in a developing country at a very reasonable price. Long-distance calls require some other technology, but this can be either satellites or fiber-optic cable.


The LEO Systems


Cellular telephony has brought us a new technological "system"-- the personal communications system (PCS). In the fully developed PCS, the individual would carry his telephone with him. This telephone could be used for voice or data and would be usable anywhere. Several companies have committed themselves to providing a version of this system using satellites in low earth orbits (LEO).

The most ambitious of these LEO systems was Iridium, sponsored by Motorola. Iridium planned to launch 66 satellites into polar orbit at altitudes of about 400 miles. Each of six orbital planes, separated by 30 degrees around the equator, would contain eleven satellites. Iridium originally planned to have 77 satellites--hence its name.
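
The constellation arithmetic is easy to check; here is a short Python sketch based only on the figures quoted above:

```python
# Iridium constellation geometry from the figures in the text.
planes = 6
sats_per_plane = 11
plane_spacing_deg = 180 / planes  # polar planes span 180 degrees of longitude

print(f"satellites: {planes * sats_per_plane}")                 # 66
print(f"spacing between planes: {plane_spacing_deg} degrees")   # 30.0
```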


Prospect and Retrospect


Arthur C. Clarke's 1945 vision was of a system of three "manned" satellites located over the major land masses of the earth and providing direct-broadcast television. The inherent "broadcast" nature of satellite communications has made direct broadcast a recurrent theme--yet one never brought to fruition. The problems are not technical--they are political, social, and artistic. What will people be willing to pay for? This is the question--especially with the availability of 120-channel cable systems. Hughes is apparently about to enter this field and may encourage others to do the same. Only then will Clarke's prophetic vision be fulfilled.

There are currently six companies providing fixed satellite service to the U.S.: GE Americom, Alascom, AT&T, COMSAT, GTE, and Hughes Communications. They operate 36 satellites with a net worth of over four billion dollars. Each year from 10-20 communications satellites are launched valued at about $75 million each. The launch vehicles placing them in orbit have similar values. Both satellites and launch vehicles are multi-billion dollar businesses. The earth station business is equally large. Finally the communications services themselves are multi-billion dollar businesses. John R. Pierce was right--it would be worth a billion dollars.



A Selective Communications Satellite Chronology


* 1945 Arthur C. Clarke Article: "Extra-Terrestrial Relays"
* 1955 John R. Pierce Article: "Orbital Radio Relays"
* 1956 First Trans-Atlantic Telephone Cable: TAT-1
* 1957 Sputnik: Russia launches the first earth satellite.
* 1960 1st Successful DELTA Launch Vehicle
* 1960 AT&T applies to FCC for experimental satellite communications license
* 1961 Formal start of TELSTAR, RELAY, and SYNCOM Programs
* 1962 TELSTAR and RELAY launched
* 1962 Communications Satellite Act (U.S.)
* 1963 SYNCOM launched
* 1964 INTELSAT formed
* 1965 COMSAT's EARLY BIRD: 1st commercial communications satellite
* 1969 INTELSAT-III series provides global coverage
* 1972 ANIK: 1st Domestic Communications Satellite (Canada)
* 1974 WESTAR: 1st U.S. Domestic Communications Satellite
* 1975 INTELSAT-IVA: 1st use of dual-polarization
* 1975 RCA SATCOM: 1st operational body-stabilized comm. satellite
* 1976 MARISAT: 1st mobile communications satellite
* 1976 PALAPA: 3rd country (Indonesia) to launch domestic comm. satellite
* 1979 INMARSAT formed.
* 1988 TAT-8: 1st Fiber-Optic Trans-Atlantic telephone cable


This post is an extract from: Communications Satellites: Making the Global Village Possible by David J. Whalen

Wednesday, September 10, 2008

LARGE HADRON COLLIDER

Today the experiment called the Large Hadron Collider (LHC) began. This experiment is located at CERN (the European Organization for Nuclear Research) and is considered the scientific experiment of the century. It has cost 3,000 million euros, and more than 10,000 scientists have taken part in building it. The experiment has been built in a circular tunnel 27 km long beneath the border between France and Switzerland, at a depth of between 50 and 120 meters underground.

LHC architecture and its experiments


The LHC is in fact an enormous particle accelerator, the most powerful machine ever built by physicists. This accelerator will make possible high-energy collisions between protons travelling at almost the speed of light. The main goal of these high-energy collisions is to discover the hypothetical Higgs boson, which is predicted by the Standard Model of elementary particles.
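
Just how close to the speed of light? A short Python sketch, assuming the LHC's design energy of 7 TeV per proton (a figure not quoted in the post itself):

```python
# Lorentz factor and speed of an LHC proton, assuming the 7 TeV
# design energy per beam (an assumption, not from the post).
proton_rest_energy_gev = 0.938  # proton rest mass energy, ~0.938 GeV
beam_energy_gev = 7000          # 7 TeV per proton

gamma = beam_energy_gev / proton_rest_energy_gev  # Lorentz factor
beta = (1 - 1 / gamma**2) ** 0.5                  # speed as a fraction of c

print(f"Lorentz factor: {gamma:.0f}")   # ~7463
print(f"v/c = {beta:.9f}")              # ~0.999999991
```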

These are the four main experiments in the LHC:

- CMS (the Compact Muon Solenoid) and ATLAS are the LHC's general-purpose detectors, designed to investigate a wide range of physics, including the search for the elusive Higgs boson (the particle thought to give other particles their mass), extra dimensions, and particles that could make up dark matter.


CMS detector for LHC

- The LHC Beauty (LHCb) detector is designed to answer a specific question: where did all the antimatter go? Equal amounts of matter and its opposite counterpart, antimatter, were created in the Big Bang, but today we find no evidence of, for example, antimatter galaxies or stars. The LHCb experiment will help us to understand why we live in a Universe that appears to be composed almost entirely of matter, with hardly any antimatter.

- ALICE: while the other LHC detectors will use proton beams to do their science, ALICE relies on smashing together electrically charged lead ions. Scientists hope to re-create a state of matter called quark-gluon plasma, which existed just after the Big Bang. Matter was in this "liquid" state because the early Universe was still extremely hot. The ALICE detector will be used to study this quark-gluon plasma as it expands and cools, observing how it progressively gives rise to the particles that make up the matter in our Universe today.

Concerning the LHC, some scientists are afraid that it could cause the end of the world. Experts deny that dangerous black holes could be generated in the LHC. The LHC, like other particle accelerators, recreates the natural phenomenon of cosmic rays under controlled laboratory conditions. As these natural phenomena have been happening for millions of years and the Earth still exists, there is no reason, as I see it, to be worried about it.

To learn more:

http://news.bbc.co.uk/2/hi/science/nature/7534847.stm

http://public.web.cern.ch/public/en/LHC/LHC-en.html

http://public.web.cern.ch/public/en/LHC/Safety-en.html

Thursday, August 14, 2008

Sustainable development

Nowadays we live better thanks to technology, which is present in our lives in several ways. For instance, communications technology, such as mobile telephones, the Internet and computer networks, has made it possible for us to communicate with each other easily. The Internet lets us find a great deal of information without difficulty, instead of having to go to libraries, and it also provides an e-mail service that lets us send messages all around the world without the delay of ordinary mail.



Computers and computer networks, such as the Internet, have improved our lives


Besides, we have several household appliances at home. Nowadays nobody can imagine life without a fridge or a washing machine: the fridge frees us from going shopping every day, and the washing machine saves us a lot of time. Of course, transport technology has to be considered as well. Since cars were invented they have improved a lot, and now they are faster and safer than before, so we can move from one place to another easily. We can't forget trains and planes either, which are becoming ever faster and safer and let us travel long distances in comfort.


Airbus A-380, the biggest passenger aircraft ever built. This kind of plane lets us travel long distances quickly and comfortably
(This file is licensed under Creative Commons Attribution 2.0 License)



Therefore, technology has made our life easier and more comfortable. However, there is an important drawback: power consumption. Cars and computers consume power, which they transform into heat, and this heat contributes to increasing the Earth's temperature. This increase in temperature can be a serious threat to the environment. Moreover, energy has to be generated in a form that cars, trains and electronic devices can use, so power stations, including nuclear ones, have had to be built. Nevertheless, these power stations pollute the environment.


Nuclear power stations pollute the environment because of their radioactive waste


In summary, we can state that technology has made our lives easier. Nonetheless, it has an important drawback: the pollution of the environment. Therefore, in order to preserve our planet, mankind has to find a balance between comfort and the preservation of the environment. A more responsible consumption of energy and the development of renewable energies, especially nuclear fusion, could help us to achieve this balance.



The sun is a natural fusion reactor. If we could reproduce the sun's fusion process, we would have an enormous amount of energy available
(image obtained from www.nasa.gov)

Tuesday, August 12, 2008

Philosophy of Science


Philosophy of science is a branch of philosophy developed mainly by Karl Popper, who is nowadays considered one of the most important philosophers of the last century, especially in the philosophy of science. The theory he developed is called falsificationism. Before speaking about this theory, however, it is necessary to define what science is.

A theory is considered scientific if it provides conclusions that can be checked and refuted by experiments. Otherwise the theory does not belong to science, but to philosophy.

A scientific law is established according to experience. Researchers obtain a great deal of information from many experiments and then extract a law from it. Later, they have to check this law against experience, so a scientific law is continuously subjected to experimental tests. If a scientific theory explains an experiment correctly, we will be more confident about its truth, but we will never be absolutely confident: we can never know whether the next experiment will contradict our theory.


Newton's theory of gravity, which explained the motion of the planets, was considered absolutely true for centuries, but the Theory of Relativity showed that it is not: it is only an approximation, valid when gravity is weak and velocities are much lower than the speed of light


Even though the quantity of data from which a law is extracted may be huge, we can never infer a general statement from it in a logical way. For instance, if we observe that B always occurs after A, it cannot be inferred that this must always happen. We think A is followed by B because of a belief, not because of a logical connection between A and B. Even if a physical law explains past experiments rigorously, that does not mean it will also explain the experiments of the future. Therefore, we can only show that a theory is false; we can never state that a theory is right. The theory that proposes this conclusion is called falsificationism.



Aerial photo of the Tevatron at Fermilab. The main accelerator is the ring above; the one below is used for preliminary acceleration, beam cooling and storage. So far, experiments in particle accelerators have always been explained exactly by the Standard Model of particle physics, but nobody knows whether the next experiment will keep confirming it


Consequently, science can be proved neither by experience nor by logic. Hence science is not true, but only more or less probable, and the likelihood of a law depends on the number of experiments that support it. Science is an endless search for better theories, and the knowledge it provides is always provisional. We cannot state that we know something with certainty, since its falseness is always possible. Hence we have to understand "truth" as an approximation to the truth, because we will never be able to know whether we have actually reached it.
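
This idea of confidence growing with supporting experiments yet never reaching certainty can be illustrated with a toy Bayesian calculation in Python (the probabilities are arbitrary assumptions, purely for illustration, not Popper's own formulation):

```python
# Toy Bayesian illustration: each successful experiment raises our
# credence in a theory, but never to 1.
prior = 0.5             # initial credence in the theory
p_pass_if_true = 0.99   # chance an experiment succeeds if the theory is true
p_pass_if_false = 0.5   # chance it succeeds anyway if the theory is false

for n in range(1, 11):
    # Bayes' rule after one more successful experiment
    numerator = p_pass_if_true * prior
    prior = numerator / (numerator + p_pass_if_false * (1 - prior))
    print(f"after {n:2d} successful experiments: credence = {prior:.6f}")
# Credence approaches 1 but never reaches it: a single failed
# experiment could still refute the theory.
```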

Sunday, August 10, 2008

True Knowledge

Nowadays we are pretty sure that the facts of our daily life will happen as they occurred in the past, but this certainty is actually groundless. Statements such as "objects always fall" are not absolutely true.


The water of the Magic Fountain of Montjuïc in Barcelona falls because of gravity


Firstly, it is stated that objects fall because of gravity, so gravity is the cause and the object's fall is the effect. This mechanism is called the cause-effect relation and constitutes almost the only source of knowledge we have. However, the effect and the cause are not linked in a logical way, but in a psychological way. That is to say, we think the effect will always follow the cause because it has happened that way for our whole lives. As a result, our knowledge is a consequence of the strong belief that events will occur in the future as they did in the past, simply because we are used to seeing them that way. Thus we only have a belief, not certain knowledge.

Nevertheless, there is a kind of knowledge that is not obtained from the cause-effect relationship: mathematical knowledge, whose certainty comes from the fact that its negation involves a contradiction. But this certainty disappears when mathematics is applied to the physical world. Hence scientific knowledge cannot be considered true knowledge. For instance, it cannot be stated that a physical law is correct, because nobody can assure us that the next event will be explained by this law.

Similarly to scientific knowledge, nobody can assure us that what our eyes see is what really exists. The coincidence between what we think exists and what actually exists is pure chance. Even the existence of oneself cannot really be assured. That is to say, the consciousness of what we have been is a consequence of our memory and of the existence of a psychological time that allows us to put our memories in order. As I see it, this psychological time leads us to believe in our existence through time, since a real time does not exist at all.



As explained in the movie The Matrix, nobody can assure us that what our eyes see is what really exists (image obtained from Wikipedia)


In conclusion, all the knowledge we have, except for mathematical knowledge, cannot be considered true, but only likely. So the limit of knowledge is mathematics: outside mathematics, nothing is absolutely sure, nothing is true knowledge. Consequently, there is absolute scepticism about what we can know with certainty. Fortunately, this scepticism is only philosophical, since it is very unlikely that daily events will occur differently from how we expect. Therefore our belief is enough to live by.