INTEL

Intel Corporation (NASDAQ: INTC) is an American multinational semiconductor company headquartered in Santa Clara, California, United States, and the world’s largest and highest-valued semiconductor chip maker by revenue.[4] It is the inventor of the x86 series of microprocessors, the processors found in most personal computers. The company was founded on July 18, 1968; its name is a portmanteau of Integrated Electronics (a common misconception is that “Intel” comes from the word intelligence). Intel also makes motherboard chipsets, network interface controllers and integrated circuits, flash memory, graphics chips, embedded processors and other devices related to communications and computing. Founded by semiconductor pioneers Robert Noyce and Gordon Moore and widely associated with the executive leadership and vision of Andrew Grove, Intel combines advanced chip design with leading-edge manufacturing. Though Intel was originally known primarily to engineers and technologists, its “Intel Inside” advertising campaign of the 1990s made it and its Pentium processor household names.

Advanced Micro Devices (AMD)

Advanced Micro Devices, Inc. (NYSE: AMD), or AMD, is an American multinational semiconductor company based in Sunnyvale, California, that develops computer processors and related technologies for commercial and consumer markets. Its main products include microprocessors, motherboard chipsets, embedded processors and graphics processors for servers, workstations and personal computers, and embedded systems applications.

AMD is the second-largest global supplier of microprocessors based on the x86 architecture and also one of the largest suppliers of graphics processing units. It also owns 8.6% of Spansion, a supplier of non-volatile flash memory.[4]

AMD is the only significant rival to Intel in the central processing unit (CPU) market for x86-based personal computers. Together the two companies held 99.1 percent (Intel 80.3%, AMD 18.8%) of x86 CPUs sold in the third quarter of 2011.[5] Since acquiring ATI in 2006, AMD and its competitor Nvidia have dominated the discrete graphics processing unit (GPU) market, together making up virtually 100% of the market.

x86

This article is about the Intel microprocessor architecture in general. For the 32-bit generation of this architecture, which is also called “x86”, see IA-32.
Designer: Intel, AMD
Bits: 16-bit, 32-bit, and/or 64-bit
Introduced: 1978
Design: CISC
Type: Register-memory
Encoding: Variable (1 to 15 bytes)
Branching: Status register
Endianness: Little (see the sketch after this table)
Page size:
  8086–i286: none
  i386, i486: 4 KB pages
  P5 Pentium: added 4 MB pages (legacy PAE: 4 KB → 2 MB)
  x86-64: added 1 GB pages
Extensions: x87, IA-32, P6, MMX, SSE, SSE2, x86-64, SSE3, SSSE3, SSE4, SSE5, AVX
Open: Partly. Some advanced x86 features may require a license from Intel; x86-64 may require an additional license from AMD. The 80486 processor has been on the market for over 20 years[1] and so cannot be subject to patent claims; this subset of the x86 architecture is therefore fully open.
Registers
General purpose:
  16-bit: 6 semi-dedicated registers + BP and SP
  32-bit: 6 GPRs + EBP and ESP
  64-bit: 14 GPRs + RBP and RSP
Floating point:
  16-bit: optional separate x87 FPU
  32-bit: optional separate or integrated x87 FPU; integrated SSE2 units in later processors
  64-bit: integrated x87 and SSE2 units
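
Two rows in the table above, Endianness and Extensions, can be demonstrated directly in software. The following is a minimal C sketch (not part of the original article) that prints the byte order and probes for SSE2 support; the __builtin_cpu_supports() feature test is a GCC/Clang extension, so the second check is an assumption about the toolchain rather than a property of the architecture itself.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* x86 is little-endian: the least significant byte of a multi-byte
       value sits at the lowest address, so the first byte of 0x01020304
       in memory is 0x04. */
    uint32_t value = 0x01020304;
    unsigned char *bytes = (unsigned char *)&value;
    printf("first byte in memory: 0x%02x (%s-endian)\n",
           bytes[0], bytes[0] == 0x04 ? "little" : "big");

#if defined(__GNUC__) && (defined(__i386__) || defined(__x86_64__))
    /* Query one of the instruction-set extensions listed in the table.
       __builtin_cpu_supports() is a GCC/Clang built-in, assumed here. */
    __builtin_cpu_init();
    printf("SSE2 supported: %s\n",
           __builtin_cpu_supports("sse2") ? "yes" : "no");
#endif
    return 0;
}

On a typical x86-64 machine this prints 0x04 for the first byte, matching the Little entry in the table above.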

The Intel 8086.

Intel Core 2 Duo – an example of an x86-compatible, 64-bit multicore processor.

AMD Athlon (early version)

Graphology

Graphology is the pseudoscientific[1][2] study and analysis of handwriting, especially in relation to human psychology. In the medical field, it can be used to refer to the study of handwriting as an aid in diagnosis and tracking of diseases of the brain and nervous system. The term is sometimes incorrectly used to refer to forensic document examination.

Letter from John Cox, 1784

Graphology has been controversial for more than a century. Although supporters point to the anecdotal evidence of thousands of positive testimonials as a reason to use it for personality evaluation, most empirical studies fail to show the validity claimed by its supporters.[3][4]

Etymology

From grapho- (from the Greek γραφή, “writing”) and logos (from the Greek λόγος, “speech”); cf.: Anthropology, Psychology, Biology, Geology. There also exist many other words formed from the same root: Graphopathology, Graphomaniac, Graphistic, Graphopsychology, Psychographology, Graphometric, Graphometry, Graphoanalysis, Graphotechnology.

Basic tenets

Graphology is based upon the following basic assertions:

  • When we write, the ego is active but it is not always active to the same degree. Its activity waxes and wanes; being at its highest …

Pneumatic tube system

Pneumatic tubes (or capsule pipelines; also known as Pneumatic Tube Transport or PTT) are systems in which cylindrical containers are propelled through a network of tubes by compressed air or by partial vacuum. They are used for transporting solid objects, as opposed to conventional pipelines, which transport fluids. Pneumatic tube networks gained great prominence in the late 19th and early 20th century for businesses or administrations that needed to transport small but urgent packages (such as mail or money) over relatively short distances (within a building, or, at most, within a city). Some of these systems grew to great complexity, but they were eventually superseded by more modern methods of communication and courier transport, and are now much rarer than before. However, in some settings, such as hospitals, they remain of great use, and have been extended and developed further technologically in recent decades.[1]

A small number of pneumatic transportation systems were also built for larger cargo, to compete with more standard train and subway systems. However, these larger systems never gained the popularity of the smaller, practical parcel systems.

Historical use

Pneumatic capsule transportation was originally invented by William Murdoch. Though a marvel of the time, and a successful sideshow, it was considered little more than a novelty until the invention of the capsule in 1836. The Victorians were the first to use capsule pipelines to transmit telegraph messages, or telegrams, to nearby buildings from telegraph stations.

While they are commonly used for small parcels and documents – now most often used as cash carriers at banks or supermarkets[2] – they were originally proposed in the early 19th century for transport of heavy freight. It was once envisaged that networks of these massive tubes might be used to transport people.

NASA Mission Control Center during the Apollo 13 mission. Note the pneumatic tube canisters in the console to the right.

Nanotechnology

Nanotechnology (sometimes shortened to “nanotech”) is the study of manipulating matter on an atomic and molecular scale. Generally, nanotechnology deals with developing materials, devices, or other structures possessing at least one dimension sized from 1 to 100 nanometres. Quantum mechanical effects are important at this quantum-realm scale. Nanotechnology is considered a key technology for the future, and various governments have invested billions of dollars in its development: the USA has invested 3.7 billion dollars through its National Nanotechnology Initiative, followed by Japan with 750 million and the European Union with 1.2 billion.

Nanotechnology is very diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale. Nanotechnology entails the application of fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, microfabrication, etc.

Scientists debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in medicine, electronics, biomaterials and energy production. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[2] and their potential effects on global economics, as well as speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

Although nanotechnology is a relatively recent development in scientific research, the development of its central concepts happened over a longer period of time. The emergence of nanotechnology in the 1980s was caused by the convergence of experimental advances such as the invention of the scanning tunneling microscope in 1981 and the discovery of fullerenes in 1985, with the elucidation and popularization of a conceptual framework for the goals of nanotechnology beginning with the 1986 publication of the book Engines of Creation.

The scanning tunneling microscope, an instrument for imaging surfaces at the atomic level, was developed in 1981 by Gerd Binnig and Heinrich Rohrer at IBM Zurich Research Laboratory, for which they received the Nobel Prize in Physics in 1986.[3][4] Fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[5][6]

Around the same time, K. Eric Drexler developed and popularized the concept of nanotechnology and founded the field of molecular nanotechnology. In 1979, Drexler encountered Richard Feynman’s 1959 talk “There’s Plenty of Room at the Bottom”. The term “nanotechnology”, originally coined by Norio Taniguchi in 1974, was unknowingly appropriated by Drexler in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale “assembler” which would be able to build a copy of itself and of other items of arbitrary complexity. He also first published the term “grey goo” to describe what might happen if a hypothetical self-replicating molecular nanotechnology went out of control. Drexler’s vision of nanotechnology is often called “Molecular Nanotechnology” (MNT) or “molecular manufacturing,” and Drexler at one point proposed the term “zettatech” which never became popular.

In the early 2000s, the field was subject to growing public awareness and controversy, with prominent debates about both its potential implications, exemplified by the Royal Society’s report on nanotechnology,[7] as well as the feasibility of the applications envisioned by advocates of molecular nanotechnology, which culminated in the public debate between Eric Drexler and Richard Smalley in 2001 and 2003.[8] Governments moved to promote and fund research into nanotechnology with programs such as the National Nanotechnology Initiative.

The early 2000s also saw the beginnings of commercial applications of nanotechnology, although these were limited to bulk applications of nanomaterials, such as the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, and carbon nanotubes for stain-resistant textiles.[9][10]

Fundamental concepts

Nanotechnology is the engineering of functional systems at the molecular scale. This covers both current work and concepts that are more advanced. In its original sense, nanotechnology refers to the projected ability to construct items from the bottom up, using techniques and tools being developed today to make complete, high performance products.

One nanometer (nm) is one billionth, or 10^{-9}, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12–0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length. By convention, nanotechnology is taken as the scale range 1 to 100 nm following the definition used by the National Nanotechnology Initiative in the US. The lower limit is set by the size of atoms (hydrogen has the smallest atoms, approximately a quarter of a nanometre in diameter) since nanotechnology must build its devices from atoms and molecules. The upper limit is more or less arbitrary but is around the size at which phenomena not observed in larger structures start to become apparent and can be made use of in the nano device.[11] These new phenomena make nanotechnology distinct from devices which are merely miniaturised versions of an equivalent macroscopic device; such devices are on a larger scale and come under the description of microtechnology.[12]

To put that scale in another context, the comparative size of a nanometer to a meter is the same as that of a marble to the size of the earth.[13] Or another way of putting it: a nanometer is the amount an average man’s beard grows in the time it takes him to raise the razor to his face.[13]
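
As a rough sanity check on the marble comparison (the figures below, a marble of about 1 cm and an Earth diameter of about 1.3 × 10^7 m, are assumptions not given in the original):

\[
\frac{1\ \text{nm}}{1\ \text{m}} = 10^{-9},
\qquad
\frac{10^{-2}\ \text{m (marble)}}{1.3 \times 10^{7}\ \text{m (Earth)}} \approx 8 \times 10^{-10},
\]

so the two ratios agree to within about an order of magnitude.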

Two main approaches are used in nanotechnology. In the “bottom-up” approach, materials and devices are built from molecular components which assemble themselves chemically by principles of molecular recognition. In the “top-down” approach, nano-objects are constructed from larger entities without atomic-level control.[14]

Areas of physics such as nanoelectronics, nanomechanics, nanophotonics and nanoionics have evolved during the last few decades to provide a basic scientific foundation of nanotechnology.

Larger to smaller: a materials perspective

Image of reconstruction on a clean Gold(100) surface, as visualized using scanning tunneling microscopy. The positions of the individual atoms composing the surface are visible.
Main article: Nanomaterials

A number of physical phenomena become pronounced as the size of the system decreases. These include statistical mechanical effects, as well as quantum mechanical effects, for example the “quantum size effect” where the electronic properties of solids are altered with great reductions in particle size. This effect does not come into play in going from macro to micro dimensions. However, quantum effects become dominant when the nanometer size range is reached, typically at distances of 100 nanometers or less, the so-called quantum realm. Additionally, a number of physical (mechanical, electrical, optical, etc.) properties change when compared to macroscopic systems. One example is the increase in the surface-area-to-volume ratio, altering …

FCC Ruling on 800MHz Band a Boon for Sprint

By Stephen Lawson, IDG News    May 25, 2012 11:20 pm

The U.S. Federal Communications Commission approved a rule change for part of the 800MHz band at a meeting on Thursday, opening the door for Sprint Nextel to use the band for its 4G LTE network.

Sprint has frequencies in the 800MHz SMR (Specialized Mobile Radio) band that so far have been dedicated to the iDEN network, which delivers the narrowband 2G service that Sprint acquired by buying Nextel in 2005. When the FCC carried out a rebanding project several years ago to eliminate interference between iDEN and public-safety radios, it decided that services on those frequencies couldn’t use channels wider than 25KHz. That channel width can’t support anything more than a narrowband service such as iDEN, which delivers average throughput of 20Kbps to 30Kbps (kilobits per second).
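
A back-of-the-envelope calculation (the spectral-efficiency figure below is derived from the numbers above, not stated in the article) shows why such a narrow channel is limited to narrowband service:

\[
\frac{20\ \text{to}\ 30\ \text{kbit/s}}{25\ \text{kHz}} \approx 0.8\ \text{to}\ 1.2\ \text{bit/s/Hz},
\]

so even at a healthy spectral efficiency a 25KHz channel yields only tens of kilobits per second, far below what LTE, which uses channels of at least 1.4MHz, requires. Lifting the channel-width limit is what makes LTE in this band possible.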

At its monthly open meeting on Thursday, the agency removed that channel limit. Sprint had asked the FCC to make the change, which is significant for the carrier’s 4G plans. Sprint has been planning to deploy LTE first in the 1900MHz band and then, pending the FCC decision, in the 800MHz band starting in late 2013 or early 2014 (http://www.networkworld.com/news/2012/050912-for-lte-network-slow-but-259127.html).
