Saturday, March 22, 2008

Supercomputer


A supercomputer is a computer that is considered, or was considered at the time of its introduction, to be at the frontline in terms of processing capacity, particularly speed of calculation. The term "Super Computing" was first used by the New York World newspaper in 1929[1] to refer to the large custom-built tabulators IBM made for Columbia University.

Supercomputers introduced in the 1960s were designed primarily by Seymour Cray at Control Data Corporation (CDC), which led the market into the 1970s, when Cray left to form his own company, Cray Research. With his new designs he then took over the supercomputer market, holding the top spot in supercomputing for five years (1985–1990). Cray himself never used the word "supercomputer"; a little-remembered fact is that he recognized only the word "computer". In the 1980s a large number of smaller competitors entered the market, in a parallel to the creation of the minicomputer market a decade earlier, but many of these disappeared in the mid-1990s "supercomputer market crash". Today, supercomputers are typically one-of-a-kind custom designs produced by "traditional" companies such as IBM and HP, which purchased many of the 1980s companies to gain their experience.
The Cray-2 was the world's fastest computer from 1985 to 1989.

The term supercomputer itself is rather fluid: today's supercomputer tends to become tomorrow's normal computer. CDC's early machines were simply very fast scalar processors, some ten times the speed of the fastest machines offered by other companies. In the 1970s most supercomputers were built around a vector processor, and many of the newer players developed their own such processors at a lower price to enter the market. In the early and mid-1980s, machines with a modest number of vector processors working in parallel became the standard, with typical processor counts in the range of four to sixteen. In the later 1980s and 1990s, attention turned from vector processors to massively parallel processing systems with thousands of "ordinary" CPUs, some being off-the-shelf units and others being custom designs. (This is commonly and humorously referred to in the industry as the attack of the killer micros.) Today, parallel designs are based on "off the shelf" server-class microprocessors, such as the PowerPC, Itanium, or x86-64, and most modern supercomputers are highly tuned computer clusters using commodity processors combined with custom interconnects.
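As a rough illustration of the cluster programming style this paragraph ends on, here is a minimal message-passing sketch. It uses mpi4py purely as a stand-in, since the text names no particular toolkit, and the launch command and file name are hypothetical:

    # Minimal message-passing sketch in the style used on commodity clusters.
    # Hypothetical launch: mpiexec -n 4 python partial_sums.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's ID within the job
    size = comm.Get_size()   # total number of cooperating processes

    # Each process computes its own slice of the problem locally...
    local_sum = sum(range(rank * 1000, (rank + 1) * 1000))

    # ...and a reduction across the interconnect combines the partial results.
    total = comm.reduce(local_sum, op=MPI.SUM, root=0)
    if rank == 0:
        print(f"{size} processes computed total = {total}")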
Supercomputer challenges, technologies

* A supercomputer generates large amounts of heat and must be cooled. Cooling most supercomputers is a major HVAC problem.
* Information cannot move faster than the speed of light between two parts of a supercomputer. For this reason, a supercomputer that is many meters across must have latencies between its components measured at least in the tens of nanoseconds (see the back-of-the-envelope sketch after this list). Seymour Cray's supercomputer designs attempted to keep cable runs as short as possible for this reason: hence the cylindrical shape of his Cray range of computers. In modern supercomputers built of many conventional CPUs running in parallel, latencies of 1–5 microseconds to send a message between CPUs are typical.
* Supercomputers consume and produce massive amounts of data in a very short period of time. According to Ken Batcher, "A supercomputer is a device for turning compute-bound problems into I/O-bound problems." Much work on external storage bandwidth is needed to ensure that this information can be transferred quickly and stored/retrieved correctly.
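To make the light-speed bound above concrete, a back-of-the-envelope sketch (plain Python, nothing machine-specific assumed):

    # Light covers roughly 0.3 meters per nanosecond, which puts a hard
    # lower bound on signal latency across a physically large machine.
    SPEED_OF_LIGHT = 299_792_458  # meters per second

    def min_latency_ns(distance_m):
        """Lower bound on one-way signal time over a straight-line distance."""
        return distance_m / SPEED_OF_LIGHT * 1e9

    for d in (1.0, 3.0, 10.0):
        print(f"{d:>5.1f} m -> at least {min_latency_ns(d):.1f} ns one way")
    # A machine ~3 m across is bounded at ~10 ns, versus the 1-5 microseconds
    # typical of message passing between CPUs in a commodity cluster.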

Technologies developed for supercomputers include:

* Vector processing
* Liquid cooling
* Non-Uniform Memory Access (NUMA)
* Striped disks (the first instance of what was later called RAID)
* Parallel filesystems

Processing techniques

Vector processing techniques were first developed for supercomputers and continue to be used in specialist high-performance applications. Vector processing techniques have trickled down to the mass market in DSP architectures and SIMD processing instructions for general-purpose computers.
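As a small illustration of scalar versus vector (array-at-a-time) style, here is a sketch using NumPy; NumPy is chosen here as a stand-in for SIMD-backed array processing, not something the text itself names:

    import numpy as np

    # Scalar style: one element per loop iteration.
    def saxpy_scalar(a, x, y):
        out = [0.0] * len(x)
        for i in range(len(x)):
            out[i] = a * x[i] + y[i]
        return out

    # Vector style: one operation over whole arrays. NumPy dispatches to
    # compiled loops that can exploit the CPU's SIMD instructions.
    def saxpy_vector(a, x, y):
        return a * x + y

    x = np.arange(1_000_000, dtype=np.float32)
    y = np.ones_like(x)
    z = saxpy_vector(2.0, x, y)   # the classic SAXPY kernel: a*x + y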

Modern video game consoles in particular use SIMD extensively, and this is the basis for some manufacturers' claims that their game machines are themselves supercomputers. Indeed, some graphics cards have the computing power of several teraFLOPS. The applications to which this power could be applied were limited by the special-purpose nature of early video processing. As video processing has become more sophisticated, graphics processing units (GPUs) have evolved to become more useful as general-purpose vector processors, and an entire computer science sub-discipline has arisen to exploit this capability: General-Purpose Computing on Graphics Processing Units (GPGPU).
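A hedged sketch of the GPGPU style, using CuPy (a NumPy-compatible GPU array library) as one possible stand-in; the text does not name any specific toolkit:

    # Hypothetical GPGPU sketch with CuPy: the same array code as before,
    # but the data lives in GPU memory and operations run as GPU kernels.
    import cupy as cp

    x = cp.arange(1_000_000, dtype=cp.float32)   # allocated on the GPU
    y = cp.ones_like(x)
    z = 2.0 * x + y                              # element-wise kernel on the GPU
    total = float(z.sum())                       # reduce on the GPU, copy result back
    print(total)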

Operating systems
Supercomputers predominantly run some variant of Linux or UNIX. Linux has been the most popular operating system since 2004.

Supercomputer operating systems, today most often variants of Linux or UNIX, are every bit as complex as those for smaller machines, if not more so. Their user interfaces tend to be less developed, however, as the OS developers have limited programming resources to spend on non-essential parts of the OS (i.e., parts not directly contributing to the optimal utilization of the machine's hardware). Because these computers, often priced at millions of dollars, are sold to a very small market, their R&D budgets are often limited. (The advent of Unix and Linux allows reuse of conventional desktop software and user interfaces.)

Interestingly, this has been a continuing trend throughout the supercomputer industry, with former technology leaders such as Silicon Graphics taking a back seat to companies such as NVIDIA, which have been able to produce cheap, feature-rich, high-performance, and innovative products thanks to the vast number of consumers driving their R&D.

Historically, until the early-to-mid-1980s, supercomputers usually sacrificed instruction set compatibility and code portability for performance (processing and memory access speed). For the most part, supercomputers of this era (unlike high-end mainframes) had vastly different operating systems; the Cray-1 alone had at least six proprietary OSes largely unknown to the general computing community. Similarly, different and incompatible vectorizing and parallelizing Fortran compilers existed. This trend would have continued with the ETA-10 were it not for the initial instruction set compatibility between the Cray-1 and the Cray X-MP, and the adoption of UNIX operating system variants (such as Cray's Unicos and today's Linux).

For this reason, in the future, the highest performance systems are likely to have a UNIX flavor but with incompatible system-unique features (especially for the highest-end systems at secure facilities).

Wednesday, March 19, 2008

FRIDAY THE 13TH - BEWARE!

FRIDAY THE 13TH - how did Friday the thirteenth become such an unlucky day?
Fear of Friday the 13th is rooted in ancient, separate bad-luck associations with the number 13 and the day Friday. The two unlucky entities combine to make one super unlucky day.
There is a Norse myth about 12 gods having a dinner party at Valhalla, their heaven. In walked the uninvited 13th guest, the mischievous Loki. Once there, Loki arranged for Hoder, the blind god of darkness, to shoot Balder the Beautiful, the god of joy and gladness, with a mistletoe-tipped arrow. Balder died and the Earth got dark. The whole Earth mourned.
There is a Biblical reference to the unlucky number 13. Judas, the apostle who betrayed Jesus, was the 13th guest to the Last Supper.
A particularly bad Friday the 13th occurred in the Middle Ages. On Friday, October 13, 1307, King Philip IV of France had the revered Knights Templar arrested and tortured, marking the occasion as a day of evil.
In ancient Rome, witches reportedly gathered in groups of 12. The 13th was believed to be the devil.
Both Friday and the number 13 were once closely associated with capital punishment. In British tradition, Friday was the conventional day for public hangings, and there were supposedly 13 steps leading up to the noose.
It is traditionally believed that Eve tempted Adam with the apple on a Friday. Tradition also has it that the Flood in the Bible, the confusion at the Tower of Babel, and the death of Jesus Christ all took place on Friday.
Numerologists consider 12 a "complete" number. There are 12 months in a year, 12 signs of the zodiac, 12 gods of Olympus, 12 labors of Hercules, 12 tribes of Israel, and 12 apostles of Jesus. In exceeding 12 by 1, 13's association with bad luck has to do with just being a little beyond completeness.
FRIDAY THE 13TH - how is fear of the number thirteen demonstrated?
More than 80 percent of high-rises lack a 13th floor.
Many airports skip the 13th gate.
Many airplanes have no row 13.
Hospitals and hotels regularly have no room number 13.
Italians omit the number 13 from their national lottery.
On streets in Florence, Italy, the house between number 12 and 14 is addressed as 12 and a half.
Many cities do not have a 13th Street or a 13th Avenue.
In France, socialites known as the quatorziens (fourteeners) once made themselves available as 14th guests to keep a dinner party from an unlucky fate.
Many triskaidekaphobes, as those who fear the unlucky integer are known, point to the ill-fated mission to the moon, Apollo 13.
If you have 13 letters in your name, you are said to have the devil's luck. Jack the Ripper, Charles Manson, Jeffrey Dahmer, Theodore Bundy and Albert De Salvo all had 13 letters in their names.

Tuesday, March 18, 2008

AFFLUENZA

“AFFLUENZA - KEEPING UP WITH THE JONESES”
“AFFLUENZA PLAGUES THE SUPERRICH”
“AFFLUENZA - AN UNHAPPY RELATIONSHIP WITH MONEY”
“WEALTH AND GREED - DO YOU SUFFER FROM AFFLUENZA?”

Affluenza is a conflation of the words affluent and influenza. Defined as an extreme form of materialism or an unsustainable addiction to economic growth, affluenza is said to be the new-age lifestyle disorder plaguing every developed and developing economy in the world. The illness is said to reinforce itself constantly at both the individual and the social level.
Affluenza has been described as:
1. The bloated, sluggish and unfulfilled feeling that results from efforts to keep up with the Joneses.
2. An epidemic of stress, overwork, waste and indebtedness caused by dogged pursuit of the dough.
3. An unsustainable addiction to economic growth.

Affluenza is blamed for problems like over-consumption, luxury fever, consumer debt, overwork, waste and harm to the environment. Its symptoms also manifest themselves psychologically, in disorders like alienation, stress and depression. Affluenza has been attributed to overwork, personal stress, the erosion of family and community, skyrocketing debt, and the growing gap between rich and poor.

Overabundance is filling the whole world with lifestyle disorders. The drive to make more dough has left people highly stressed. Overabundance comes as a major blow for today’s nuclear families, with busy parents and often-neglected children. Such children have a certain aimlessness about them. In a good scenario, this is what draws them to charity; in a bad one, they become aimless, spoilt partygoers. Only parents can guard children against these outcomes: instead of laying it all out for them, instil in them a sense of direction. But what if the parents themselves are victims of affluenza? This is not an improbable situation. In professions where big money can be made in little time, affluenza hits hard. Sadly, today ‘rich’ has become synonymous with ‘fake’. At some level or another, everyone suffers from affluenza. Why else would parents deprive their children of the joy of waiting for something, or of working towards it?

However, it is not just the people's fault. Advertisers infect the country with ever more things that people “need” to have. Commercials are like germs: once people are exposed to them, over and over, they are bound to catch the sickness. Advertisers who promote and shape a consumer’s way of life seek to condition us to the idea that by trading our “life” for the money needed to buy their products, we can fulfil our hopes for power, happiness, acceptance, success, achievement, and personal worth.

Strangeness characterizes this disease in many other ways too. Almost every one of us actively carries it. Its effects influence not only the immediate carrier but also society at large. Its symptoms are so commonplace that few people make a connection between the disease and the discomfort it breeds. It infects rich and poor alike, and is beginning to embed itself in younger and younger carriers. Oddly, those who do recognize the scourge it sometimes brings are disparagingly referred to as hypocrites, elitists, party-poopers, or catastrophists. And most confusing of all, it is a disease that is socially acceptable among many of us.

America is said to be hit by the disease at large: America has five percent of the world's population yet consumes thirty times as much as other countries. Paris Hilton is said to be hit by affluenza. According to affluenza.org, the average adult spends more time shopping each week than he or she spends with his or her children. “More Americans visit shopping malls on Sunday than go to church. More Americans file for bankruptcy each year than graduate from college.” The average American home is more than twice as large as it was in the 1950s, yet the average family is smaller. A large part of our own country, however, is still untouched by this dreadful syndrome; in India, affluenza strikes only a thin stratum of the super-wealthy.

Saturday, March 15, 2008

Surface Computing

The History of Microsoft Surface

Beyond the Mouse and Keyboard

Surface computing is a major advancement that moves beyond the traditional user interface to a more natural way of interacting with digital content. Microsoft Surface™, Microsoft Corp.’s first commercially available surface computer, breaks down the traditional barriers between people and technology to provide effortless interaction with all forms of digital content through natural gestures, touch and physical objects instead of a mouse and keyboard. Although customers will be able to interact with Surface in select restaurants, hotels, retail establishments and public entertainment venues by the end of the year, the product has been years in the making at Microsoft.


“Tub” model prototype
An Idea Inspired by Cross-Division Collaboration

In 2001, Stevie Bathiche of Microsoft Hardware and Andy Wilson of Microsoft Research began working together on various projects that took advantage of their complementary expertise in the areas of hardware and software. In one of their regular brainstorm sessions, they started talking about an idea for an interactive table that could understand the manipulation of physical pieces. Although there were related efforts happening in academia, Bathiche and Wilson saw the need for a product where the interaction was richer and more intuitive, and at the same time practical for everyone to use.

This conversation was the beginning of an idea that would later result in the development of Surface, and over the course of the following year, various people at Microsoft involved in developing new product concepts, including the gaming-specific PlayTable, continued to think through the possibilities and feasibility of the project. Then in October 2001 a virtual team was formed to fully pursue bringing the idea to the next stage of development; Bathiche and Wilson were key members of the team.

Humble Beginnings on an IKEA Table

In early 2003, the new Consumer Products Group, led by David Kurlander, presented the idea to Bill Gates, Microsoft chairman, in a group review. Gates instantly liked the idea and encouraged the team to continue to develop their thinking. The virtual team expanded, and within a month, through constant discussion and brainstorming, the first humble prototype was born and nicknamed T1. The model was based on an IKEA table with a hole cut in the top and a sheet of architect vellum used as a diffuser. The evolution of Surface had begun. A variety of early applications were also built, including pinball, a photo browser and a video puzzle. As more applications were developed, the team saw the value of the surface computer beyond simply gaming and began to favor those applications that took advantage of the unique ability of Surface to recognize physical objects placed on the table. The team was also beginning to realize that surface computing could be applied to a number of different embodiments and form factors.

Over the next year, the team grew significantly, including the addition of Nigel Keam, initially software development lead and later architect for Surface, who was part of the development team eventually tasked with taking the product from prototype to a shipping product. Surface prototypes, functionality and applications were continually refined. More than 85 early prototypes were built for use by software developers, hardware developers and user researchers.

T1 prototype

One of the key attributes of Surface is object recognition and the ability of objects placed on the surface to trigger different types of digital responses, including the transfer of digital content. This feature went through numerous rounds of testing and refining. The team explored various tag formats of all shapes and sizes before landing on the domino tag (used today), an 8-bit, three-quarter-inch-square tag that is optimal thanks to its small size.
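The tag’s actual bit layout is not described here, so the following is a purely hypothetical sketch of the general idea: eight dot readings become an 8-bit value, which is looked up to choose a response. All names and tag values below are invented for illustration:

    # Hypothetical sketch of tag-driven object recognition. Assume the
    # vision system already gives us eight dark/light dot readings in a
    # fixed order; fold them into a single 8-bit tag value.
    def decode_tag(dots):
        value = 0
        for dot in dots:              # most significant bit first
            value = (value << 1) | (1 if dot else 0)
        return value

    # Invented mapping from tag values to digital responses.
    RESPONSES = {
        0x2A: "show wine details and suggested pairings",
        0x91: "start photo transfer from the tagged camera",
    }

    tag = decode_tag([0, 0, 1, 0, 1, 0, 1, 0])    # -> 0x2A
    print(RESPONSES.get(tag, "unknown object"))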

At the same time, the original plan of using a single camera in the vision system was proving to be unreliable. After exploring a variety of options, including camera placement and different camera lens sizes, it was decided that Surface would use five cameras that would more accurately detect natural movements and gestures from the surface.

Hardware Design

By late 2004, the software development platform of Surface was well established and attention turned to the form factor. A number of different experimental prototypes were built, including “the tub” model, which was encased in a rounded plastic shell; a desk-height model with a square top and cloth-covered sides; and even a bar-height model that could be used while standing. After extensive testing and user research, the final hardware design (seen today) was finalized in 2005. Also in 2005, Wilson and Bathiche introduced the concept of surface computing in a paper for Gates’ twice-yearly “Think Week,” a time Gates takes to evaluate new ideas and technologies for the company.

From Prototype to Product

The next phase of the development of Surface focused on continuing the journey from concept to product. Although much of what would later ship as Surface was determined, there was significant work to be done to develop a market-ready product that could be scaled to mass production. “So much work goes into turning a prototype into a product that can handle the strain and stress of everyday use,” Keam said. “For instance, when we developed the T1 prototype, it couldn’t be moved without having to recalibrate it. Now, obviously the product can easily be moved. To get Surface to where it is today, the code had to be rewritten from the ground up.”

In early 2006, Pete Thompson joined the group as general manager, tasked with driving end-to-end business and growing development and marketing. Under his leadership, the group has grown to more than 100 employees.

Today Surface has become the market-ready product once only envisioned by the group: a 30-inch display in a table-like form factor that’s easy for individuals or small groups to use collaboratively. The sleek, translucent surface lets people engage with Surface using touch, natural hand gestures and physical objects placed on the surface. Years in the making, Microsoft Surface is now poised to transform the way people shop, dine, entertain and live.

“Seeing Surface grow from a small germ of an idea to a working prototype and then to a full-fledged market-ready product has been an amazing journey,” Wilson said. “This is a radically different user-interface experience than anything Microsoft has done before, and it’s really a testament to the innovation that comes from marrying brilliance and creativity.”

Beyond Surface — Surface Computing Tomorrow

Although surface computing is a new experience for consumers, over time Microsoft believes there will be a whole range of surface computing devices and the technology will become pervasive in people’s lives in a variety of environments. As form factors continue to evolve, surface computing will be found in any number of environments — schools, businesses, homes — and in any number of form factors: part of the countertop, the wall or the refrigerator.

About Microsoft Surface Computing

Microsoft Surface Computing brings to life a whole new way to interact with information that engages the senses, improves collaboration and empowers consumers. By utilizing the best combination of connected software, services and hardware, Microsoft is at the forefront of developing surface computing products that push computing boundaries, deliver new experiences that break down barriers between users and technology, and provide new opportunities for companies to engage with people. More information can be found at http://www.surface.com.

The Human Touch

Microsoft Surface puts people in control of their experiences with technology, making everyday tasks entertaining, enjoyable and efficient. Imagine ordering a beverage during a meal with just the tap of a finger. Imagine quickly browsing through music and dragging favorite songs onto a personal playlist by moving a finger across the screen. Imagine creating and sending a personal postcard of vacation pictures instantly to friends and family, while still wearing flip-flops.

Surface also features the ability to recognize physical objects that have identification tags similar to bar codes. This means that when a customer simply sets a wine glass on the surface of a table, a restaurant could provide them with information about the wine they’re ordering, pictures of the vineyard it came from and suggested food pairings tailored to that evening’s menu. The experience could become completely immersive, letting users access information on the wine-growing region and even look at recommended hotels and plan a trip without leaving the table.

Surface computing at Microsoft is an outgrowth of a collaborative effort between the Microsoft Hardware and Microsoft Research teams, which were struck by the opportunity to create technology that would bridge the physical and virtual worlds. What started as a high-level concept grew into a prototype and evolved to today’s market-ready product that will transform the way people shop, dine, entertain and live. It’s a major advancement that moves beyond the traditional user interface to a more natural way of interacting with information. Surface computing, which Microsoft has been working on for a number of years, features four key attributes:

· Direct interaction. Users can actually “grab” digital information with their hands, interacting with content by touch and gesture, without the use of a mouse or keyboard.

· Multi-touch. Surface computing recognizes many points of contact simultaneously, not just from one finger as with a typical touch screen, but up to dozens of items at once (a minimal tracking sketch follows this list).

· Multi-user. The horizontal form factor makes it easy for several people to gather around surface computers together, providing a collaborative, face-to-face computing experience.

· Object recognition. Users can place physical objects on the surface to trigger different types of digital responses, including the transfer of digital content.
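To make “many points of contact” concrete, here is a minimal, hypothetical sketch of the per-frame bookkeeping a multi-touch surface must do; none of these names come from the Surface API:

    # Hypothetical per-frame contact tracking. Each frame the vision system
    # reports (x, y) positions; match them to last frame's contacts by
    # nearest distance so every touch keeps a stable ID while it moves.
    import itertools, math

    _ids = itertools.count(1)

    def track(previous, positions, max_jump=40.0):
        """previous: {id: (x, y)}; positions: list of (x, y); returns {id: (x, y)}."""
        current, unmatched = {}, dict(previous)
        for pos in positions:
            best = min(unmatched, key=lambda i: math.dist(unmatched[i], pos),
                       default=None)
            if best is not None and math.dist(unmatched[best], pos) <= max_jump:
                current[best] = pos          # same finger, moved a little
                del unmatched[best]
            else:
                current[next(_ids)] = pos    # a brand-new touch
        return current

    contacts = {}
    for frame in ([(10, 10)], [(12, 11), (200, 80)], [(14, 12), (201, 82)]):
        contacts = track(contacts, frame)
    print(contacts)   # two touches, each with a stable ID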