Monday, March 24, 2008

Location-based games

Games played on a mobile device using localization technology such as GPS are called location-based games. In other words: while it does not matter for a normal mobile game where exactly you are (you can play it anywhere, at any time), the player's coordinates and movement are core elements of a location-based game. The best-known example is the treasure hunt game Geocaching, which can be played on any mobile device with an integrated or external GPS receiver. External GPS receivers are generally connected via Bluetooth, and more and more mobile phones are expected to ship with integrated GPS.

Besides Geocaching, there are several other location-based games, most of which remain research prototypes rather than commercial successes.
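To make the role of coordinates concrete, here is a minimal sketch of the core check in a Geocaching-style game: computing the great-circle (haversine) distance between the player's GPS fix and a cache, and treating the cache as found once the player is within a small radius. The function names and the 10-meter radius are illustrative assumptions, not taken from any particular game:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (haversine formula)."""
    R = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def cache_found(player, cache, radius_m=10.0):
    """A cache counts as 'found' once the player is within radius_m meters of it."""
    return haversine_m(player[0], player[1], cache[0], cache[1]) <= radius_m
```

A game loop would call `cache_found` on every new GPS fix, which is exactly why the player's position, rather than button presses, drives the gameplay.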

Monday, March 17, 2008

Battery-powered mobile phones

Mobile phones commonly obtain power from batteries, which can be recharged from mains power, a USB port, or a cigarette lighter socket in a car. Formerly, the most common mobile phone batteries were nickel-metal-hydride, owing to their small size and weight. Lithium-ion batteries are sometimes used instead, as they are lighter and do not suffer the voltage depression that nickel-metal-hydride batteries do. Many mobile phone manufacturers have now switched to lithium-polymer batteries in place of the older lithium-ion type; the main advantages are even lower weight and the possibility of shaping the battery into something other than a strict cuboid. Mobile phone manufacturers have also been experimenting with alternative power sources, including solar cells.

Monday, March 10, 2008

Uses & Design of Supercomputers

Uses

Supercomputers are used for extremely calculation-intensive tasks such as weather forecasting, climate research (including research into global warming), molecular modeling (computing the structures and properties of chemical compounds, biological macromolecules, polymers, and crystals), physical simulations (such as simulation of airplanes in wind tunnels, simulation of the detonation of nuclear weapons, and research into nuclear fusion), cryptanalysis, and the like. Military and scientific agencies are important users.

Design

Supercomputers traditionally gained their speed over conventional computers through innovative designs that allow them to perform many tasks in parallel, as well as careful detail engineering. They tend to be specialized for certain types of computation, generally numerical calculations, and perform poorly at more general computing tasks. Their memory hierarchy is very carefully designed to ensure the processor is kept fed with data and instructions at all times; in fact, much of the performance difference between slower computers and supercomputers is due to the memory hierarchy design and componentry. Their I/O systems tend to be designed for high bandwidth, with latency less of an issue, because supercomputers are not used for transaction processing.

As with all highly parallel systems, Amdahl's law applies, and supercomputer designs devote huge effort to eliminating software serialization, and using hardware to accelerate the remaining bottlenecks.
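To see why eliminating serialization matters so much, Amdahl's law bounds the speedup on p processors at 1 / (s + (1 - s) / p), where s is the fraction of the work that must run serially. A minimal sketch (the 5% serial fraction below is an arbitrary example, not a figure from the text):

```python
def amdahl_speedup(serial_fraction, processors):
    """Upper bound on parallel speedup: 1 / (s + (1 - s) / p)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# Even a 5% serial fraction caps a 1024-processor machine below 20x speedup,
# while a fully parallel workload scales with the processor count.
print(round(amdahl_speedup(0.05, 1024), 2))  # ~19.64
print(amdahl_speedup(0.0, 8))                # 8.0
```

This is why supercomputer designers spend so much effort shrinking s: adding processors quickly stops helping once the serial fraction dominates.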

Software tools

Software tools for distributed processing include standard APIs such as MPI and PVM, as well as open-source software solutions such as Beowulf and openMosix, which facilitate the creation of a sort of "virtual supercomputer" from a collection of ordinary workstations or servers. Technologies like Rendezvous pave the way for creating ad hoc computer clusters. An example of this is the distributed rendering feature in Apple's Shake compositing software: computers running Shake simply need to be in proximity to each other, in networking terms, to automatically discover and use each other's resources. While no one has yet built an ad hoc computer cluster that rivals even yesteryear's supercomputers, the line between desktop, or even laptop, and supercomputer is beginning to blur, and is likely to continue blurring as built-in support for parallelism and distributed processing increases in mainstream desktop operating systems. An easy programming language for supercomputers remains an open research topic in computer science.
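As an illustration of the divide-and-collect pattern that APIs like MPI express across cluster nodes (and that Shake's distributed rendering applies to frames), here is a minimal single-machine sketch using Python's standard library. The function names and the frame-rendering task are invented stand-ins; a real cluster would run workers as processes on separate machines, and a thread pool is used here only to keep the sketch self-contained:

```python
from multiprocessing.pool import ThreadPool

def render_frame(frame_no):
    # Stand-in for an expensive job such as rendering one frame;
    # plain arithmetic keeps the example self-contained.
    return sum(i * i for i in range(frame_no * 1000))

def render_all(frames, workers=4):
    # Scatter the frame list across a pool of workers and gather the
    # results back in order -- the same divide-and-collect shape that
    # MPI programs express with scatter/gather across cluster nodes.
    with ThreadPool(processes=workers) as pool:
        return pool.map(render_frame, frames)
```

`pool.map` preserves the input order when gathering, so `render_all([1, 2, 3])` returns the frames in sequence regardless of which worker finished first.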

Tuesday, March 04, 2008

Physics in Natural Science

Physics is the study of the fundamental constituents of the universe, the forces they exert on one another, and the results produced by these interactions. Physics is generally regarded as the fundamental science, because all other natural sciences use and obey the principles and laws it sets down. Physics relies heavily on mathematics as the logical framework for the formulation and quantification of its principles.

The study of the principles of the universe has a long history and derives largely from direct observation and experimentation. The formulation of theories about the governing laws of the universe has been central to the study of physics from very early on, with philosophy gradually yielding to systematic, quantitative experimental testing and observation as the source of verification. Key historical developments in physics include Isaac Newton's theory of universal gravitation and classical mechanics, an understanding of electricity and its relation to magnetism, Einstein's theories of special and general relativity, the development of thermodynamics, and the quantum mechanical model of atomic and subatomic physics.

The field of physics is extremely broad, encompassing studies as diverse as quantum mechanics and theoretical physics on one hand and applied physics and optics on the other. Modern physics is becoming increasingly specialized: researchers tend to focus on a particular area rather than being "universalists" like Albert Einstein and Lev Landau, who worked in multiple areas.