
Friday, May 13, 2011

Cloud computing Comparisons

Cloud computing shares characteristics with:
Autonomic computing – "computer systems capable of self-management."
Client–server model – client–server computing refers broadly to any distributed application that distinguishes between service providers (servers) and service requesters (clients).
Grid computing – "a form of distributed computing and parallel computing, whereby a 'super and virtual computer' is composed of a cluster of networked, loosely coupled computers acting in concert to perform very large tasks."
Mainframe computer – powerful computers used mainly by large organizations for critical applications, typically bulk data processing such as census, industry and consumer statistics, enterprise resource planning, and financial transaction processing.
Utility computing – the "packaging of computing resources, such as computation and storage, as a metered service similar to a traditional public utility, such as electricity."
Peer-to-peer – distributed architecture without the need for central coordination, with participants acting as both suppliers and consumers of resources (in contrast to the traditional client–server model).
Service-oriented computing – cloud computing provides services related to computing while, in a reciprocal manner, service-oriented computing comprises the computing techniques behind offerings such as software-as-a-service.

Cloud computing Open source

Open source software has provided the foundation for many cloud computing implementations, one prominent example being the Hadoop framework. In November 2007, the Free Software Foundation released the Affero General Public License, a license based on GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.
For many enterprises, cloud computing is becoming a reality in their IT infrastructures today. The technologies used by today's cloud environments, public and private, have been heavily based on open source software, which offers:

Robust application frameworks
Rapid development
Standards support
Vendor neutrality
Avoidance of vendor lock-in
To advance the development of open source cloud computing, Red Hat presented its second online Open Source Cloud Computing Forum on February 10, 2010, hosted by Red Hat CTO Brian Stevens.
Backed by Rackspace, NASA, Dell, Citrix, Cisco, Canonical and over 50 other organizations, OpenStack has grown into a global software community of developers, technologists, researchers and corporations collaborating on a standard, massively scalable open source cloud operating system. Its stated mission is to enable any organization to create and offer cloud computing services running on standard hardware.

Cloud computing Architecture

Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over application programming interfaces, usually web services and 3-tier architecture. This resembles the Unix philosophy of having multiple programs each doing one thing well and working together over universal interfaces. Complexity is controlled and the resulting systems are more manageable than their monolithic counterparts.
The two most significant components of cloud computing architecture are known as the front end and the back end. The front end is the part seen by the client, i.e. the computer user. This includes the client’s network (or computer) and the applications used to access the cloud via a user interface such as a web browser. The back end of the cloud computing architecture is the ‘cloud’ itself, comprising various computers, servers and data storage devices.
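The front end / back end split described above can be sketched in a few lines: the "back end" is a small HTTP service hiding its internals behind a web API, and the "front end" is any client that reaches it over that universal interface. This is a minimal, self-contained illustration, not any particular provider's architecture; the endpoint and payload names are hypothetical.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class BackEnd(BaseHTTPRequestHandler):
    """Stand-in for the 'cloud' side: servers and storage behind an API."""
    def do_GET(self):
        # The back end hides compute/storage details; clients see only JSON.
        body = json.dumps({"service": "storage", "status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), BackEnd)  # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# "Front end": the client's side, accessing the cloud through the interface.
with urlopen(f"http://127.0.0.1:{server.server_port}/status") as resp:
    reply = json.loads(resp.read())
server.shutdown()
print(reply)
```

The front end never learns which machine served the request, which is exactly the abstraction the cloud symbol denotes.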

How Cloud computing works

Cloud computing utilizes the network as a means to connect the user to resources that are based in the 'cloud', as opposed to actually possessing them. The 'cloud' may be accessed via the Internet or a company network, or both. Cloud services may be designed to work equally well with Linux, Mac and Windows platforms. With smartphones and tablets on the rise, cloud services have changed to allow access from any device connected to the Internet, allowing mobile workers access on-the-go, as in telecommuting, and extending the reach of business services provided by outsourcing.
The service provider, such as Google, may pool the processing power of multiple remote computers in "the cloud" to achieve a task such as backing up large amounts of data, word processing, or computationally intensive work. These tasks would normally be difficult, time consuming, or expensive for an individual user or a small company to accomplish, especially with limited computing resources and funds. With cloud computing, clients require only a simple computer, such as a netbook (a class of device created with cloud computing in mind) or even a smartphone, with a connection to the Internet or a company network, in order to make requests to and receive data from the cloud; hence the term "software as a service" (SaaS). Computation and storage are divided among the remote computers in order to handle large volumes of both, so the client need not purchase expensive hardware or software to handle the task. The outcome of the processing task is returned to the client over the network, at a speed that depends on the connection.
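The idea of dividing computation among remote machines can be sketched with a worker pool: a large job is split into chunks and farmed out, with the pool of workers standing in for remote computers in the cloud. This is a toy model under that assumption, not a real cloud API; the chunk size and worker count are illustrative.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for work done on one remote server (e.g. indexing, encoding).
    return sum(x * x for x in chunk)

data = list(range(1_000))                       # the "large" task
chunks = [data[i:i + 100] for i in range(0, len(data), 100)]

with ThreadPoolExecutor(max_workers=4) as pool:  # 4 "remote computers"
    partials = list(pool.map(process_chunk, chunks))

result = sum(partials)  # client only combines results returned over the network
print(result)
```

The client gets the same answer it would have computed locally, without owning the hardware that did the heavy lifting.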

Cloud computing key characteristics

Agility improves with users' ability to rapidly and inexpensively re-provision technological infrastructure resources.
Application Programming Interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs.
Cost is claimed to be greatly reduced and in a public cloud delivery model capital expenditure is converted to operational expenditure. This ostensibly lowers barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).
Device and location independence enables users to access systems using a web browser regardless of their location or the device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third party) and accessed via the Internet, users can connect from anywhere.
Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Peak-load capacity increases (users need not engineer for highest possible load-levels)
Utilization and efficiency improvements for systems that are often only 10–20% utilized.
Reliability is improved if multiple redundant sites are used, which makes well designed cloud computing suitable for business continuity and disaster recovery.
Scalability via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time, without users having to engineer for peak loads.
Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than under traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or a greater number of devices, and in multi-tenant systems shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer. They are easier to support and to improve, as the changes reach the clients instantly.
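The scalability characteristic above — on-demand provisioning instead of engineering for peak load — can be modeled as a simple policy that adjusts instance count to observed demand. This is a hedged sketch; the capacity figure and the min/max bounds are invented for illustration, not drawn from any real provider.

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance=100,
                     min_instances=1, max_instances=20):
    """Return how many instances to run for the observed load,
    clamped between a floor (always-on baseline) and a ceiling (budget cap)."""
    wanted = math.ceil(requests_per_sec / capacity_per_instance)
    return max(min_instances, min(max_instances, wanted))

for load in (0, 50, 250, 5000):
    print(load, "req/s ->", instances_needed(load), "instances")
```

The user pays for capacity that tracks the load curve rather than for hardware sized to the worst-case peak.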

Cloud computing history

"Cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.
Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture, autonomic and utility computing. Details are abstracted from end-users, who no longer have need for expertise in, or control over, the technology infrastructure "in the cloud" that supports them.
The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organized as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government and community forms, were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility.
The actual term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s primarily offered dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilization as they saw fit, they were able to utilize their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider, and that which was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure. The first scholarly use of the term “cloud computing” was in a 1997 lecture by Ramnath Chellappa.
After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernizing its data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Services (AWS) on a utility computing basis in 2006. The term "cloud computing" was first exposed to public media by Google CEO Eric Schmidt at SES San Jose 2006. It was reported in 2011 that Amazon had thousands of corporate customers, from large ones like Pfizer and Netflix to start-ups, among them many companies built on Amazon's web services, including Foursquare, a location-based social networking site; Quora, a question-and-answer service; Reddit, a news-sharing site; and BigDoor, a maker of game tools for Web publishers.
In 2007, Google, IBM and a number of universities embarked on a large-scale cloud computing research project. In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. Also in early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing QoS guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "organisations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to cloud computing will result in dramatic growth in IT products in some areas and significant reductions in other areas."

Cloud engineering and storage

Cloud engineering is the application of a systematic, disciplined, quantifiable, and interdisciplinary approach to the ideation, conceptualization, development, operation, and maintenance of cloud computing, as well as the study and applied research of the approach, i.e., the application of engineering to cloud. It is a maturing and evolving discipline to facilitate the adoption, strategization, operationalization, industrialization, standardization, productization, commoditization, and governance of cloud solutions, leading towards a cloud ecosystem. Cloud engineering is also known as cloud service engineering.

Cloud storage
Cloud storage is a model of networked computer data storage where data is stored on multiple virtual servers, generally hosted by third parties, rather than on dedicated servers. Hosting companies operate large data centers; people who require their data to be hosted buy or lease storage capacity from them and use it for their storage needs. The data center operators, in the background, virtualize the resources according to the requirements of the customer and expose them as virtual servers, which the customers can themselves manage. Physically, the resource may span multiple servers.
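The point that a stored resource may physically span multiple servers can be sketched as a toy object store that shards objects across back-end "servers" by key hash, while the customer sees one logical store. The class and method names are hypothetical, and real systems add replication and rebalancing; this only illustrates the placement idea.

```python
import hashlib

class ShardedStore:
    """A logical store whose objects physically live on several 'servers'."""
    def __init__(self, n_servers=3):
        self.servers = [dict() for _ in range(n_servers)]  # stand-in servers

    def _shard(self, key):
        # Deterministic placement: the same key always maps to the same server.
        digest = hashlib.sha256(key.encode()).hexdigest()
        return int(digest, 16) % len(self.servers)

    def put(self, key, value):
        self.servers[self._shard(key)][key] = value

    def get(self, key):
        return self.servers[self._shard(key)].get(key)

store = ShardedStore()
for name in ("report.pdf", "photo.jpg", "backup.tar"):
    store.put(name, f"contents of {name}")

print(store.get("photo.jpg"))
```

From the customer's side there is one `put`/`get` interface; which physical server holds a given object is the operator's concern.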

Private Cloud computing

Douglas Parkhill first described the concept of a "private computer utility" in his 1966 book The Challenge of the Computer Utility. The idea was based upon direct comparison with other industries (e.g. the electricity industry) and the extensive use of hybrid supply models to balance and mitigate risks.
"Private cloud" and "internal cloud" have been described as neologisms, but the concepts themselves pre-date the term cloud by 40 years. Even within modern utility industries, hybrid models still exist despite the formation of reasonably well-functioning markets and the ability to combine multiple providers.
Some vendors have used the terms to describe offerings that emulate cloud computing on private networks. These (typically virtualization automation) products offer the ability to host applications or virtual machines in a company's own set of hosts. These provide the benefits of utility computing – shared hardware costs, the ability to recover from failure, and the ability to scale up or down depending upon demand.
Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from lower up-front capital costs and less hands-on management, essentially "lacking the economic model that makes cloud computing such an intriguing concept." Enterprise IT organizations use their own private cloud(s) for mission-critical and other operational systems to protect critical infrastructures. Therefore, for all intents and purposes, "private clouds" are not an implementation of cloud computing at all, but are in fact an implementation of a technology subset: the basic concept of virtualized computing.

Cloud computing privacy

The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program that, working with AT&T and Verizon, recorded over 10 million phone calls between American citizens cause uncertainty among privacy advocates, as do the greater powers such arrangements give telecommunication companies to monitor user activity. While there have been efforts (such as US-EU Safe Harbor) to "harmonize" the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones."

In order to obtain compliance with regulations including FISMA, HIPAA and SOX in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes which are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA" and Rackspace Cloud are able to claim PCI compliance. Customers in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal data.
Many providers also obtain SAS 70 Type II certification (e.g. Amazon, Salesforce.com, Google and Microsoft), but this has been criticised on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely. Providers typically make this information available on request, under non-disclosure agreement.

Cloud computing research

A number of universities, vendors and government organizations are investing in research around the topic of cloud computing. Academic institutions include the University of Melbourne (Australia), Georgia Tech, Yale, Wayne State, Virginia Tech, University of Wisconsin–Madison, Carnegie Mellon, MIT, Indiana University, University of Massachusetts, University of Maryland, IIT Bombay, North Carolina State University, Purdue University, University of California, University of Washington, University of Virginia, University of Utah, and University of Minnesota, among others.
Joint government, academic and vendor collaborative research projects include the IBM/Google Academic Cloud Computing Initiative (ACCI). In October 2007 IBM and Google announced the multi-university project, designed to enhance students' technical knowledge to address the challenges of cloud computing. In April 2009, the National Science Foundation joined the ACCI and awarded approximately $5 million in grants to 14 academic institutions.
In July 2008, HP, Intel Corporation and Yahoo! announced the creation of a global, multi-data center, open source test bed, called Open Cirrus, designed to encourage research into all aspects of cloud computing, service and data center management. Open Cirrus partners include the NSF, the University of Illinois (UIUC), Karlsruhe Institute of Technology, the Infocomm Development Authority (IDA) of Singapore, the Electronics and Telecommunications Research Institute (ETRI) in Korea, the Malaysian Institute for Microelectronic Systems (MIMOS), and the Institute for System Programming at the Russian Academy of Sciences (ISPRAS). In September 2010, more researchers joined the HP/Intel/Yahoo Open Cirrus project for cloud computing research: the China Mobile Research Institute (CMRI), Spain's Supercomputing Center of Galicia (CESGA by its Spanish acronym), Georgia Tech's Center for Experimental Research in Computer Systems (CERCS) and China Telecom.
In July 2010, HP Labs India announced a new cloud-based technology designed to simplify taking content and making it mobile-enabled, even from low-end devices. Called SiteonMobile, the new technology is designed for emerging markets where people are more likely to access the internet via mobile phones rather than computers. In November 2010, HP formally opened its Government Cloud Theatre, located at the HP Labs site in Bristol, England. The demonstration facility highlights high-security, highly flexible cloud computing based on intellectual property developed at HP Labs. The aim of the facility is to lessen fears about the security of the cloud. HP Labs Bristol is HP’s second-largest central research location and currently is responsible for researching cloud computing and security.
The IEEE Technical Committee on Services Computing in the IEEE Computer Society sponsors the IEEE International Conference on Cloud Computing (CLOUD). CLOUD 2010 was held on July 5–10, 2010 in Miami, Florida.
On March 23, 2011, Google, Microsoft, HP, Yahoo, Verizon, Deutsche Telekom and 17 other companies formed a nonprofit organization called Open Networking Foundation, focused on providing support for a new cloud initiative called Software-Defined Networking. The initiative is meant to speed innovation through simple software changes in telecommunications networks, wireless networks, data centers and other networking areas.

Security in Cloud computing

The relative security of cloud computing services is a contentious issue that may be delaying their adoption. The issues barring adoption stem in large part from private- and public-sector unease surrounding the external management of security-based services. It is the very nature of cloud computing services, private or public, that they promote external management of the services provided. This gives cloud computing service providers a strong incentive to prioritize building and maintaining strong management of secure services.
Organizations have been formed to provide standards for a better future in cloud computing services. One organization in particular, the Cloud Security Alliance, is a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing.

Availability and performance
In addition to concerns about security, businesses are also worried about acceptable levels of availability and performance of applications hosted in the cloud.
There are also concerns about a cloud provider shutting down for financial or legal reasons, which has happened in a number of cases.

Sustainability and siting
Although cloud computing is often assumed to be a form of "green computing", there is as yet no published study to substantiate this assumption. Siting the servers affects the environmental effects of cloud computing: in areas where the climate favors natural cooling and renewable electricity is readily available, the environmental effects will be more moderate. Thus countries with favorable conditions, such as Finland, Sweden and Switzerland, are trying to attract cloud computing data centers.
SmartBay, a marine research infrastructure of sensors and computational technology, is being developed using cloud computing, an emerging approach to shared infrastructure in which large pools of systems are linked together to provide IT services.

Legality of Cloud computing

Mid March 2007, Dell applied to trademark the term "cloud computing" (U.S. Trademark 77,139,082) in the United States. The "Notice of Allowance" the company received in July 2008 was canceled in August, resulting in a formal rejection of the trademark application less than a week later. Since 2007, the number of trademark filings covering cloud computing brands, goods and services has increased at an almost exponential rate. As companies sought to better position themselves for cloud computing branding and marketing efforts, cloud computing trademark filings increased by 483% between 2008 and 2009. In 2009, 116 cloud computing trademarks were filed, and trademark analysts predict that over 500 such marks could be filed during 2010.
Other legal cases may shape the use of cloud computing by the public sector. On October 29, 2010, Google filed a lawsuit against the U.S. Department of the Interior, which had opened a bid for software that required bidders to use Microsoft's Business Productivity Online Suite. Google sued, calling the requirement "unduly restrictive of competition." Scholars have pointed out that, beginning in 2005, the prevalence of open standards and open source may have an impact on the way that public entities choose to select vendors.

Weather of Houston

Houston's climate is classified as humid subtropical (Cfa in Köppen climate classification system). Spring supercell thunderstorms sometimes bring tornadoes to the area. Prevailing winds are from the south and southeast during most of the year, bringing heat across the continent from the deserts of Mexico and moisture from the Gulf of Mexico.
During the summer months, it is common for the temperature to reach over 90 °F (32 °C), with an average of 99 days per year above 90 °F (32 °C). However, the humidity results in a heat index higher than the actual temperature. Summer mornings average over 90 percent relative humidity, dropping to approximately 60 percent in the afternoon. Winds are often light in the summer and offer little relief, except near the immediate coast. To cope with the heat, people use air conditioning in nearly every vehicle and building in the city; in 1980 Houston was described as the "most air-conditioned place on earth." Scattered afternoon thunderstorms are common in the summer. The hottest temperature ever recorded in Houston was 109 °F (43 °C) on September 4, 2000.
Winters in Houston are fairly temperate. The average high in January, the coldest month, is 62 °F (17 °C), while the average low is 39 °F (4 °C). Snowfall is generally rare. Recent snow events in Houston include a storm on December 24, 2004, when one inch (2.5 cm) fell, and another snowfall on December 10, 2008. More recently, on December 4, 2009, an inch of snow fell in the city; this was the earliest snowfall ever recorded in Houston, marked the first time in recorded history that snowfall had occurred in two consecutive years, and was the third accumulating snowfall of the 2000–2010 decade. The coldest temperature ever recorded in Houston was 5 °F (−15 °C) on January 23, 1940. Houston receives a high amount of rainfall annually, averaging about 54 inches a year. These rains tend to cause floods over portions of the city.
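The paired °F/°C figures quoted above all follow the standard conversion, which can be checked directly:

```python
def f_to_c(f):
    """Convert a temperature from degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# Temperatures quoted in this section, rounded as in the text.
for f in (90, 109, 62, 39, 5):
    print(f"{f} °F = {f_to_c(f):.0f} °C")
```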
Houston has excessive ozone levels and is ranked among the most ozone-polluted cities in the United States. Ground-level ozone, or smog, is Houston’s predominant air pollution problem, with the American Lung Association rating the metropolitan area's ozone level as the 6th worst in the United States in 2006. The industries located along the ship channel are a major cause of the city's air pollution.

Lindsay Lohan Sentenced to 120 Days

Lindsay Lohan was on Wednesday, May 11 sentenced to three years of probation after pleading no contest to misdemeanor theft in a missing necklace case, while a judge also ordered that the actress report to a women's jail by June 17 to learn how she is to serve a sentence that was given for violating her probation in a DUI case.

Her next court date is a July 21 progress report. The starlet was sentenced to 120 days in jail over Easter weekend, and after nearly five hours in custody following her hearing, Lohan posted bail and has been a free woman since. She is expected to appeal the judge’s decision of jail time.

TMZ reported Lohan was “blindsided” by the jail sentence she received, as many had speculated that if her charges were reduced from felony theft to misdemeanor charges, she would be able to escape any time behind bars. For some reason, that didn’t happen and she was hit with three months of hard time and 480 hours of community service.

Deputy City Attorney Melanie Chavira asked for substance abuse counseling for Lohan, which the judge denied. Judge Sautner said drugs and alcohol are not the root of Lohan's legal troubles, but "she's got other problems for which she self-medicates."
Lohan's four years in and out of court -- and sometimes jail -- started with two drunken driving arrests in 2007. Since then, she's spent more than eight months in substance abuse rehab.

Cloud seeding

Cloud seeding, a form of weather modification, is the attempt to change the amount or type of precipitation that falls from clouds by dispersing substances into the air that serve as cloud condensation or ice nuclei, which alter the microphysical processes within the cloud. The usual intent is to increase precipitation (rain or snow), but hail suppression and fog suppression (the latter widely practiced at airports) are also common.
Terpenes are released by trees more actively during warmer weather, acting as a natural form of cloud seeding. The clouds reflect sunlight, allowing the forest to regulate its temperature.

With an NFPA 704 rating of Blue 2, silver iodide can cause temporary incapacitation or possible residual injury to humans and mammals with intense or continued but not chronic exposure. However, there have been several detailed ecological studies that showed negligible environmental and health impacts. The toxicity of silver and silver compounds (from silver iodide) was shown to be of low order in some studies. These findings likely result from the minute amounts of silver generated by cloud seeding, which are 100 times less than industry emissions into the atmosphere in many parts of the world, or individual exposure from tooth fillings.
Accumulations in the soil, vegetation, and surface runoff have not been large enough to measure above natural background. A 1995 environmental assessment in the Sierra Nevada of California and a 2004 independent panel of experts (an overview only is presented in the executive summary of the research) in Australia confirmed these earlier findings.
Cloud seeding over Kosciuszko National Park, a Biosphere Reserve, is problematic in that several rapid changes to environmental legislation were made to enable the "trial." Environmentalists are concerned about the uptake of elemental silver in a highly sensitive environment affecting the pygmy possum, among other species, as well as recent high-level algal blooms in once-pristine glacial lakes. The ABC program Earthbeat on 17 July 2004 heard that not every cloud has a silver lining, with concerns raised for the health of the pygmy possum. Research 50 years earlier, and analysis by the former Snowy Mountains Authority, led to the cessation of that cloud seeding program in the 1950s with non-definitive results. Formerly, cloud seeding was rejected in Australia on environmental grounds because of concerns about the pygmy possum, a protected species. Since silver iodide and not elemental silver is the cloud seeding material, the claims of negative environmental impact are disputed by peer-reviewed research, as summarized by the International Weather Modification Association.

Cloud seeding In Australia

In Australia, CSIRO’s activities in Tasmania in the 1960s were successful. Seeding over the Hydro-Electricity Commission catchment area on the Central Plateau achieved rainfall increases as high as 30% in autumn. The Tasmanian experiments were so successful that the Commission has regularly undertaken seeding ever since in mountainous parts of the state.
In 2004, Snowy Hydro Limited began a trial of cloud seeding to assess the feasibility of increasing snow precipitation in the Snowy Mountains in Australia. The test period, originally scheduled to end in 2009, was later extended to 2014. The New South Wales (NSW) Natural Resources Commission, responsible for supervising the cloud seeding operations, believes that the trial may have difficulty establishing statistically whether cloud seeding operations are increasing snowfall. This project was discussed at a summit in Narrabri, NSW on 1 December 2006. The summit met with the intention of outlining a proposal for a 5 year trial, focussing on Northern NSW.
The various implications of such a widespread trial were discussed, drawing on the combined knowledge of several worldwide experts, including representatives from the Tasmanian Hydro Cloud Seeding Project; however, the discussion did not reference the former cloud seeding experiments by the then Snowy Mountains Authority, which had rejected weather modification. The trial required changes to NSW environmental legislation in order to facilitate placement of the cloud seeding apparatus. The modern experiment is not supported for the Australian Alps.
In December 2006, the Queensland government of Australia announced A$7.6 million in funding for "warm cloud" seeding research to be conducted jointly by the Australian Bureau of Meteorology and the United States National Center for Atmospheric Research. It is hoped that the outcomes of the study will ease continuing drought conditions in the state's south-east region.

Cloud seeding in China

The largest cloud seeding system in the world is that of the People's Republic of China, which believes that it increases the amount of rain over several increasingly arid regions, including its capital city, Beijing, by firing silver iodide rockets into the sky where rain is desired. There is even political strife caused by neighboring regions that accuse each other of "stealing rain" using cloud seeding. About 24 countries currently practice weather modification operationally. China used cloud seeding in Beijing just before the 2008 Olympic Games in order to clear the air of pollution, though the Chinese claims are disputed. In February 2009, China also blasted iodide sticks over Beijing to artificially induce snowfall after four months of drought, and over other areas of northern China to increase snowfall. The snowfall in Beijing lasted for approximately three days and led to the closure of 12 main roads around the city. At the end of October 2009 Beijing claimed it had its earliest snowfall since 1987 due to cloud seeding.
In Southeast Asia, open burning produces haze that pollutes the regional environment. Cloud seeding has been used to improve the air quality by encouraging rainfall. In India, cloud seeding operations were conducted in 2003 and 2004 by the U.S.-based Weather Modification Inc. in the state of Maharashtra. In 2008, there were plans for 12 districts of the state of Andhra Pradesh.

Cloud seeding in Europe

Cloud seeding was begun in France during the 1950s with the intent of reducing hail damage to crops. The ANELFA project consists of local agencies acting within a non-profit organization. A similar project in Spain is managed by the Consorcio por la Lucha Antigranizo de Aragon. The success of the French program was supported by insurance data; that of the Spanish program by studies conducted by the Spanish Agricultural Ministry.
Soviet military pilots seeded clouds over the Byelorussian SSR after the Chernobyl disaster to remove radioactive particles from clouds heading toward Moscow. At the July 2006 G8 Summit, President Putin commented that air force jets had been deployed to seed incoming clouds so they rained over Finland; rain drenched the summit anyway. In Moscow, the Russian Air Force tried seeding clouds with bags of cement on June 17, 2008; one of the bags did not pulverize and went through the roof of a house. In October 2009, the Mayor of Moscow promised a "winter without snow" for the city after revealing efforts by the Russian Air Force to seed the clouds upwind from Moscow throughout the winter.

Cloud seeding in the United States

In the United States, cloud seeding is used to increase precipitation in areas experiencing drought, to reduce the size of hailstones that form in thunderstorms, and to reduce the amount of fog in and around airports. Cloud seeding is also occasionally used by major ski resorts to induce snowfall. Eleven western states and one Canadian province (Alberta) have ongoing operational weather modification programs. In January 2006, an $8.8 million cloud seeding project began in Wyoming to examine the effects of cloud seeding on snowfall over Wyoming's Medicine Bow, Sierra Madre, and Wind River mountain ranges.
A number of commercial companies, such as Aero Systems Incorporated, Atmospherics Incorporated, North American Weather Consultants, Weather Modification Incorporated, Weather Enhancement Technologies International, and Seeding Operations and Atmospheric Research (SOAR), offer weather modification services centered on cloud seeding. The USAF proposed its use on the battlefield in 1996, although the U.S. had signed an international treaty in 1978 banning the use of weather modification for hostile purposes.
During the 1960s, Irving P. Krick & Associates operated a successful cloud seeding operation in the area around Calgary, Alberta. It utilized both aircraft and ground-based generators that pumped silver iodide into the atmosphere in an attempt to reduce the threat of hail damage. Ralph Langeman, Lynn Garrison, and Stan McLeod, all ex-members of the RCAF's 403 Squadron attending the University of Alberta, spent their summers flying hail suppression. A number of surplus Harvard aircraft were fitted with racks under each wing containing 32 railroad fusees impregnated with silver iodide, which could be ignited individually or all at once, depending on the threat. In coordination with ground units, the aircraft would lay a plume of silver iodide in front of approaching cumulonimbus clouds with noticeable effect: large, active storms were reduced to nothing, and heavy hail storms were reduced in intensity. The program was funded by farmer contributions and government grants.

History of cloud seeding

Vincent Schaefer (1906–1993) discovered the principle of cloud seeding in July 1946 through a series of serendipitous events. Following ideas generated between himself and Nobel laureate Irving Langmuir while climbing Mt. Washington in New Hampshire, Schaefer, Langmuir's research associate, created a way of experimenting with supercooled clouds using a deep-freeze unit, trying potential agents to stimulate ice crystal growth (salt, talcum powder, soils, dust and various chemical agents) with minor effect. Then, on the hot and humid day of July 14, 1946, he wanted to try a few experiments at General Electric's Schenectady Research Lab. He was dismayed to find that the deep freezer was not cold enough to produce a "cloud" using his breath. He decided to move the process along by adding a chunk of dry ice to lower the temperature of his experimental chamber. To his astonishment, as soon as he breathed into the deep freezer, a bluish haze was noted, followed by an eye-popping display of millions of microscopic ice crystals reflecting the strong light rays from the lamp illuminating a cross-section of the chamber. He instantly realized that he had discovered a way to change supercooled water into ice crystals. The experiment was easily replicated, and he explored the temperature gradient to establish the −40 °C limit for liquid water.
Within the month, Schaefer's colleague, the noted atmospheric scientist Dr. Bernard Vonnegut (brother of novelist Kurt Vonnegut), discovered another method for "seeding" supercooled cloud water. Vonnegut made his discovery at his desk, looking up information in a basic chemistry text and then tinkering with silver and iodide chemicals to produce silver iodide. Together with Professor Henry Chessin, a crystallographer at SUNY Albany, Vonnegut co-authored a publication in Science and received a patent in 1975. Both methods were adopted for use in cloud seeding during 1946, while both men were working for the General Electric Corporation in the state of New York. Schaefer's method altered a cloud's heat budget; Vonnegut's altered the formative crystal structure, exploiting a good match in lattice constant between the two types of crystal. (The crystallography of ice later played a role in Kurt Vonnegut's novel Cat's Cradle.) The first attempt to modify natural clouds in the field through "cloud seeding" took place on 13 November 1946, during a flight that began in upstate New York. Schaefer was able to cause snow to fall near Mount Greylock in western Massachusetts after he dumped six pounds of dry ice into the target cloud from a plane following a 60-mile easterly chase from the Schenectady County Airport.
Dry ice and silver iodide agents are effective in changing the physical chemistry of supercooled clouds, and are thus useful in augmenting winter snowfall over mountains and, under certain conditions, in suppressing lightning and hail. While not a new technique, hygroscopic seeding for enhancement of rainfall in warm clouds is enjoying a revival, based on some positive indications from research in South Africa, Mexico, and elsewhere. The hygroscopic material most commonly used is salt. It is postulated that hygroscopic seeding causes the droplet size spectrum in clouds to become more maritime (bigger drops) and less continental, stimulating rainfall through coalescence. From March 1967 until July 1972, the U.S. military's Operation Popeye cloud-seeded silver iodide to extend the monsoon season over North Vietnam, specifically the Ho Chi Minh Trail. The operation resulted in the targeted areas seeing an extension of the monsoon period by an average of 30 to 45 days. The 54th Weather Reconnaissance Squadron carried out the operation to "make mud, not war."
In 1969 at the Woodstock Festival, various people claimed to have witnessed clouds being seeded by the U.S. military. This was said to be the cause of the rain which lasted throughout most of the festival.

Effects of cloud seeding

Referring to the 1903, 1915, 1919, 1944 and 1947 weather modification experiments, the Australian Federation of Meteorology discounted "rain making." By the 1950s the CSIRO Division of Radiophysics had switched to investigating the physics of clouds, hoping by 1957 to better understand these processes. By the 1960s the dreams of weather making had faded, only to be re-ignited after the corporatisation of the Snowy Mountains Scheme in order to achieve "above target" water, which would provide enhanced energy generation and profits to the public agencies that are its principal owners. Cloud seeding has been shown to be effective in altering cloud structure and size and in converting supercooled liquid water to ice particles. The amount of precipitation due to seeding is difficult to quantify, and cloud seeding may also suppress precipitation.
A key challenge is discerning how much precipitation would have occurred had clouds not been seeded. Overall, professional organizations generally expect winter cloud seeding over mountains to produce snow, and there is statistical evidence for seasonal precipitation increases of about 10% with winter seeding.
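The attribution problem can be illustrated with a toy randomized-seeding analysis. All numbers below are synthetic (a made-up 10% seeding effect against large natural variability); the permutation test is a standard technique for this kind of question, not a description of any particular trial's methodology.

```python
import random

# Synthetic event precipitation totals (mm). For illustration, seeded
# events are simply the unseeded values scaled by 10%.
random.seed(42)
unseeded = [random.gauss(100, 25) for _ in range(40)]
seeded = [x * 1.10 for x in unseeded]  # assumed 10% boost (hypothetical)

def mean(xs):
    return sum(xs) / len(xs)

observed_diff = mean(seeded) - mean(unseeded)

# Permutation test: shuffle the seeded/unseeded labels many times and ask
# how often chance alone produces a difference at least this large.
pooled = seeded + unseeded
hits = 0
trials = 10_000
for _ in range(trials):
    random.shuffle(pooled)
    if mean(pooled[:40]) - mean(pooled[40:]) >= observed_diff:
        hits += 1
p_value = hits / trials

print(f"observed difference: {observed_diff:.1f} mm, one-sided p = {p_value:.3f}")
```

With a 10% effect buried in 25% event-to-event variability, the resulting p-value is typically borderline, which is exactly why multi-year randomized trials are needed to establish seeding effects statistically.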
The US government through its National Center for Atmospheric Research has analyzed seeded and unseeded clouds to understand the differences between them, and has conducted seeding research in other countries.
Clouds were seeded during the 2008 Summer Olympics in Beijing using rockets so that there would be no rain during the opening and closing ceremonies, although others dispute the claims of success.

Cloud seeding systems

The most common chemicals used for cloud seeding are silver iodide and dry ice (frozen carbon dioxide). The expansion of liquid propane into a gas has also been used and can produce ice crystals at higher temperatures than silver iodide. The use of hygroscopic materials, such as salt, is increasing in popularity because of some promising research results.
Seeding of clouds requires that they contain supercooled liquid water—that is, liquid water colder than zero degrees Celsius. Introduction of a substance such as silver iodide, which has a crystalline structure similar to that of ice, will induce freezing nucleation. Dry ice or propane expansion cools the air to such an extent that ice crystals can nucleate spontaneously from the vapor phase. Unlike seeding with silver iodide, this spontaneous nucleation does not require any existing droplets or particles because it produces extremely high vapor supersaturations near the seeding substance. However, the existing droplets are needed for the ice crystals to grow into large enough particles to precipitate out.
In mid-latitude clouds, the usual seeding strategy has been predicated upon the fact that the equilibrium vapor pressure is lower over ice than over water. When ice particles form in supercooled clouds, this fact allows the ice particles to grow at the expense of liquid droplets. If there is sufficient growth, the particles become heavy enough to fall as snow (or, if melting occurs, rain) from clouds that otherwise would produce no precipitation. This process is known as "static" seeding.
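The vapor-pressure difference behind "static" seeding can be made concrete with standard Magnus-type saturation formulas. The coefficients below are common published approximations (not taken from this article), and the sketch simply shows that air saturated with respect to liquid water at −10 °C is supersaturated with respect to ice:

```python
import math

# Magnus-type saturation vapor pressure approximations, in hPa,
# with temperature t in degrees Celsius (standard WMO-style coefficients).
def e_sat_water(t):
    return 6.112 * math.exp(17.62 * t / (243.12 + t))

def e_sat_ice(t):
    return 6.112 * math.exp(22.46 * t / (272.62 + t))

# In a supercooled cloud at -10 C, saturation over ice is lower than over
# water, so ice crystals grow at the droplets' expense.
t = -10.0
print(f"over water: {e_sat_water(t):.2f} hPa")  # ~2.87 hPa
print(f"over ice:   {e_sat_ice(t):.2f} hPa")    # ~2.60 hPa
print(f"ice supersaturation ratio: {e_sat_water(t) / e_sat_ice(t):.2f}")
```

That roughly 10% supersaturation over ice is what drives seeded ice particles to grow heavy enough to fall as snow or rain.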
Seeding of warm-season or tropical cumulonimbus (convective) clouds seeks to exploit the latent heat released by freezing. This strategy of "dynamic" seeding assumes that the additional latent heat adds buoyancy, strengthens updrafts, ensures more low-level convergence, and ultimately causes rapid growth of properly selected clouds.
Cloud seeding chemicals may be dispersed by aircraft (as in the second figure) or by dispersion devices located on the ground (generators, as in the first figure, or canisters fired from anti-aircraft guns or rockets). For release by aircraft, silver iodide flares are ignited and dispersed as an aircraft flies through the inflow of a cloud. When released by devices on the ground, the fine particles are carried downwind and upwards by air currents after release.
An electronic mechanism was tested in 2010, when infrared laser pulses were directed to the air above Berlin by researchers from the University of Geneva. The experimenters posited that the pulses would encourage atmospheric sulfur dioxide and nitrogen dioxide to form particles that would then act as seeds.

How a cloud chamber works

A simple cloud chamber consists of a sealed environment, an optional radioactive source, dry ice or a cold plate, and some kind of alcohol source (the alcohol must evaporate easily).
Lightweight methyl alcohol vapour saturates the chamber. The vapour falls as it cools, and the cold condenser provides a steep temperature gradient, producing a supersaturated environment. As ionizing particles pass through the chamber they leave ion trails, and because the alcohol vapour is supersaturated it condenses onto these trails, forming visible droplets that fall toward the condenser. Since the tracks are emitted radially out from the source, their point of origin can easily be determined.
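The scale of the supersaturation can be roughed out with the Antoine equation. The methanol coefficients below are published values fitted for roughly 15–84 °C, so extrapolating down to the cold plate (around −26 °C over dry ice) is only an order-of-magnitude illustration, not a precise model of the chamber:

```python
# Antoine equation for methanol vapor pressure (mmHg), using published
# coefficients; the chamber temperatures here are illustrative assumptions.
A, B, C = 8.08097, 1582.271, 239.726

def p_sat_methanol(t_celsius):
    return 10 ** (A - B / (C + t_celsius))

# Vapor that saturates near the warm top of the chamber (~20 C) is far
# above saturation once carried down toward the cold plate (~ -26 C),
# which is what makes the layer above the plate sensitive to ion trails.
p_warm = p_sat_methanol(20.0)
p_cold = p_sat_methanol(-26.0)
print(f"saturation at  20 C: {p_warm:.1f} mmHg")
print(f"saturation at -26 C: {p_cold:.1f} mmHg")
print(f"nominal supersaturation ratio: {p_warm / p_cold:.0f}x")
```

Even allowing for condensation losses on the way down, the vapor near the cold plate is far beyond saturation, so the slightest nucleation trigger, such as an ion trail, sets off droplet formation.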
Just above the cold condenser plate there is a region of the chamber that is sensitive to radioactive tracks. At this height most of the alcohol has not yet condensed, so the ion trail left by a radioactive particle provides an optimal trigger for condensation and cloud formation. The height of this sensitive region is increased by employing a steep temperature gradient, little convection, and very stable conditions. A strong electric field is often used to draw cloud tracks down to the sensitive region and increase the sensitivity of the chamber. While tracks from sources can still be seen without a voltage supply, background tracks are very difficult to observe. The voltage also helps prevent large amounts of "rain" (condensation forming above the sensitive area) from obscuring the ion trails with constant precipitation. A black background makes it easier to observe cloud tracks.
Before tracks can be seen, a tangential light source is needed to illuminate the white droplets against the black background. Drops should be viewed from a horizontal position. If the chamber is working correctly, tiny droplets should be seen condensing; often this condensation is not apparent until a shallow pool of alcohol has formed at the condenser plate. The tracks become much more obvious once temperatures and conditions have stabilized in the chamber, which requires the elimination of any significant drift currents (i.e., poor chamber sealing).

Cloud chamber

A cloud chamber, also known as a Wilson chamber, is used for detecting particles of ionizing radiation. In its most basic form, a cloud chamber is a sealed environment containing a supersaturated vapor of water or alcohol. When an alpha or beta particle interacts with the mixture, it ionizes it. The resulting ions act as condensation nuclei, around which a mist forms (because the mixture is on the point of condensation). The high energies of alpha and beta particles mean that a trail is left, due to many ions being produced along the path of the charged particle. These tracks have distinctive shapes: for example, an alpha particle's track is broad and shows more evidence of deflection by collisions, while an electron's is thinner and straighter. When a uniform magnetic field is applied across the cloud chamber, positively and negatively charged particles curve in opposite directions, according to the Lorentz force law.
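The curvature described above follows from the Lorentz force: a particle of mass m, speed v and charge q in a field B moves on a circle of radius r = mv/(|q|B), with the bending direction set by the sign of q. A minimal sketch (the field strength and speed are illustrative values, not from the article):

```python
# Non-relativistic gyroradius r = m*v / (|q|*B) for a charged particle in
# a uniform magnetic field, and the sign convention that makes opposite
# charges curve opposite ways in a cloud chamber photograph.

E_CHARGE = 1.602e-19    # elementary charge, C
M_ELECTRON = 9.109e-31  # electron mass, kg

def gyroradius(mass_kg, speed_m_s, charge_c, b_tesla):
    return mass_kg * speed_m_s / (abs(charge_c) * b_tesla)

def curvature_sign(charge_c):
    # Sign of the bending from F = q v x B: opposite charges bend opposite ways.
    return 1 if charge_c > 0 else -1

# An electron at 1e7 m/s in a 0.01 T field (hypothetical example values):
r = gyroradius(M_ELECTRON, 1e7, -E_CHARGE, 0.01)
print(f"electron gyroradius: {r * 1000:.2f} mm")  # a few millimetres
print(curvature_sign(-E_CHARGE), curvature_sign(+E_CHARGE))
```

A few millimetres of curvature is easily visible in chamber photographs, which is how early experiments distinguished electrons from positrons.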
Charles Thomson Rees Wilson (1869–1959), a Scottish physicist, is credited with inventing the cloud chamber. Inspired by sightings of the Brocken spectre while working on the summit of Ben Nevis in 1894, he began to develop expansion chambers for studying cloud formation and optical phenomena in moist air. Very rapidly he discovered that ions could act as centers for water droplet formation in such chambers. He pursued the application of this discovery and perfected the first cloud chamber in 1911. In Wilson's original chamber, the air inside the sealed device was saturated with water vapor, and a diaphragm was used to expand the air inside the chamber (adiabatic expansion). This cooled the air, and water vapor started to condense. When an ionizing particle passed through the chamber, water vapor condensed on the resulting ions and the trail of the particle was visible in the vapor cloud. Wilson, along with Arthur Compton, received the Nobel Prize in Physics in 1927 for his work on the cloud chamber. This kind of chamber is also called a pulsed chamber, because the conditions for operation are not continuously maintained. Further developments were made by Patrick Blackett, who utilised a stiff spring to expand and compress the chamber very rapidly, making it sensitive to particles several times a second. The cloud chamber was the first radioactivity detector.

Tuesday, May 10, 2011

Flooding Mississippi submerges Memphis waterfront

Memphis, Tennessee -- The Mississippi River has begun cresting at Memphis, forecasters said Tuesday, as attention began turning to flooding concerns in Louisiana and Mississippi.
The slow passing of the bulge of water working its way from north to south along the Mississippi is only the beginning of the end of the siege for Memphis residents, who could be dealing with high water levels into June.
And the struggle is just getting started for residents of Mississippi and Louisiana, where the river is expected to begin cresting next week at levels unseen since 1927.

The river level hit 48 feet (about 14.6 metres) on Tuesday, the highest it has been since 1937, said CTV's Joy Malbon, reporting from Memphis.

"It's like watching a slow-motion disaster happen," Malbon told CTV News Channel.

"The water took a long time to rise and it's going to take several weeks for the water to recede before people can go back into their homes.

There was no danger to Memphis' major tourist draws, such as Graceland, which lies far beyond the water's reach, and much of the city appeared normal. Levees built to prevent massive flooding were holding, according to the Army Corps of Engineers on Tuesday. Shelby County and four others were declared disaster areas by President Obama late Monday. The designation means that they'll be eligible for federal disaster aid, which local officials say is much-needed.

Cleanup was expected to be massive, and there were fears that farmland and cities further south could yet be devastated. Inmates in Louisiana's largest prison were taken to higher ground, and farmers were building homemade levees to protect their crops. Engineers diverted water into Lake Pontchartrain to ease the pressure on levees around New Orleans, where levee failures after Hurricane Katrina virtually drowned the city.

The flooding is the result of recent heavy rains and of unusually heavy snow further north over the winter, which melted and added to the already swollen Mississippi.