07 October, 2012

How to move your domain name account


A domain transfer is the process of moving a domain name from one registrar to another. The exact procedure differs among registrars, and transfers usually happen because of price. Honestly speaking, it can be a tricky process, which is why the steps below walk through it. Naturally, you must already own the domain before you can transfer it.

You have to get the following things done to put the domain transfer in effect:
  1. First, unlock your domain at the old registrar.
  2. Confirm the transfer from the old registrar's side.
  3. Finally, confirm the transfer with the new registrar.

The above is the overall procedure for transferring a domain; the detailed steps are as follows:

Step#1: Purchase a domain transfer from your new registrar.

Step#2: Unlock the domain and obtain the “authorization code” from the old registrar. This can be done through the control panel or the support service, depending on your registrar.

Step#3: The new registrar then sends an e-mail containing your ID, a code, and a link to confirm your transfer.

Step#4: Click the given link and confirm the transfer using your “ID Number” and “Key Code.” Sometimes the authorization code is also requested during confirmation.

Step#5: Then the new registrar notifies the old registrar to transfer the domain.

Step#6: On receiving the request, the old registrar processes it, notifies the new registrar that the requested domain will be released, and asks you to confirm.

Step#7: Confirm the transfer with the old registrar.

Step#8: Once you confirm, the old registrar releases your domain to the new registrar.

Step#9: The new registrar now sends a “Transfer Successful” message to you confirming your domain transfer.

Step#10: Your domain is now managed by the new registrar; sign in there to use it. The transfer process is complete.
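Before starting Step#2, it helps to check whether the domain is still locked. Registrars publish a domain's EPP status codes in its public whois record, and the "clientTransferProhibited" status is exactly the registrar lock described above. As a minimal sketch (the sample whois text here is made up for illustration), you can scan a whois record for statuses that would block a transfer:

```python
import re

# EPP status codes that block a transfer; "clientTransferProhibited"
# is the registrar lock you remove in Step#2.
TRANSFER_BLOCKING = {
    "clienttransferprohibited",
    "servertransferprohibited",
    "pendingtransfer",
}

def transfer_blockers(whois_text: str) -> list[str]:
    """Return the EPP status codes in a whois record that would block a transfer."""
    statuses = re.findall(r"Domain Status:\s*(\w+)", whois_text, re.IGNORECASE)
    return [s for s in statuses if s.lower() in TRANSFER_BLOCKING]

# A made-up whois excerpt for a still-locked domain:
record = """\
Domain Name: EXAMPLE.COM
Domain Status: clientTransferProhibited
Domain Status: clientDeleteProhibited
"""

print(transfer_blockers(record))  # ['clientTransferProhibited']
```

If this list is empty, the domain is unlocked and the transfer request should not be rejected at Step#5.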

Twitter Hacking Victims Find Stolen Accounts Sold On Black Market


Eric Weaver tried logging in to his Twitter account this summer, but he was locked out. A hacker had broken into his account and changed the password. But it didn't end there.
With a little digging, Weaver found that his Twitter handle -- @weave -- was being sold in an online forum at HackForums.net. With more digging, he also found that software was being sold online to automate the process of quickly hacking dozens of Twitter accounts.
"I was surprised this was all happening so openly," said Weaver, an advertising executive in Seattle. The hackers "are able to operate with seeming impunity."
Weaver's experience is not unique. Other Twitter hacking victims have also discovered that their accounts are for sale in online forums like ForumKorner.com and HackForums.net, where coveted one-word Twitter handles are sold in bulk for as little as $10.
This week, Twitter user Daniel Dennis Jones detailed in a Storify post how his Twitter account -- @blanket -- was hacked, stolen and put up for sale on the black market. Jones said he communicated with his hacker, who claimed to be a 14-year-old South Dakota teen who hacks and sells one-word Twitter accounts. Jones has since regained access to his account.
Experts say the underground market for Twitter accounts and the apparent ease with which they are stolen raise questions about security at the popular micro-blogging site. Most companies have built systems to prevent hackers from repeatedly guessing passwords, said Chester Wisniewski, a researcher at cybersecurity firm Sophos.
“Why is Twitter not doing that?” Wisniewski said. “This has been going on for a long time. It’s not going away and Twitter doesn’t seem to be doing anything about it.”
Twitter did not respond to repeated requests for comment.
In his post on Storify, Jones said the teenager who claimed to be his hacker told him that hackers could mask the IP address of their location by exploiting a loophole in Twitter security.
Such software -- known as a “Twitter cracker" -- can be easily purchased online.
"It's very well worth it,” one seller recently said on ForumKorner.com, which was not working at the time of publication. “With this you can upload more than 10,000 passwords and it automatically checks the login and if it doesn't work it moves on to the next one.”
Hackers also use the site to sell the stolen accounts, sometimes in bulk. Last week, a hacker who went by the name of Gumbo posted a list of more than 30 recently-stolen Twitter names for sale -- including handles like “gadgetry” and “compadre" -- on ForumKorner.com.
Another hacker claimed to have stolen the Twitter handle @Fend and vowed to “begin the bidding at $30.” Still another, who went by the screen name Spongebob, was selling “a 20-pack of 4-character Twitter handles for $10.” Among the accounts for sale were @Nona, @Pina, @Zala and @Wexa.
Such short, one-word Twitter handles are in high demand. They are not only easy to remember, but they also give users a few extra characters to express themselves within the 140-character limit. Last year, the Wall Street Journal reported that easy-to-recall Twitter handles like @adam or @megan have become "a stylish totem in the tech world."
In August, tech reporter Mat Honan revealed how his digital life was destroyed after hackers targeted him because of his short, unique Twitter handle -- @mat. Instead of trying to sell the account, they appeared to use @mat as a platform to broadcast racist and homophobic messages, Honan wrote.
Rob Bertholf, who owns the Twitter handle @rob, said his account has never been hacked. But he suspects hackers often try -- albeit unsuccessfully -- to break into his account because he receives weekly email notifications from Twitter notifying him that someone is trying to reset his password.
“No doubt in my mind that I have been targeted many times,” Bertholf told The Huffington Post.
Weaver, the Seattle advertising executive, said that after his account was stolen, he was able to trace his hacker’s identity to a 20-year-old Miami man. He said the hacker was also selling other accounts: @Bond, @Mock, @Four, @Strung, @545 and @Mind.
"Selling or accepting trades only," the hacker wrote under the screen name "Darent." "I will show proof to serious buyers."
Weaver said he contacted Twitter, but did not regain access to his account for three weeks -- and only after a friend called one of his contacts who worked at Twitter. During that time, he said, the name linked to his account was changed to "Jaimi in Brooklyn."
He said that getting his account stolen was particularly embarrassing because he is an ad executive whose work revolves around social media.
"My Twitter followers are friends and business colleagues," he said. "They were confused by my sudden fascination with hair, nail and certain R&B acts."
Weaver said he has since strengthened his Twitter password by making it 15 characters long and more complex, but added that the person who he thinks hacked his Twitter account continues to operate openly online.
“They're just bored kids,” he said. “They think they're invincible.”

Flickr photo by shawncampbell.
Source: HuffingtonPost.com

24 September, 2012

Facebook Can ID Faces, but Using Them Grows Tricky


SAN FRANCISCO — Facebook on Friday confronted a new obstacle over what to do with one of its most vital assets — pictures.

The company promised European regulators that it would forgo using facial recognition software and delete the data used to identify Facebook users by their pictures.

The decision could have wide repercussions on how facial recognition technology — a particularly sensitive technological advance — is used globally as surveillance cameras are increasingly installed in public spaces.

“This is a big deal,” said Chris Hoofnagle, a law professor at the University of California, Berkeley who specializes in online privacy.

“The development of these tools in the private sector directly affects civil liberties,” he explained. “The ultimate application is going to be — can we apply these patterns in video surveillance to automatically identify people for security purposes and maybe for marketing purposes as well?”

The agreement comes as Facebook is under pressure from Wall Street to profit from its vast trove of data, including pictures, and also from regulators worldwide over the use of personal information.

The decision in Europe applies to the “tag suggestion,” a Facebook feature that deploys a sophisticated facial recognition tool to automatically match pictures with names. When a Facebook user uploads a photo of friends, the “tag suggestion” feature can automatically pull up the names of the individuals in the image.

The facial recognition software was developed by an Israeli company, Face.com, which Facebook acquired for an undisclosed price in June.

The company quietly and temporarily pulled the plug on “tag suggestion” for all Facebook users several months ago. On Friday, the company said the pause was to “make improvements to the tool’s efficiency,” and it did not say how soon the feature would be restored. However, it promised European regulators on Friday that it would reinstate the feature on the Continent only after getting their approval.

Facebook declined to say under what circumstances the “tag suggestions” would be back online in the United States or elsewhere.

Facebook’s promise to the European regulators is part of an investigation into whether the company’s data collection practices comply with European privacy rules. It was made with regulators in Ireland, where the company has its European headquarters.

“We will continue to work together to ensure we remain compliant with European data protection law,” Facebook said in a statement.

Europe is an important market for the company, as it struggles to prove its worth on Wall Street. About one in four Facebook users logs in from Europe. According to the company’s earnings figures, Europe accounts for just under a third of its advertising revenue.

Pictures have always been vital to Facebook. Pictures are what drew users to Facebook in its earliest days, and pictures are what continue to keep people coming back. Facebook users upload 300 million images a day. The company’s acquisition of Instagram, the photo-sharing site, eliminated its biggest rival in this area.

Photo tagging is important for Facebook in the sense that it allows the social network to better analyze with whom its users interact in the real world.

In addition to scrutiny from European regulators, Facebook has also come under fire from consumer protection groups and lawmakers in the United States over its use of facial recognition technology. At a hearing on Capitol Hill last July, Senator Al Franken, Democrat of Minnesota, described Facebook as the “world’s largest privately held database of face prints — without the explicit consent of its users.”

On Friday, Mr. Franken said in an e-mail statement that he hoped Facebook would offer a way for American users to opt in to its photographic database.

“I believe that we have a fundamental right to privacy, and that means people should have the ability to choose whether or not they’ll be enrolled in a commercial facial recognition database,” he said. “I encourage Facebook to provide the same privacy protections to its American users as it does its foreign ones.”

The Electronic Privacy Information Center, an advocacy group in Washington, filed a complaint with the Federal Trade Commission over Facebook’s use of automatic tagging. The complaint is pending. The commission has a consent order with Facebook that subjects the company to audits over its privacy policies for the next 20 years.

Personal data is Facebook’s crown jewel, but how to use it artfully and profitably is arguably its biggest challenge. Facebook has access to a tremendous amount of information about its one billion users, including the photos they upload every day. Marketers have pushed for greater access to that data, so as to tailor the right message to the right customer. Consumers and lawmakers have resisted, to different degrees in different countries around the world.

“They are pushing the edges of what privacy rules may allow, just as an aggressive driver might with parking rules,” said Brian Wieser, an analyst with the Pivotal Research Group, a research firm in New York. “You don’t know you’ve broken a law until someone says you’ve broken a law.”

Several independent application developers are experimenting with how to use facial recognition technology in the real world, and have sought to use pictures on Facebook to build products of their own.

For example, one company in Atlanta is developing an application to allow Facebook users to be identified by cameras installed in stores and restaurants. The company, Redpepper, said in a blog post that users would have to authorize the application to pull their most recent tagged photographs. The company said its “custom-developed cameras then simply use this existing data to identify you in the real world,” including by offering special discounts and deals.


Source: NYTimes

Power, Pollution and the Internet


SANTA CLARA, Calif. — Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.
The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.
Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.
That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.
They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.
A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.
Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.
To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.
Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.
“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”
Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.
A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data. The study sampled about 20,000 servers in about 70 large data centers spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.
“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”
These physical realities of data are far from the mythology of the Internet: where lives are lived in the “virtual” world and all manner of memory is stored in “the cloud.”
The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.
Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.
“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a nonprofit industry group. “It’s too many insurance policies.”
At least a dozen major data centers have been cited for violations of air quality regulations in Virginia and Illinois alone, according to state records. Amazon was cited with more than 24 violations over a three-year period in Northern Virginia, including running some of its generators without a basic environmental permit.
A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts.
Many of these solutions are readily available, but in a risk-averse industry, most companies have been reluctant to make wholesale change, according to industry experts.
Improving or even assessing the field is complicated by the secretive nature of an industry that is largely built around accessing other people’s personal data.
For security reasons, companies typically do not even reveal the locations of their data centers, which are housed in anonymous buildings and vigilantly protected. Companies also guard their technology for competitive reasons, said Michael Manos, a longtime industry executive. “All of those things play into each other to foster this closed, members-only kind of group,” he said.
That secrecy often extends to energy use. To further complicate any assessment, no single government agency has the authority to track the industry. In fact, the federal government was unable to determine how much energy its own data centers consume, according to officials involved in a survey completed last year.
The survey did discover that the number of federal data centers grew from 432 in 1998 to 2,094 in 2010.
To investigate the industry, The Times obtained thousands of pages of local, state and federal records, some through freedom of information laws, that are kept on industrial facilities that use large amounts of energy. Copies of permits for generators and information about their emissions were obtained from environmental agencies, which helped pinpoint some data center locations and details of their operations.
In addition to reviewing records from electrical utilities, The Times also visited data centers across the country and conducted hundreds of interviews with current and former employees and contractors.
Some analysts warn that as the amount of data and energy use continue to rise, companies that do not alter their practices could eventually face a shake-up in an industry that has been prone to major upheavals, including the bursting of the first Internet bubble in the late 1990s.
“It’s just not sustainable,” said Mark Bramfitt, a former utility executive who now consults for the power and information technology industries. “They’re going to hit a brick wall.”
Bytes by the Billions
Wearing an FC Barcelona T-shirt and plaid Bermuda shorts, Andre Tran strode through a Yahoo data center in Santa Clara where he was the site operations manager. Mr. Tran’s domain — there were servers assigned to fantasy sports and photo sharing, among other things — was a fair sample of the countless computer rooms where the planet’s sloshing tides of data pass through or come to rest.
Aisle after aisle of servers, with amber, blue and green lights flashing silently, sat on a white floor punctured with small round holes that spit out cold air. Within each server were the spinning hard drives that store the data. The only hint that the center was run by Yahoo, whose name was nowhere in sight, could be found in a tangle of cables colored in the company’s signature purple and yellow.
“There could be thousands of people’s e-mails on these,” Mr. Tran said, pointing to one storage aisle. “People keep old e-mails and attachments forever, so you need a lot of space.”
This is the mundane face of digital information — player statistics flowing into servers that calculate fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices. It is only when the repetitions of those and similar transactions are added up that they start to become impressive.
Each year, chips in servers get faster, and storage media get denser and cheaper, but the furious rate of data production goes a notch higher.
Jeremy Burton, an expert in data storage, said that when he worked at a computer technology company 10 years ago, the most data-intensive customer he dealt with had about 50,000 gigabytes in its entire database. (Data storage is measured in bytes. The letter N, for example, takes 1 byte to store. A gigabyte is a billion bytes of information.)
Today, roughly a million gigabytes are processed and stored in a data center during the creation of a single 3-D animated movie, said Mr. Burton, now at EMC, a company focused on the management and storage of data.
Just one of the company’s clients, the New York Stock Exchange, produces up to 2,000 gigabytes of data per day that must be stored for years, he added.
EMC and the International Data Corporation together estimated that more than 1.8 trillion gigabytes of digital information were created globally last year.
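The figures quoted above span several orders of magnitude, and a little back-of-the-envelope arithmetic puts them in proportion (only the numbers stated in the article are used here):

```python
# Back-of-the-envelope scale check for the figures quoted above.
GB = 10**9  # a gigabyte is a billion bytes, as the article notes

database_2002 = 50_000 * GB     # largest customer database Mr. Burton saw ~10 years ago
movie_3d = 1_000_000 * GB       # data processed and stored for one 3-D animated film
nyse_per_day = 2_000 * GB       # New York Stock Exchange output per day

# One modern 3-D film involves 20x the biggest database of a decade earlier.
print(movie_3d / database_2002)  # 20.0

# At 2,000 GB a day, the NYSE needs 500 days to produce one film's worth of data.
print(movie_3d / nyse_per_day)   # 500.0
```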
“It is absolutely a race between our ability to create data and our ability to store and manage data,” Mr. Burton said.
About three-quarters of that data, EMC estimated, was created by ordinary consumers.
With no sense that data is physical or that storing it uses up space and energy, those consumers have developed the habit of sending huge data files back and forth, like videos and mass e-mails with photo attachments. Even seemingly mundane actions, like running an app to find an Italian restaurant in Manhattan or a taxi in Dallas, require servers to be turned on and ready to process the information instantaneously.
The complexity of a basic transaction is a mystery to most users: Sending a message with photographs to a neighbor could involve a trip through hundreds or thousands of miles of Internet conduits and multiple data centers before the e-mail arrives across the street.
“If you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right,” said Bruce Taylor, vice president of the Uptime Institute, a professional organization for companies that use data centers.
To support all that digital activity, there are now more than three million data centers of widely varying sizes worldwide, according to figures from the International Data Corporation.
Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data center energy use for more than a decade. DatacenterDynamics, a London-based firm, derived similar figures.
The industry has long argued that computerizing business transactions and everyday tasks like banking and reading library books has the net effect of saving energy and resources. But the paper industry, which some predicted would be replaced by the computer age, consumed 67 billion kilowatt-hours from the grid in 2010, according to Census Bureau figures reviewed by the Electric Power Research Institute for The Times.
Direct comparisons between the industries are difficult: paper uses additional energy by burning pulp waste and transporting products. Data centers likewise involve tens of millions of laptops, personal computers and mobile devices.
Chris Crosby, chief executive of the Dallas-based Compass Datacenters, said there was no immediate end in sight to the proliferation of digital infrastructure.
“There are new technologies and improvements,” Mr. Crosby said, “but it still all runs on a power cord.”
‘Comatose’ Power Drains
Engineers at Viridity Software, a start-up that helped companies manage energy resources, were not surprised by what they discovered on the floor of a sprawling data center near Atlanta.
Viridity had been brought on board to conduct basic diagnostic testing. The engineers found that the facility, like dozens of others they had surveyed, was using the majority of its power on servers that were doing little except burning electricity, said Michael Rowan, who was Viridity’s chief technology officer.
A senior official at the data center already suspected that something was amiss. He had previously conducted his own informal survey, putting red stickers on servers he believed to be “comatose” — the term engineers use for servers that are plugged in and using energy even as their processors are doing little if any computational work.
“At the end of that process, what we found was our data center had a case of the measles,” said the official, Martin Stephens, during a Web seminar with Mr. Rowan. “There were so many red tags out there it was unbelievable.”
The Viridity tests backed up Mr. Stephens’s suspicions: in one sample of 333 servers monitored in 2010, more than half were found to be comatose. All told, nearly three-quarters of the servers in the sample were using less than 10 percent of their computational brainpower, on average, to process data.
The data center’s operator was not some seat-of-the-pants app developer or online gambling company, but LexisNexis, the database giant. And it was hardly unique.
In many facilities, servers are loaded with applications and left to run indefinitely, even after nearly all users have vanished or new versions of the same programs are running elsewhere.
“You do have to take into account that the explosion of data is what aids and abets this,” said Mr. Taylor of the Uptime Institute. “At a certain point, no one is responsible anymore, because no one, absolutely no one, wants to go in that room and unplug a server.”
Kenneth Brill, an engineer who in 1993 founded the Uptime Institute, said low utilization began with the field’s “original sin.”
In the early 1990s, Mr. Brill explained, software operating systems that would now be considered primitive crashed if they were asked to do too many things, or even if they were turned on and off. In response, computer technicians seldom ran more than one application on each server and kept the machines on around the clock, no matter how sporadically that application might be called upon.
So as government energy watchdogs urged consumers to turn off computers when they were not being used, the prime directive at data centers became running computers at all cost.
A crash or a slowdown could end a career, said Michael Tresh, formerly a senior official at Viridity. A field born of cleverness and audacity is now ruled by something else: fear of failure.
“Data center operators live in fear of losing their jobs on a daily basis,” Mr. Tresh said, “and that’s because the business won’t back them up if there’s a failure.”
In technical terms, the fraction of a computer’s brainpower being used on computations is called “utilization.”
McKinsey & Company, the consulting firm that analyzed utilization figures for The Times, has been monitoring the issue since at least 2008, when it published a report that received little notice outside the field. The figures have remained stubbornly low: the current findings of 6 percent to 12 percent are only slightly better than those in 2008. Because of confidentiality agreements, McKinsey is unable to name the companies that were sampled.
David Cappuccio, a managing vice president and chief of research at Gartner, a technology research firm, said his own recent survey of a large sample of data centers found that typical utilizations ran from 7 percent to 12 percent.
“That’s how we’ve overprovisioned and run data centers for years,” Mr. Cappuccio said. “ ‘Let’s overbuild just in case we need it’ — that level of comfort costs a lot of money. It costs a lot of energy.”
Servers are not the only components in data centers that consume energy. Industrial cooling systems, circuitry to keep backup batteries charged and simple dissipation in the extensive wiring all consume their share.
In a typical data center, those losses combined with low utilization can mean that the energy wasted is as much as 30 times the amount of electricity used to carry out the basic purpose of the data center.
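The "30 times" figure above can be roughly reproduced from the utilization numbers in the article, given one assumption the article does not state: a facility overhead multiplier (often called PUE, total facility power divided by power reaching the IT equipment) of about 1.8, a commonly cited industry value. A minimal sketch under that assumption:

```python
# Rough reconstruction of the "30 times" waste figure.
# Assumption (not from the article): PUE of 1.8, i.e. for every watt
# reaching the servers, 0.8 watts go to cooling, wiring losses, etc.

utilization = 0.06  # low end of the 6-12% range McKinsey found
pue = 1.8           # assumed facility overhead multiplier

# Fraction of grid electricity that performs actual computation:
useful_fraction = utilization / pue

# Ratio of wasted electricity to usefully applied electricity:
waste_ratio = (1 - useful_fraction) / useful_fraction
print(round(waste_ratio))  # 29 -- i.e. roughly 30x wasted per unit of useful work
```

At the higher end of the utilization range the ratio drops, which is why the article hedges with "as much as" 30 times.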
Some companies, academic organizations and research groups have shown that vastly more efficient practices are possible, although it is difficult to compare different types of tasks.
The National Energy Research Scientific Computing Center, which consists of clusters of servers and mainframe computers at the Lawrence Berkeley National Laboratory in California, ran at 96.4 percent utilization in July, said Jeff Broughton, the director of operations. The efficiency is achieved by queuing up large jobs and scheduling them so that the machines are running nearly full-out, 24 hours a day.
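The queuing approach described above can be sketched in a few lines: jobs wait in a queue and each is dispatched to whichever machine frees up first, so machines stay busy instead of idling. This is an illustrative toy model, not NERSC's actual scheduler, and the job lengths are made up:

```python
import heapq

def schedule(job_hours, n_machines):
    """Greedily assign queued jobs to the earliest-free machine.
    Returns utilization: busy machine-hours / available machine-hours."""
    free_at = [0.0] * n_machines        # time at which each machine next frees up
    heapq.heapify(free_at)
    for hours in job_hours:
        start = heapq.heappop(free_at)  # earliest-available machine takes the job
        heapq.heappush(free_at, start + hours)
    makespan = max(free_at)             # when the last machine finishes
    return sum(job_hours) / (n_machines * makespan)

jobs = [5, 3, 8, 2, 7, 4, 6, 1]         # queued job lengths in hours (illustrative)
print(round(schedule(jobs, 2), 3))      # 0.9 -- near-full use of both machines
```

With a deep enough queue of batch jobs, the idle gaps shrink and utilization approaches 100 percent; the interactive workloads at commercial data centers cannot be deferred this way, which is part of why their utilization stays so low.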
A company called Power Assure, based in Santa Clara, markets a technology that enables commercial data centers to safely power down servers when they are not needed — overnight, for example.
But even with aggressive programs to entice its major customers to save energy, Silicon Valley Power has not been able to persuade a single data center to use the technique in Santa Clara, said Mary Medeiros McEnroe, manager of energy efficiency programs at the utility.
“It’s a nervousness in the I.T. community that something isn’t going to be available when they need it,” Ms. McEnroe said.
The streamlining of the data center done by Mr. Stephens for LexisNexis Risk Solutions is an illustration of the savings that are possible.
In the first stage of the project, he said that by consolidating the work in fewer servers and updating hardware, he was able to shrink a 25,000-square-foot facility into 10,000 square feet.
Of course, data centers must have some backup capacity available at all times and achieving 100 percent utilization is not possible. They must be prepared to handle surges in traffic.
Mr. Symanski, of the Electric Power Research Institute, said that such low efficiencies made sense only in the obscure logic of the digital infrastructure.
“You look at it and say, ‘How in the world can you run a business like that,’ ” Mr. Symanski said. The answer is often the same, he said: “They don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”
The Best-Laid Plans
In Manassas, Va., the retailing colossus Amazon runs servers for its cloud amid a truck depot, a defunct grain elevator, a lumberyard and junk-strewn lots where machines compress loads of trash for recycling.
The servers are contained in two Amazon data centers run out of three buildings shaped like bulky warehouses with green, corrugated sides. Air ducts big enough to accommodate industrial cooling systems sprout along the rooftops; huge diesel generators sit in rows around the outside.
The term “cloud” is often generally used to describe a data center’s functions. More specifically, it refers to a service for leasing computing capacity. These facilities are primarily powered from the national grid, but generators and batteries are nearly always present to provide electricity if the grid goes dark.
The Manassas sites are among at least eight major data centers Amazon operates in Northern Virginia, according to records of Virginia’s Department of Environmental Quality.
The department is on familiar terms with Amazon. As a result of four inspections beginning in October 2010, the company was told it would be fined $554,476 by the agency for installing and repeatedly running diesel generators without obtaining standard environmental permits required to operate in Virginia.
Even if there are no blackouts, backup generators still emit exhaust because they must be regularly tested.
After months of negotiations, the penalty was reduced to $261,638. In a “degree of culpability” judgment, all 24 violations were given the ranking “high.”
Drew Herdener, an Amazon spokesman, agreed that the company “did not get the proper permits” before the generators were turned on. “All of these generators were all subsequently permitted and approved,” Mr. Herdener said.
The violations came in addition to a series of lesser infractions at one of Amazon’s data centers in Ashburn, Va., in 2009, for which the company paid $3,496, according to the department’s records.
Of all the things the Internet was expected to become, it is safe to say that a seed for the proliferation of backup diesel generators was not one of them.
Terry Darton, a former manager at Virginia’s environmental agency, said permits had been issued to enough generators for data centers in his 14-county corner of Virginia to nearly match the output of a nuclear power plant.
“It’s shocking how much potential power is available,” said Mr. Darton, who retired in August.
No national figures on environmental violations by data centers are available, but a check of several environmental districts suggests that the centers are beginning to catch the attention of regulators across the country.
Over the past five years in the Chicago area, for example, the Internet powerhouses Savvis and Equinix received violation notices, according to records from the Illinois Environmental Protection Agency. Aside from Amazon, Northern Virginia officials have also cited data centers run by Qwest, Savvis, VeriSign and NTT America.
Despite all the precautions — the enormous flow of electricity, the banks of batteries and the array of diesel generators — data centers still crash.
Amazon, in particular, has had a series of failures in Northern Virginia over the last several years. One, in May 2010 at a facility in Chantilly, took businesses dependent on Amazon’s cloud offline for what the company said was more than an hour — an eternity in the data business.
Pinpointing the cause became its own information glitch.
Amazon announced that the failure “was triggered when a vehicle crashed into a high-voltage utility pole on a road near one of our data centers.”
As it turns out, the car accident was mythical, a misunderstanding passed from a local utility lineman to a data center worker to Amazon headquarters. Instead, Amazon said that its backup gear mistakenly shut down part of the data center after what Dominion Virginia Power said was a short on an electrical pole that set off two momentary failures.
Mr. Herdener of Amazon said the backup system had been redesigned, and that “we don’t expect this condition to repeat.”
The Source of the Problem
Last year in the Northeast, a $1 billion feeder line for the national power grid went into operation, snaking roughly 215 miles from southwestern Pennsylvania, through the Allegheny Mountains in West Virginia and terminating in Loudoun County, Va.
The work was financed by millions of ordinary ratepayers. Steven R. Herling, a senior official at PJM Interconnection, a regional authority for the grid, said the need to feed the mushrooming data centers in Northern Virginia was the “tipping point” for the project in an otherwise down economy.
Data centers in the area now consume almost 500 million watts of electricity, said Jim Norvelle, a spokesman for Dominion Virginia Power, the major utility there. Dominion estimates that the load could rise to more than a billion watts over the next five years.
Data centers are among utilities’ most prized customers. Many utilities around the country recruit the facilities for their almost unvarying round-the-clock loads. Large, steady consumption is profitable for utilities because it allows them to plan their own power purchases in advance and market their services at night, when demand by other customers plummets.
Mr. Bramfitt, the former utility executive, said he feared that this dynamic was encouraging a wasteful industry to cling to its pedal-to-the-metal habits. Even with all the energy and hardware pouring into the field, others believe it will be a challenge for current methods of storing and processing data to keep up with the digital tsunami.
Some industry experts believe a solution lies in the cloud: centralizing computing among large and well-operated data centers. Those data centers would rely heavily on a technology called virtualization, which in effect allows servers to merge their identities into large, flexible computing resources that can be doled out as needed to users, wherever they are.
One advocate of that approach is Mr. Koomey, the Stanford data center expert. But he said that many companies that try to manage their own data centers, either in-house or in rental spaces, are still unfamiliar with or distrustful of the new cloud technology. Unfortunately, those companies account for the great majority of energy usage by data centers, Mr. Koomey said.
Others express deep skepticism of the cloud, saying that the sometimes mystical-sounding belief in its possibilities is belied by the physicality of the infrastructure needed to support it.
Using the cloud “just changes where the applications are running,” said Hank Seader, managing principal for research and education at the Uptime Institute. “It all goes to a data center somewhere.”
Some wonder if the very language of the Internet is a barrier to understanding how physical it is, and is likely to stay. Take, for example, the issue of storing data, said Randall H. Victora, a professor of electrical engineering at the University of Minnesota who does research on magnetic storage devices.
“When somebody says, ‘I’m going to store something in the cloud, we don’t need disk drives anymore’ — the cloud is disk drives,” Mr. Victora said. “We get them one way or another. We just don’t know it.”
Whatever happens within the companies, it is clear that among consumers, what are now settled expectations largely drive the need for such a formidable infrastructure.
“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”

Source: NYTimes

Apple’s Feud With Google Is Now Felt on iPhone


SAN FRANCISCO — Once the best of friends, Google and Apple have become foes, battling in courtrooms and in the consumer marketplace. Last week, the hostilities took a new turn when they spilled right onto smartphone screens.

In the latest version of Apple’s iPhone software, which became available Wednesday, Apple removed two mainstay apps, both Google products — Maps and YouTube.

The disappearing apps show just how far-reaching the companies’ rivalry has become, as well as the importance of mobile users to their businesses.

“It’s the two big kids kicking sand in the sandbox,” said Colin Gillis, an analyst who covers Google and Apple for BGC Partners. “They’re now competing against each other with phones, with maps, with content, with search. They’re going head-to-head.”

Maps are particularly crucial on mobile devices, where location-based services and ads have emerged as the pathway to making money. Google and Apple are not the only warriors in the fight. Amazon, Nokia, Microsoft, AOL and Yahoo are competing, too.

“If you own a mobile ecology, as Google does, the other mobile ecology owners are not going to allow you to own tons of data in their world,” said Scott Rafer, chief executive of Lumatic, which makes city map apps. “And so neither Apple nor Amazon were going to let Google know where every one of their users was at every time.”

Being kicked off the iPhone has potentially significant consequences for Google, whose Maps service gets more than half its traffic from mobile devices, and almost half of that mobile traffic has been from iPhone users. Apple’s move strikes at the heart of Google’s core business, search, because about 40 percent of mobile searches are for local places or things.

“Local is a huge thing for Google in terms of advertising dollars, and search is very tied to that,” said Barry Schwartz, an editor at Search Engine Land, an industry blog. “Knowing where you are, when you search for coffee, it can bring up local coffee shops and ads that are much more relevant for the user.”

Consumers are innocent bystanders of the brawl. IPhone users now have an extra step to download the YouTube app from the App Store and, so far, Google has given no indication that it will offer a maps app. Apple’s maps, meanwhile, are littered with flaws, some laughable, like a bridge that appears to collapse crossing the Tacoma Narrows Strait of Puget Sound.

Some analysts say, however, that Apple’s maps will quickly improve, and that the long-term result of heightened competition will be better maps all around.

“Apple Maps are apparently not ready for prime time, and that’s a loss,” said Peter Krasilovsky, the program director for marketplaces at BIA/Kelsey, a local media research firm. “But a long-term loss? No. With all the incredible technology being developed by everybody, consumers are the winner.”

The war between Google and Apple escalated abruptly before breaking out on the iPhone screen. At the height of their friendship, their chief executives together unveiled the first iPhone, packed with Google services like maps, search and YouTube. But since Google introduced its own mobile operating system, Android, the companies have battled over everything mobile, from patents to ads and apps.

The brawl has played out most publicly in the courtroom, where Apple and phone manufacturers that use Google’s Android software have sued one another. Most recently, on Friday and Saturday, Apple and Samsung each filed papers to amend or overturn a jury verdict that awarded Apple $1 billion in a patent trial with Samsung. Apple wants more money and Samsung wants a new trial. The companies will return to court Dec. 6 to discuss their demands.

Though Apple’s rejection of YouTube is part of its effort to cut ties with its former friend, it is different from the battle over maps because Apple has no competing video service. Google has introduced a new YouTube app in the App Store, which has become the No. 1 free app.

But with maps, Google, which has long been the dominant digital mapmaker, now must adjust to a new rival, along with the loss of valuable iPhone users.

Even though Android phones far outnumber iPhones — 60 percent of smartphones run Android, versus 34 percent for iPhones, according to Canalys, a research firm — iPhone users account for almost half of mobile traffic to Google Maps.

In July, according to comScore Mobile Metrix, 12.6 million iPhone users visited Maps each day, versus 7.6 million on Android phones. And iPhone users spent an hour and a half using Maps during the month, while Android users spent just an hour.

Those users are a valuable source for Google, because it relies on their data to determine things like which businesses or landmarks are most important and whether maps have errors.

Google also risks losing the allegiance of app developers who build apps that tie in to maps.

“Overnight, Apple has really taken out a significant chunk of Google’s market, and it’s much harder for Google to say to developers, ‘We’re the only game in town, come play with us,’ ” said Tony Costa, a senior analyst who studies mobile phones at Forrester. “It will affect the Google ecosystem, putting it back in the same game of their apps lagging behind Apple, and that’s not a good position for them to be in.”

Still, Google is no doubt feeling a bit of satisfaction as Apple is loudly criticized for the errors in its maps.

Apple Maps users have been tallying its blunders. A Tumblr devoted to the topic included a missing lake in Hyderabad, India, misplaced restaurants in Cambridge, Mass., and the placement of Berlin in Antarctica.

Apple responded Thursday with a statement that its map service was a work in progress and would improve as more people used it.

Google, meanwhile, has been reminding people of its seven years of experience in mapping.

But the company would not say whether it was building an iPhone app for users to download. Its only public statement on the matter has been vague: “Our goal is to make Google Maps available to everyone who wants to use it, regardless of device, browser, or operating system.”

Google could decide not to build an app, as a gamble that iPhone users depend on its maps so much that they might switch to Android.

If it does build an app, Apple would have to approve it. Its guidelines for developers are ambiguous, but exclude apps that “appear confusingly similar to an existing Apple product.”

Rejecting Google’s app would most likely set off a brouhaha similar to the one over the Google Voice app, which Apple rejected in 2009, prompting an investigation by the Federal Communications Commission; that app was approved a year later.

More likely, analysts say, Google is waiting for the right time to swoop in and save the day by offering its own iPhone app. One benefit of making its own app: It could add features and sell ads, which it could not do on the old app because Apple controlled it. The situation with the YouTube app was the same.

In the meantime, Google is encouraging people to use maps on the iPhone’s browser, where it shows instructions to install it on their home screen.


Source: NYTimes