DOJ requires AT&T to sell some assets in acquisition

The U.S. Department of Justice will require telecom giant AT&T to sell off pieces of its mobile network in parts of Louisiana and Mississippi in order to continue with its US$944 million acquisition of Centennial Communications, the agency said Tuesday. The area covered includes parts of southwestern and central Louisiana and southwestern Mississippi. If AT&T did not divest its assets in the two states, the acquisition would "substantially lessen" competition for mobile telecom services and would likely result in higher prices, lower quality and reduced network investments, the DOJ said.

The DOJ's Antitrust Division, along with the attorney general of Louisiana, filed a civil lawsuit Tuesday in U.S. District Court for the District of Columbia to block the proposed acquisition of Centennial by AT&T. At the same time, the DOJ and the Louisiana attorney general filed a proposed settlement that, if approved by the court, would resolve the competitive concerns in the lawsuit. The complaint alleges that the proposed transaction would substantially reduce competition for mobile wireless telecommunications services in each of the areas. According to the complaint, AT&T and Centennial are each other's closest competitors for a significant number of customers in eight cellular marketing areas (CMAs), as defined by the U.S. Federal Communications Commission. AT&T is the second-largest mobile telecom provider in the U.S. by number of subscribers, serving nearly 80 million subscribers throughout all 50 states, the DOJ said. Centennial is the eighth-largest mobile telecom provider in the U.S., with about 1.1 million subscribers in six states, Puerto Rico and the U.S. Virgin Islands. In 2008, AT&T earned mobile revenues of about $44 billion.

Dell details the Efficient Enterprise strategy at OpenWorld

Dell Chairman and CEO Michael Dell touted Dell's "Efficient Enterprise" strategy Wednesday at the Oracle OpenWorld 2009 conference and was joined briefly onstage by Oracle CEO Larry Ellison, who stressed that Oracle is a major user of Dell equipment as well as a partner. Dell's strategy centers on standardization, simplification, and automation. Standardization involves use of Intel processors, Dell explained.

Simplification entails making the complex simple using solutions like virtualization and storage consolidation, while automation is about streamlining service delivery and enabling self-service IT models, Dell said during a keynote presentation at the San Francisco conference. [ Also at OpenWorld, Ellison and Sun Chairman Scott McNealy lauded Sun technologies. ] Dell stressed Intel as the industry standard, referring to Intel versus "proprietary architectures" in promoting Intel-based systems. "The numbers really don't lie. [Intel] x86 is the standard architecture in the datacenter," he said. He cited figures stating that of the $1.2 trillion spent annually on IT infrastructure, nearly $800 billion is spent on labor and services, while just $400 billion goes to hardware and related software. Meanwhile, just 1 percent of total business spending is left for driving IT innovation, Dell said. "We believe there's a real opportunity to drive out inefficiency and make technology work harder for our customers, and it forms the basis of what we call the Efficient Enterprise," Dell said. He vowed that Dell would take $200 billion of inefficiency out of the $1.2 trillion.
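For readers keeping score, the split Dell described works out roughly as follows; this quick calculation simply restates the keynote's round numbers and is not additional data from Dell.

```python
# Rough arithmetic implied by the spending figures Dell cited (all in billions
# of U.S. dollars); these are the keynote's round numbers, not new data.
total_it_spend = 1200        # ~$1.2 trillion spent annually on IT infrastructure
labor_and_services = 800     # ~$800 billion on labor and services
hardware_and_software = 400  # ~$400 billion on hardware and related software
claimed_savings = 200        # inefficiency Dell vowed to take out

print(f"Labor and services share:    {labor_and_services / total_it_spend:.0%}")
print(f"Hardware and software share: {hardware_and_software / total_it_spend:.0%}")
print(f"Claimed savings vs. total:   {claimed_savings / total_it_spend:.0%}")
```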

Ellison briefly joined Dell to stress Oracle and Dell synergies. "We've got so many customers that are Dell customers and also Oracle customers. Our partnership just gets bigger and bigger every year," Ellison said. Oracle also is a major Dell user, with 20,000 Dell servers that Oracle uses to run its development and testing operations, Ellison said. Dell technology is "working well for us," he said. Ellison did not address what could be a sticky predicament: Oracle soon will own the rival Sun Microsystems hardware line if a proposed $7.4 billion merger goes through as planned. Sun hardware is largely based on the SPARC CPU platform, with some Intel-based machines in the mix. Virtual machines, Dell said, are becoming key to driving workloads from the client to the cloud.

Efficient Enterprise, Dell said, is intended to enable greater spending on innovation and less on maintenance. Enterprise efficiency also is about giving IT administrators increased visibility into deployed workloads, Dell said. Dell is driving its "virtual-ready infrastructure," he said, and the company is delivering automation, flexibility, and self-service IT through cloud computing. He cited 7-11 as one enterprise that has moved to Dell's managed services to improve service delivery. This story, "Dell details the Efficient Enterprise strategy at OpenWorld," was originally published at InfoWorld.com.

FCC net neutrality proposal is 'dramatic shift' in policy

U.S. Federal Communications Commission Chairman Julius Genachowski's decision to seek to formalize net neutrality rules would either bring "unconstitutional" new regulations to the Internet or a welcome "paradigm shift" in U.S. communications policy, depending on whom you talk to. Genachowski announced Monday that he will ask his fellow commissioners to support a rulemaking proceeding to create formal net neutrality rules that would prohibit Internet providers from selectively blocking or slowing Web content and applications. He also pushed to apply the net neutrality regulations to mobile broadband providers, and he called for an expansion of existing broadband policy principles to prohibit broadband providers from discriminating against Web content and services while allowing them to engage in reasonable network management.

The FCC has been enforcing net neutrality principles on a case-by-case basis since August 2005, but formal rules would ensure that application and content developers on the "edge" of broadband networks can innovate without interference from network operators, Genachowski said in a speech at the Brookings Institution. "This is the power of the Internet: distributed innovation and ubiquitous entrepreneurship, the potential for jobs and opportunity everywhere there is broadband," he said. "Saying nothing - and doing nothing - would impose its own form of unacceptable cost. It would deprive innovators and investors of confidence that the free and open Internet we depend on today will still be here tomorrow. It would deny the benefits of predictable rules of the road to all players in the Internet ecosystem."

But some broadband providers and conservative think tanks suggested Genachowski's plan could lead to burdensome new regulations. The FCC is currently developing a national broadband plan, and Genachowski's proposal might "change the rules of the road" before that's completed, said Ken Ferree, president of the Progress and Freedom Foundation, a conservative think tank. "I'm troubled to learn that the FCC is embarking on an exercise that would probably result in rules that are unconstitutional and almost certainly beyond the FCC's statutory jurisdiction," he said in an e-mail. "Aside from the legal issues it raises though, I find myself at a loss to understand why the administration wants to start meddling with a sector of the economy that, despite a challenging macro-economic environment, is performing pretty well by any rational standard. It's almost as if they are trying to turn a story of success into one of failure." Broadband provider Comcast said it welcomes a dialogue about net neutrality, but officials there questioned whether more regulations are needed. The FCC used its broadband policy principles to prohibit Comcast from blocking or slowing peer-to-peer traffic in a commission vote in August 2008. Comcast was glad to see that Genachowski appeared to suggest that the Internet is now free and open, Comcast Executive Vice President David Cohen said in a blog post. "Before we rush into a new regulatory environment for the Internet, let's remember there can be no doubt that the Internet has enjoyed immense growth even as these debates have gone on," he wrote. "The Internet in America has been a phenomenal success that has spawned technological and business innovation unmatched anywhere in the world. So it's still fair to ask whether increased regulation of the Internet is a solution in search of a problem."

CTIA, a trade group representing mobile carriers, said it was concerned that the FCC could make rules that prohibit mobile carriers from differentiating their products and services. Genachowski pointed to limited competition among service providers as part of the need for new net neutrality rules, but competition is strong among mobile carriers, said Chris Guttman-McCabe, vice president of regulatory affairs at CTIA. "We are concerned about the unintended consequences Internet regulation would have on consumers considering that competition within the industry has spurred innovation, investment, and growth for the U.S. economy," Guttman-McCabe said in a statement. "Unlike the other platforms that would be subject to the rules, the wireless industry is extremely competitive, extremely innovative, and extremely personal." Verizon Communications supports a free and open Internet, but new FCC rules could make it difficult for broadband providers to offer security features or other innovative products, said David Young, the company's vice president for federal regulatory affairs. "The Internet is a work in progress, and we really don't know what it's going to look like five years from now," he said. "We believe that new capabilities will be created by innovation in the network, and those new capabilities and innovation should not be precluded by regulation." Young said he was glad to hear Genachowski say the end result of the rulemaking has not been determined in advance. Until now, U.S. lawmakers and regulators have had a hands-off approach to the Internet, Young added. "We need to determine what are the problems that need to be fixed," he said. "What are the examples that require a dramatic change in the regulatory policy of dealing with the Internet?"

But Genachowski and Gigi Sohn, president of digital rights group Public Knowledge, said net neutrality rules wouldn't really be new. Until 2005, when the FCC changed the rules, broadband providers had to operate open networks to share with competitors, Sohn said. "American Internet users should be celebrating today," Sohn said. "After four years of regulatory uncertainty, the FCC chairman announced that the agency will start a proceeding to adopt rules that will ensure an open Internet on every single broadband platform." Ben Scott, policy director at media reform group Free Press, called Genachowski's announcement a "paradigm shift" in FCC policy that will ensure the health of the Internet. Over the past four years, there's been a heated debate in Washington, D.C., about the need for net neutrality rules, he said. "It is the elixir of consumer choice and competition that we have long been waiting to see firmly applied in the Internet space," Scott said. "We're going to settle this question once and for all, and we're going to deliver an open Internet for the U.S." Senator Byron Dorgan, a North Dakota Democrat, also welcomed Genachowski's plan. Dorgan has pushed for net neutrality legislation in the U.S. Congress. "An open and democratic Internet is necessary in order to allow innovation, economic opportunities, and consumer benefits to flourish, and it is critical that we maintain this access," Dorgan said in a statement. "By ensuring that consumers and online businesses can use the Internet without interference from broadband service providers, net neutrality will prevent the advent of haves and have-nots. This principle of open access has been the cornerstone of the Internet's growth so far, and is vital to its continued success in the future." Other companies and groups supporting Genachowski's announcement included Google, Skype, the Consumer Electronics Association, and the Computer and Communications Industry Association, a tech trade group.

Benioff plays nice to Oracle at OpenWorld

Attendees packed into a presentation by Salesforce.com Chairman and CEO Marc Benioff at Oracle's OpenWorld conference Tuesday, but those hoping the executive would deliver some of his trademark trash talk toward Oracle left the room disappointed. Some sort of throwdown seemed possible, even likely, given that during a shareholder meeting last week, Oracle CEO Larry Ellison mocked Salesforce.com's offering as a "little itty-bitty application" that is dependent on Oracle's own technology. But Benioff made no response to Ellison's jibes on Tuesday, instead referring to the companies' "fantastic relationship" and thanking Oracle for being "magnanimous" enough to let Salesforce.com appear at OpenWorld.

Ellison was an early investor in Salesforce.com, but left the vendor's board after he and Benioff had a falling out. Since then, the two executives have repeatedly slammed each other's business model, with Benioff declaring on-premise software a dying model and Ellison famously mocking cloud computing on a number of occasions, even as his own company tests those waters. And during the shareholder meeting, Ellison said he could provide a long list of customers who once used Salesforce.com but "chucked it out" in favor of Oracle's own on-demand CRM (customer relationship management) software. Their history caused surprise and curiosity among some observers, who questioned why Oracle would allow such a direct rival to tout its products at OpenWorld. Salesforce.com is a sponsor of the show.

Indeed, beyond slamming Salesforce.com's technological achievements, Ellison has made it a point during recent earnings conference calls to cite deals Oracle won against the on-demand vendor. But in the end, Benioff seemed more intent Tuesday on building bridges than burning them. At one point, he was joined onstage by Dell CEO Michael Dell. The two companies announced a partnership on Monday for selling Salesforce.com CRM and related services to small and medium-sized businesses. Salesforce.com and Dell already had close ties, having used each other's products for some time. Dell said its experience running Salesforce.com will give it an edge when working with new customers.

SETI@home in spotlight after IT chief loses job

Reports this week out of Arizona about how a public school district IT chief lost his job have put the use of volunteer grid computing efforts in the spotlight. According to the Arizona Republic and other news reports, Brad Niesluchowski lost his job earlier this fall as network systems administrator at Arizona's Higley Unified School District following an investigation into suspicious activity that included running the SETI@home distributed computing program across 5,000-plus school computers. The school district alleges that running the program on computers around the clock for nearly 10 years has cost it more than $1 million in energy and other costs, and interfered with teaching by messing up other programs, such as SMART board systems.
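For a sense of how such an energy figure might be estimated, here is a back-of-the-envelope sketch; the per-machine wattage and electricity rate are assumptions for illustration, not figures from the district, and the result is highly sensitive to them.

```python
# Back-of-the-envelope estimate of the electricity cost of keeping a fleet of
# PCs busy around the clock. All inputs are illustrative assumptions, not
# figures from the Higley district.

def fleet_energy_cost(machines, extra_watts_per_machine, years, dollars_per_kwh):
    """Return (total_kwh, total_cost) for running the fleet 24/7."""
    hours = years * 365 * 24
    total_kwh = machines * (extra_watts_per_machine / 1000.0) * hours
    return total_kwh, total_kwh * dollars_per_kwh

# Assumed: 5,000 machines drawing an extra ~50 W each when kept busy versus
# idle, for 10 years, at roughly $0.10 per kilowatt-hour.
kwh, cost = fleet_energy_cost(5000, 50, 10, 0.10)
print(f"~{kwh:,.0f} kWh, roughly ${cost:,.0f}")
```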

In fact, Niesluchowski (or "NEZ") had gained a reputation as a sort of god among SETI@home users for his status as its most active user, as documented via a public credit system. The situation has generated strong opinions from many corners, with some upset by comments from school superintendent Denise Birdwell ("We support educational research and we would have supported cancer research but we however as an educational institution do not support the search of ET.") that are seen as flip and as showing a lack of understanding of how SETI@home really works. Others pointed out that Niesluchowski losing his job stemmed from much more than just his use of SETI@home. A Fox News report out of Las Vegas includes an interview with Niesluchowski's wife, who says use of the software was authorized by a previous administration. On top of all this, a police investigation is ongoing and involves allegations of possible stolen computers and gear, according to the Republic.

One issue the Niesluchowski affair immediately brought to my mind has to do with the proper use of volunteer computing programs, which allow end users to donate the spare processing power on their computers via one of the dozens of ongoing volunteer computing projects, many based on open source software called BOINC. In compiling a package of stories on volunteer computing this past summer, I asked David Anderson, a research scientist at UC Berkeley Space Sciences Laboratory who founded the BOINC project in 2002, about guidelines for using such software. His response: "The BOINC project's advice is to get permission from whoever owns the machine." I circled back with Anderson today in light of the Niesluchowski situation, asking whether it might harm SETI@home. His response: "I don't think S@h gets a black eye. Our policies explicitly forbid this."

He said it looks like "NEZ" got obsessed with SETI@home credit and made "some major errors in judgment." On the plus side, Anderson said that SETI@home being in the news reminds the world that the project, which celebrated its 10th anniversary this year, is still going. Follow Bob Brown on Twitter. For more on network research news, follow our Alpha Doggs blog.

Report highlights Smart Grid security vulnerabilities

A cybersecurity coordination task force released a report this week that assesses various security and privacy requirements for the U.S. Smart Grid, as well as strategies needed to address them. The draft report highlights the need for planners to address threats that could potentially allow attackers to penetrate the smart grid, gain access to control software, and alter load conditions to cause widespread disruptions. The 256-page document was compiled by the task force, which is made up of individuals from government, industry, academia and regulatory bodies, and is led by the National Institute of Standards and Technology (NIST). The draft is now open for comment; NIST plans to release a final version in March 2010 describing an overall Smart Grid security architecture and security requirements.

A smart grid uses digital technology to transmit, distribute and deliver power to consumers in a more reliable and efficient manner than traditional electricity systems. A key component of the smart grid is the real-time, two-way communication it establishes between consumers and power distributors for tracking energy use and enabling smarter consumption and pricing. While proponents of a smart grid have touted its potential to improve the electricity system, others have expressed concern about its susceptibility to cyber attacks and inadvertent compromises. Cybersecurity strategies for protecting the smart grid need to address not only deliberate attacks but also inadvertent compromises resulting from user errors, equipment failures and buggy software, the report said. Released as part of the report was a Privacy Impact Analysis that examines some of the privacy implications of establishing a smart grid for power distribution.

Current plans call for nearly 17 million two-way connected smart meters to be installed in U.S. homes over the next few years. Many are concerned that the software, wireless sensor networks and Advanced Metering Infrastructure (AMI) networks that go into a smart grid present too many points of vulnerability. In June, security consultancy IOActive Inc. disclosed how its researchers had tested Smart Grid components for security vulnerabilities and discovered several that could allow attackers to access the network and cut off power. The researchers showed how attackers could spread malware through the network and remotely shut down power to consumers by taking advantage of flaws in the metering devices. The NIST report is an attempt to assess such threats. The vulnerabilities listed in the report were gathered from existing research and security documents, including NIST's own guide to industrial control systems security and the Open Web Application Security Project (OWASP) vulnerabilities list. The report also considers vulnerabilities arising from inadequate patch, configuration and change management processes, weak access controls, and a lack of risk assessment, audit, management and incident response plans.

It looks at vulnerabilities that can arise during the operation of a smart grid, as well as at problems such as authenticating and authorizing users to substations, key management for meters, and intrusion detection for power equipment. Vulnerabilities associated with bad software coding practices, including input validation errors and user authentication errors, can also pose a risk to the integrity of a smart grid, the report said. The real-time, two-way communication between consumers and suppliers in a smart grid also raises several privacy concerns, the NIST report noted. One major issue that needs to be addressed is the data that will be collected automatically from smart meters. There needs to be more of an understanding of how that data will be distributed and utilized throughout the smart grid system, the report said. "In the current operation of the electric grid, data taken from meters consists of basic data usage readings required to create bills," the report said. "Under a smart grid implementation, meters can and will collect other types of data," some of which could be personally identifiable information that needs to be protected with strong privacy controls, it said.
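To make the report's point about coding practices concrete, here is a minimal, hypothetical sketch of validating and authenticating a single meter reading before it is accepted; the field names, limits, and shared-key check are illustrative assumptions, not drawn from the NIST document or any real AMI product.

```python
# Hypothetical check of an incoming smart-meter reading. Field names, limits,
# and the shared-key MAC are illustrative assumptions only.
import hmac
import hashlib

MAX_KWH_PER_INTERVAL = 50.0  # assumed physical upper bound for one reading

def reading_is_acceptable(msg: dict, shared_key: bytes) -> bool:
    # Authenticate the sender: recompute the MAC over the payload fields.
    payload = f"{msg.get('meter_id')}|{msg.get('timestamp')}|{msg.get('kwh')}".encode()
    expected = hmac.new(shared_key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, str(msg.get("mac", ""))):
        return False
    # Validate the input: reject non-numeric, negative, or implausible values.
    try:
        kwh = float(msg["kwh"])
    except (KeyError, TypeError, ValueError):
        return False
    return 0.0 <= kwh <= MAX_KWH_PER_INTERVAL
```

In a sketch like this, a reading that fails either check would be dropped or flagged rather than passed along to billing or load-management systems.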