Ever since its launch this summer, Google Voice has presented carriers with some potentially thorny issues. Google Voice was designed in part to make it easier for users to change mobile carriers without sacrificing their phone numbers, and also to give users several add-on features that carriers don't offer. The biggest potential pitfall for carriers is that widespread adoption of Google Voice could render their networks "dumb pipes" that don't offer users any value-added services. For example, Google Voice can provide simultaneous ringing for both landline and wireless devices using the same phone number, and it can serve as a hub for SMS, letting users send text messages from any of their devices or even right over the Web on their computers.

However, America's top two wireless telcos this week indicated that they had no problem supporting Google Voice on their networks. Net neutrality proponents such as the media advocacy group Free Press have greeted Google Voice with enthusiasm, as they think it could give users the ability to seamlessly switch carriers if their current carrier is too restrictive about what they can and cannot use on their mobile devices. During a joint press conference with Google on Tuesday, Verizon CEO Lowell McAdam said that all Verizon phones based on the open-source Android platform would give users access to the Google Voice application. AT&T, meanwhile, said Tuesday that it was changing its tune and allowing iPhone users to utilize VoIP applications such as Skype on the AT&T 3G network. Although AT&T didn't mention Google Voice specifically as an application that it would allow onto its network, it's very likely that Google Voice will soon be available to iPhone users, since it doesn't present the direct threat to cellular service revenues that other VoIP applications and services do. The reason: when you make a call using Google Voice, it initially goes through the standard public switched telephone network to the Google cloud, where it is then sent out as a VoIP call - so calls still consume carrier minutes.

But even if Google Voice won't harm carriers' ability to charge users for cell phone minutes, Gartner analyst Peggy Schoener does think it could harm carriers' profitability if users come to rely upon it for services. While Google Voice will enable users to save money on typically expensive long-distance calls, it won't be an alternative to using up minutes from your standard wireless carrier in the way that Skype is. "It is a threat to their business model to some degree," she says. "But right now the demand for openness is trumping that. Carriers are looking at how the world is shaping up and they have to demonstrate openness and cooperation with industry newcomers."

FCC action in the background

While neither Verizon nor AT&T will say it out loud, one factor in their decision to allow Google Voice onto their networks could be the more active approach that the Federal Communications Commission has taken this year under new chairman Julius Genachowski. For example, this summer the FCC asked Google, Apple and AT&T to explain why the Google Voice application had not yet been made available for the iPhone. More recently, Genachowski has also proposed new network neutrality rules that would bar carriers from blocking or degrading lawful Web traffic and that would force carriers to be more open about their traffic management practices. ABI Research analyst Jeff Orr thinks that the government's more aggressive stance toward regulating the wireless industry has been a key factor in the telcos' decision to allow Google Voice on their networks despite whatever misgivings they may have about the application. "I think that they're looking at the talk going around at the FCC looking for net neutrality, and they figure that they'll need to back off and pick the battles they want to fight," he says. "By allowing Google Voice and other VoIP applications onto their networks they say to the FCC that they can monitor their own practices and that there's not a need for legislation mandating net neutrality." Schoener agrees that FCC action is part of the reason why carriers are showing more openness on their networks right now, but she also thinks that carriers are being forced by market trends to embrace more openness as well.

For instance, the past decade has seen large Internet companies such as Google and Skype become major market players with the clout to push for net neutrality regulations. "There's not a direct cause and effect between the FCC's actions and the carriers' decisions," she says. "But the FCC's stance is a part of the current trend that openness is better, and the telcos figure they can be better off in the long run if they embrace it rather than playing hardball."

Lotus Software GM Bob Picciano has grown tired of the "hot wind" blowing out of Redmond carrying claims that Exchange is displacing Notes, and he is singling out CEO Steve Ballmer and COO Kevin Turner as the main culprits spreading "ridiculous and fabricated" information. "Microsoft is making claims in the marketplace around 4.7 million people have exchanged e-mail from Notes to Exchange and that is just a ridiculous fabricated figure," said Picciano, who took the reins at Lotus in 2008. "Every time they sell a [client access license] they count that as a competitive migration." "People need to recognize that Kevin Turner and Steve Ballmer have blown a lot of hot wind from Washington and there is not much substance or truth to what they are espousing in the marketplace," Picciano said. "They were so bold as to say there are entire countries that have migrated off of Notes and that is utterly ridiculous." Picciano says all the talk has "got me pretty worked up that they would be so bold to make such erroneous statements and not be challenged." The Lotus Software GM says many of the reference companies cited by Microsoft when it made its "4.7 million people" comment in July "are still licensing Lotus Notes technology and still utilizing e-mail and applications from Lotus. They are still utilizing capabilities from other aspects of the Lotus portfolio." At Microsoft's annual meeting for financial analysts this summer, Turner heaped on more numbers during his presentation. "We've taken out almost 13 million Lotus Notes [seats] the past three years. … Now, the thing that I would tell you is there's still 15 — we count — there's still 15 million out there." He cited SharePoint Server as the "fastest-growing, hottest product in the history of Microsoft," and pegged it as a catalyst in the fight against IBM. Picciano said the counter was last week's news that U.S. Bank was replacing Microsoft's SharePoint platform by standardizing on the Notes 8.5 client, and would roll out Lotus Connections social networking tools, the Sametime real-time platform and Lotus Quickr, IBM's alternative to SharePoint.

On Tuesday, Picciano threw out his own numbers, saying a total of 15,421 companies have picked IBM over Microsoft since 2008 in the worldwide integrated collaborative environment market as defined by IDC. He said PNC Bank and Continental Tire are joining U.S. Bank in getting rid of Microsoft's Exchange, Office and SharePoint. In addition, Picciano says customers are expanding their investment in Lotus software, and he cited as examples Accenture, BASF, Chrysler, Coca-Cola, Colgate-Palmolive, Continental AG, Finishline, General Motors, GlaxoSmithKline, Gruppo Amadori, KBC Bank, Nationwide, Novartis, Philips Electronics and PNC Bank. In January, Picciano said more than 12,000 new companies bought their first Notes/Domino licenses in 2008. And he said half of the Fortune global 100 are Notes/Domino users. "It's important to put [Microsoft's claims] into perspective and call it what it is, a bunch of fabrication," Picciano said. "Kevin is feeling that he is under a bit of pressure. People understand what Kevin's motivation is and the prancing around in front of partners and talking about this. It's duplicitous and overshadows the real truth."

Micron Technology Inc. today introduced what it claims to be the industry's highest endurance, highest capacity multi-level cell (MLC) and single-level cell (SLC) NAND flash memory. The technology, which is used for building solid-state drive products, is aimed at enterprise-class companies that want to boost performance of I/O-hungry applications while maintaining the longevity they get with hard disk drives. To achieve higher performance for transactional databases and other I/O-intensive applications, enterprises often short stroke their hard disk drives, which limits the number of tracks accessed by the read/write heads to those on the outer edge of a drive platter. The technique increases performance, but in turn it cuts drive capacity by as much as 90% and dramatically increases hardware costs.
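To see why short stroking inflates hardware costs, consider a rough back-of-the-envelope sketch. Only the up-to-90% capacity loss comes from the article; the drive size and storage requirement below are hypothetical:

```python
# Rough illustration of the short-stroking tradeoff described above.
# The 90% capacity loss is from the article; all other figures are
# hypothetical examples.

DRIVE_CAPACITY_GB = 1000        # raw capacity of one hard drive
CAPACITY_LOST = 0.90            # short stroking can cut capacity by up to 90%
USABLE_GB = DRIVE_CAPACITY_GB * (1 - CAPACITY_LOST)

required_gb = 10_000            # storage an application actually needs

normal_drives = required_gb / DRIVE_CAPACITY_GB
short_stroked_drives = required_gb / USABLE_GB

print(f"Usable capacity per short-stroked drive: {USABLE_GB:.0f} GB")
print(f"Drives needed at full capacity:   {normal_drives:.0f}")
print(f"Drives needed when short-stroked: {short_stroked_drives:.0f}")
# Ten times as many spindles for the same usable capacity -- the cost
# penalty that makes high-endurance flash attractive as an alternative.
```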

Micron said that by using its 34-nanometer lithography technology to increase density, it has also been able to increase write endurance - the number of write/erase cycles that can be sustained over the flash memory's life - six-fold on its MLC product and three-fold on the SLC flash memory. The six-fold increase translates into 30,000 write cycles on Micron's new MLC Enterprise NAND; normally, MLC NAND can sustain an average of 5,000 write/erase cycles, with a maximum of 10,000. SLC flash natively can sustain up to 100,000 write cycles, so the three-fold improvement yields 300,000 write cycles on Micron's SLC NAND flash. Solid-state drive technology offers greater performance and capacity over serial-attached SCSI or Fibre Channel drives, but so far it has been mainly limited to longer-lasting and higher-performing SLC flash, which is far more expensive than MLC. Micron's 32Gbit MLC and 16Gbit SLC enterprise flash chip technology can be configured into multi-die, single packages enabling densities of up to 32GB for MLC and 16GB for SLC. "This isn't a solid state disk (SSD) drive announcement," said a Micron spokeswoman. "Right now we're working with equipment manufacturers and SSD manufacturers to design products around this. You could also put these chips directly on a computer's motherboard." Micron expects to begin volume production of the new 32Gbit NAND flash technology in early 2010. "The use of advanced NAND flash is required to achieve broad SSD adoption in enterprise applications," said Steffen Hellmold, vice president of business development at SSD controller manufacturer SandForce Inc. "We are very excited to work with Micron and enable cost-effective, reliable, high-performance SSD solutions that support stringent enterprise lifecycle requirements."
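Those cycle counts translate into device lifetime in a straightforward way. A minimal sketch, assuming perfect wear leveling plus a hypothetical capacity and daily write load (only the cycle counts come from the article):

```python
# Estimate device lifetime from NAND write/erase endurance.
# Cycle counts are from the article; the capacity and workload are
# hypothetical, and perfect wear leveling is assumed.

def lifetime_years(capacity_gb, cycles, daily_writes_gb):
    total_writes_gb = capacity_gb * cycles   # total data writable over the device's life
    return total_writes_gb / daily_writes_gb / 365

CAPACITY_GB = 32          # e.g., one 32GB MLC package
DAILY_WRITES_GB = 500     # hypothetical enterprise write load

for name, cycles in [("standard MLC", 5_000),
                     ("Micron enterprise MLC", 30_000),
                     ("Micron enterprise SLC", 300_000)]:
    years = lifetime_years(CAPACITY_GB, cycles, DAILY_WRITES_GB)
    print(f"{name:22s}: ~{years:6.1f} years")
```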

Analysts split today in their take on recent reports that Apple's long-rumored tablet will stress the device's e-book capabilities, saying that the company's plan for the "iPod Touch on steroids" would depend on the price tag. Earlier this week, the popular gadget blog Gizmodo cited unnamed sources who claimed that Apple was in talks with several media companies, including the New York Times, to negotiate content deals for its unannounced-but-expected tablet. "[Apple isn't] just going for e-books and mags," Gizmodo's Brian Lam wrote Wednesday. "They're aiming to redefine print." Not so fast, said one analyst. "It's more than just an e-reader," said Ezra Gottheil, an analyst with Technology Business Research who follows Apple's moves. "It's an application platform, it's a game and social gaming platform. It certainly will be an e-reader, that will be part of its ecosystem, but that won't be all it is." Gottheil, who six months ago touted the idea that Apple would deliver a tablet best described as an "iPod Touch on steroids," stuck to that reasoning today. "It will use the iPhone OS, or a modified version of it," Gottheil said, echoing something iLounge.com said it heard from a reliable source this week. The App Store, which Apple said this week had delivered its two billionth application, is crucial to the tablet's success, said Gottheil, which means that the device will be more than a one-trick pony. "Apple will market it as 'one more thing' nested inside 'one more thing'," Gottheil said, a move possible because of the App Store's broad library. "They'll [cast] it as able to do several increasingly cool things." Gottheil's reasoning relies on the $800 price he expects Apple to slap on the tablet, a price tag much too high for a media-reader-only device. "I don't think Apple has any particular interest in just creating another Kindle," he said, referring to Amazon's $489 Kindle DX. "Apple enjoys skimming the top of the market by making something hot and getting a nice margin out of it." Brian Marshall, a Wall Street analyst with Broadpoint AmTech, had a much different take, largely because of his price expectations. "I think $500 is the price," said Marshall today, adding that he agreed with Gizmodo that the tablet will focus on its e-reader capabilities. "I actually think that's how they'll promote it," he added. "They'll pitch [e-books] as a big segment, but they'll also say, 'We're gonna do this in color and much better than the Kindle'." Amazon's Kindle DX features a 9.7-inch grayscale display; according to reports out of Taiwan, component suppliers building parts for the expected Apple tablet are assembling 9.6-inch color, touch-enabled screens.

Most analysts have pegged the first half of 2010 for a tablet rollout, although some have proposed that Apple will craft a two-stage introduction, as it did with the iPhone in 2007, by announcing the hardware several months in advance of availability to give developers time to create applications or tweak existing iPhone programs for the larger device.

The Federal Aviation Administration today said it would streamline the environmental review part of permit applications for the launch and/or reentry of reusable suborbital rockets to help bolster a fledgling commercial space market. At the heart of the ruling is a document used to outline and determine the potential environmental consequences of issuing experimental permits, the Programmatic Environmental Impact Statement (PEIS) for the Processing of Experimental Permit Applications. The idea, as the FAA explains it: Because the PEIS presents information and analysis common to reusable, suborbital rockets, the FAA could choose to tier environmental documents from the PEIS to focus on environmental impacts specific to an applicant's proposed experimental operations. The PEIS would eliminate repetitive discussions of recurring issues and focus on issues that are ready for decision … specific to a particular launch.

From the FAA ruling: "The PEIS considers activities associated with the launch and reentry of reusable suborbital rockets, including pre-flight activities, flight profile (takeoff, flight, and landing), and post-flight activities. The general suborbital rocket designs addressed in the PEIS include vehicles resembling conventional aircraft, 30 to 140 feet long with unfueled weight of up to 9,921 pounds; vehicles resembling conventional rockets, 6 to 33 feet long with unfueled weight of up to 5,500 pounds; and vehicles that hover, up to 20 feet in length or diameter with unfueled weight of up to 4,400 pounds. The PEIS examines the potential environmental impacts of issuing an experimental permit for the operation of reusable suborbital rockets anywhere in the U.S. and abroad, and the potential site-specific impacts of permitted launches from seven FAA-licensed commercial launch sites: California Spaceport, California; Mojave Air and Space Port, California; Kodiak Launch Complex, Alaska; Mid-Atlantic Regional Spaceport, Virginia; and Space Florida." Individual launch operators would be required to coordinate with site operators to gain access to a site. In addition, the launch operators would be required to apply to the FAA for an experimental permit, which would require an individual safety and environmental review. The FAA said it prepared the PEIS with cooperation from the National Aeronautics and Space Administration (NASA) and the US Air Force, and said that its ruling does not propose site-specific environmental mitigation measures. "Rather, launch operators would be expected to implement site-specific mitigation measures that are consistent with those currently employed by the eight launch facilities addressed in the PEIS. Additional site-specific mitigation measures could be developed and presented in the site-specific documents that would tier from the PEIS." Reusable launch vehicles, or rockets, are one of the key technologies for the future of commercial space flight.

The FAA also assumes the total rocket fuel capacity of a reusable suborbital rocket will not exceed 11,000 pounds. Meanwhile, NASA recently said it would partner with the US Air Force Research Laboratory to develop a technology roadmap for the use of reusable commercial spaceships. The study of reusable launch vehicles, or RLVs, will focus on identifying technologies and assessing their potential use to accelerate the development of commercial reusable launch vehicles that have improved reliability, availability, launch turn-time, robustness and significantly lower costs than current launch systems, NASA stated. The study results will provide roadmaps with recommended government technology tasks and milestones for different vehicle categories. NASA said its Commercial Crew and Cargo Program looks to develop and demonstrate safe, reliable, and cost-effective capabilities to transport cargo and eventually crew to low-Earth orbit and the International Space Station. The Review of United States Human Space Flight Plans Committee report said that commercial services to deliver crew to low-Earth orbit are within reach. "While this presents some risk, it could provide an earlier capability at lower initial and life-cycle costs than government could achieve. A new competition with adequate incentives to perform this service should be open to all US aerospace companies."

NASA also recently said it would offer $50 million in stimulus money to further develop private commercial spacecraft. The aerospace consultancy Futron recently said that as much as $1.5 billion may be up for grabs for commercial space operations in the next ten years.

A federal judge's rejection of a proposed settlement by TD Ameritrade Inc. in a data breach lawsuit marks the second time in recent months that a court has weighed in on what it considers to be basic security standards for protecting data. U.S. District Court Judge Vaughn Walker in San Francisco yesterday denied final approval of a settlement that had been proposed by TD Ameritrade in May to settle claims stemming from a 2007 breach that exposed more than 6 million customer records. In arriving at his decision, Walker said the court didn't find the proposed settlement to be "fair, reasonable or adequate." Rather than benefit those directly affected by the breach, Ameritrade's proposed settlement is designed largely to benefit the company, Walker wrote in his 13-page ruling. In September 2007, Ameritrade announced that the names, addresses, phone numbers and trading information of potentially all of its more than 6 million retail and institutional customers at the time had been compromised by an intrusion into one of its databases.

The stolen information was later used to spam its customers. As part of an effort to settle claims arising from that incident, Ameritrade this May said it would retain an independent security expert to conduct penetration tests of its networks to look for vulnerabilities. The company also offered to retain the services of an analytics firm to find out whether any of the data compromised in the breach had been used for identity theft purposes. And the company said it would give affected customers a one-year subscription to antivirus and antispam software. It was these offers that the judge dismissed as too meager. He described the additional security measures that Ameritrade proposed in the settlement as "routine practices" that any reputable company should be taking anyway.

Penetration tests provide a reliable way for companies to detect the sort of security weaknesses that led to the Ameritrade breach, Walker said. But "as a large company that deals in sensitive personal information, penetration and data breach tests should be routine practices of TD Ameritrade's department that handles information security," he wrote. The two "very temporary fixes do not convince the court that the company has corrected or will address the security of client data in any serious way, let alone provide discernable benefits," he noted. A TD Ameritrade spokeswoman said the company would provide its response to the judge's ruling soon. In August, the federal court for the Northern District of Illinois denied a request by Citizens Financial Bank to dismiss a negligence claim brought against it by a couple.

The two had claimed that Citizens' failure to implement two-factor user-authentication measures had resulted in the theft of more than $26,000 from their home equity line of credit. The judge hearing the case allowed the claim to move forward, saying there was a reasonable basis to show that the bank had not moved quickly enough to implement stronger user authentication measures, as it should have. The case is the latest to illustrate a growing willingness by courts around the country to consider claims of negligence and breach of contract brought by individuals against companies for failing to protect sensitive data. Such rulings are relatively rare in consumer lawsuits against companies that suffer data breaches involving the potential compromise of credit card data and personal information. Until recently, courts have tended to reject such lawsuits mainly on the grounds that consumers suffer little financial harm from such breaches.

They have also held that consumers can't seek damages for any potential injury that could stem from any future ID theft that might result from such breaches. A case before the Maine Supreme Court is testing whether consumers can seek restitution from merchants for the time and effort involved in changing payment cards and bank accounts after a data breach.

A Beijing court has ruled that Microsoft violated a Chinese company's intellectual property rights in a case over fonts used in past Windows operating systems, state media said Tuesday. The Beijing Number One Intermediate People's Court this week ordered Microsoft to stop selling versions of Windows that use the Chinese fonts, state broadcaster CCTV said. Microsoft plans to appeal the ruling, a company representative said in a statement.

The ruling comes as Barack Obama visits China for his first time as U.S. president. The visit has brought renewed focus on tensions over piracy and the trade of high-tech products between the countries. A U.S. business association this week appealed to Obama for further efforts to protect intellectual property rights in China, where pirated copies of DVDs and computer software, including Windows, are widely sold on streets and in bazaars. Microsoft originally licensed Zhongyi's intellectual property more than a decade ago for use in the Chinese version of Windows 95, according to Zhongyi.

Zhongyi argues that the agreement applied only to Windows 95, but that Microsoft continued to use the intellectual property from Windows 98 through Windows XP. Microsoft agrees with the court that the key in the two cases is a dispute over the scope of the licensing agreements, the Microsoft representative said. But it disagrees with the ruling on the coverage of the agreements, which it believes also covers its use of the fonts. The court reportedly also ruled that Microsoft's use of a Chinese input system from Zhongyi did not violate any licensing agreements. Windows XP is the most widely used OS in Chinese offices and homes, but countless users run pirated copies. Pirated versions of Windows 7 were on sale in one Beijing bazaar weeks before the software officially went on sale last month. Microsoft offers Windows 7 in China at a lower price than in developed markets, and often labels its software "legal" to differentiate it from the pirated versions common in the country.

Windows 7 Home Premium costs 699 yuan (US$103) in China, compared to $199.99 in the U.S.

Tibco will offer on Wednesday do-it-yourself capabilities for generating business intelligence reports on business processes to users of its BPM (business process management) software. Built as an add-on to the Tibco iProcess Suite for BPM, the company's Tibco iProcess Spotfire software enables users to build personalized, real-time process reports themselves. Previously, users have had to specifically request business intelligence information on BPM from IT personnel. "The cool thing about this technology is unlike existing business integration products or BPM, this product will allow business users to directly manipulate and analyze the BPM data or the process data that's out there," said Rourke McNamara, Tibco director of product marketing. With this information, users can fine-tune their applications.

Featured in Spotfire are personalized reporting and analytics, as opposed to using static dashboards to display business processes. Customized templates display reports and analyses. Contextual process performance data is generated that can be combined with business data from other applications, enabling process performance to be assessed in a full business context, Tibco said. While BPM is used for a wide variety of tasks, McNamara mentioned insurance claims management as an example of a use. Management of business processes enables users to make businesses more efficient, he stressed.

Users can build reports on such activities as bottleneck data, process cycle time, and how quickly business participants are working. "This allows the business users to optimize those processes based on how they're being used today," McNamara said. Tibco's iProcess software represents a convergence of BPM, business intelligence, and business rules engines, said analyst Boris Evelson of Forrester. This convergence, he said, is "necessary to optimize enterprise operations and create actionable insight into data and processes in order to make better strategic, tactical, and operational decisions." But the merging of the three technologies represents an immature market, which has mostly been addressed by systems integrators cobbling together bits and pieces of components from multiple vendors, Evelson said. Another shortcoming is the lack of common metadata and metadata standards to bridge the gap between data, process and rules data, he said. Tibco's iProcess Spotfire software is built as a Windows client package, although a Web client with abbreviated capabilities, called Spotfire Web Player, is available. The company also will roll out iProcess Workspace Lite, an HTML workspace client focused on core activities for executing business processes.
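To picture the cycle-time and bottleneck reports described above, here is a minimal, hypothetical sketch of how such metrics could be computed from raw task timestamps. It is illustrative only, not Tibco's API; the step names and timestamps are invented:

```python
# Hypothetical example of the process metrics described above:
# compute average cycle time per step and flag the bottleneck.
from collections import defaultdict
from datetime import datetime

# (process step, started, finished) for completed claim-handling tasks
events = [
    ("intake",     "2009-10-05 09:00", "2009-10-05 09:20"),
    ("assessment", "2009-10-05 09:20", "2009-10-05 13:40"),
    ("approval",   "2009-10-05 13:40", "2009-10-05 14:10"),
    ("intake",     "2009-10-05 10:00", "2009-10-05 10:15"),
    ("assessment", "2009-10-05 10:15", "2009-10-05 16:05"),
]

FMT = "%Y-%m-%d %H:%M"
durations = defaultdict(list)
for step, start, end in events:
    delta = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    durations[step].append(delta.seconds / 60)   # minutes per task

for step, mins in durations.items():
    print(f"{step:10s} avg cycle time: {sum(mins)/len(mins):6.1f} min")

# The step with the longest average cycle time is the bottleneck.
bottleneck = max(durations, key=lambda s: sum(durations[s]) / len(durations[s]))
print("bottleneck:", bottleneck)
```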

Also being offered in the Tibco BPM space Wednesday is Business Studio 3.2, a user interface adding capabilities for visually defining an organization's structure and the relationships between different organizational components. A simple user interface in Workspace Lite enables the product to be used more easily by those with impaired vision and/or fine motor control difficulties, Tibco said. Tibco would not disclose pricing information for the three products.

It has been a tough year for Dataupia, but the data-warehousing appliance startup is alive and well following reports of major layoffs and a potential asset sale, according to founder Foster Hinshaw. "Financially, we're in fairly good shape. We're running close to cash break-even," he said. Dataupia's customers have remained with the company, and some have already submitted new orders, Hinshaw said. Dataupia is known for its Satori Server appliance, which combines servers, storage and optimization software, and is compatible with multiple databases.

In response to last year's economic downturn, Dataupia's board and CEO at the time decided to "hunker down," Hinshaw said. They cut staff and shifted focus to selling subsets of Dataupia's technology, versus turnkey appliances, which require more research and development work on the part of vendors, Hinshaw said. At the time, Hinshaw was in the middle of recuperating from a medical condition, although he remained a board member. The cost-cutting moves were short-sighted, according to Hinshaw, who recently assumed the job of CEO. Dataupia's employee head count now stands at about 30 people, and its core technical staff is "one of the strongest in the industry," he said. The company has also received additional funding from its original investors, according to Hinshaw. He declined to reveal the size of the latest investment.

Dataupia will also make new product announcements in several months, according to Hinshaw. He declined to provide details. Hinshaw said he is feeling healthy and plans to remain at the helm for the long term. The company has a difficult road ahead, in the view of analyst Curt Monash of Monash Research. "Unlike numerous other analytic DBMS vendors, Dataupia never seemed to have much in the way of technological differentiation," Monash said via e-mail. "It seems to be little more than a price play, in a sector with vigorous ongoing price competition and even a few appealing free alternatives. I always smile when [vendors] pre-announce something that isn't there yet."

Just a day before a crucial hearing in the patent infringement case between Canadian developer i4i and Microsoft, i4i's top executive said that the injunction that forbids Microsoft from selling Word could be reinstated. Last month, a federal judge barred Microsoft from selling current versions of Word 2003 and 2007 as of Oct. 10, part of the punishment for losing the case brought by Toronto-based i4i in 2007. Microsoft was also hit with $290 million in damages in the case. But after Microsoft warned that sales chaos would result, the U.S. Court of Appeals stayed the injunction earlier this month. "The wording of the court order - it said it was staying the injunction 'pending appeal' - is not a highly-specific order," said Loudon Owen, i4i's chairman, in an interview today. "We're awaiting its interpretation." Owen said that it's unclear whether the wording could be taken to mean that the stay would hold until the end of the appeals process, or perhaps only until the three-judge panel hears oral arguments tomorrow. "This is the classic [phrasing] for a stay, but it leaves a great deal of discretion in the hands of the judges," Owen added.

Owen declined to say whether i4i's lawyers would bring up the injunction or the wording of the stay order during the oral hearing slated for Wednesday in Washington, D.C. But he dismissed Microsoft's warning that the injunction might force it to pull Word 2003 and Word 2007, as well as the associated suites Office 2003 and Office 2007, off the market for months while it removed the "custom" XML feature that's at the center of the legal dispute. "If we look at the record, Microsoft has had extensive time to make modifications to Word," said Owen. "We filed [the lawsuit] in March of 2007, and said then that we would seek an injunction. The jury verdict was in May. Microsoft has had ample time." Owen also declined to comment on how long i4i thought it would take Microsoft to revise Word. "We haven't seen the source code," he acknowledged. "But Microsoft's apocalyptic prediction was unfair." Two months ago, a long-time patent attorney said he thought Microsoft could easily make a technical fix to Word, then sell the new version in the U.S. According to the original injunction, Microsoft is not required to update copies of Word 2003 and 2007 already in customers' hands. Hewlett-Packard and Dell, the top two PC makers worldwide, disagreed with the attorney's belief. The two OEMs, who asked to be granted "friend of the court" status in the appeal, said that changes to Word would "require extensive time- and resource-consuming retesting" on their part. Many new computers come with Microsoft's Office or a trial version of the productivity suite; HP and Dell said they would have to rebuild the disk images they use to factory-install software on their new PCs. According to i4i, Microsoft began adding XML editing and custom XML features to Word shortly after meeting with the company in 2001. Microsoft has denied the charge, saying i4i distorted the facts. "After a handful of meetings that weren't fruitful, i4i and Microsoft went their separate ways and Microsoft later released the custom XML functionality for Word that it had told i4i it was developing," Microsoft's lawyers said in a brief filed last week. Owen refused to speculate about what i4i hoped to get out of tomorrow's hearing, other than to say, "We expect a fair hearing." He also dodged questions about what i4i would do if the appeals court overturned the jury verdict. "It's hard to look past the appeal," he said, but promised that if Microsoft is granted a retrial - something the American company has asked for at minimum - i4i would continue the battle. "This is certainly an important case to us," Owen said, "but it's also important to any inventor or entrepreneur who invents technology." Both Microsoft and i4i have promised to comment after tomorrow's hearing.

The National Institute of Standards and Technology (NIST) Special Publication (SP) 800-53 provides a unified information security framework to achieve information system security and effective risk management across the entire federal government. In this second of four articles about the latest revision of this landmark Special Publication from the Joint Task Force Transformation Initiative in the Computer Security Division of the Information Technology Laboratory, Paul J. Brusil reviews the framework for risk management offered in SP 800-53, Recommended Security Controls for Federal Information Systems and Organizations, Rev. 3, which was prepared by a panel of experts drawn from throughout the U.S. government and industry. Everything that follows is Brusil's work with minor edits.

Part 1: NIST SP 800-53 Rev. 3: Key to Unified Security Across Federal Government and Private Sectors

* * *

The Risk Management Framework in SP 800-53 (Chapter 3) invokes NIST document SP 800-39, Managing Risk from Information Systems: An Organizational Perspective, to specify the risk management framework for developing and implementing comprehensive security programs for organizations.

SP 800-39 also provides guidance for managing risk associated with the development, implementation, operation, and use of information systems. The risk management activities are detailed across several NIST documents (as identified in SP 800-53, Figure 3-1), of which SP 800-53 is only one. The risk management activities within the Risk Management Framework comprise six steps:

1) Categorizing information and the information systems that handle the information.
2) Selecting appropriate security controls.
3) Implementing the security controls.
4) Assessing the effectiveness and efficiency of the implemented security controls.
5) Authorizing operation of the information system.
6) Monitoring and reporting the ongoing security state of the system.

SP 800-53 focuses primarily on step 2: security control selection, specification and refinement. It is intended for new information systems, legacy information systems and external providers of information system services. To start the risk management process, each organization uses other mandatory, NIST-developed government standards.
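For readers who think in code, the six steps amount to an ordered, repeating sequence. A trivial sketch (the step names paraphrase the list above; the code itself is illustrative and not from SP 800-53):

```python
# Illustrative only: the six Risk Management Framework steps as an
# ordered sequence. SP 800-53 itself addresses step 2; monitoring
# (step 6) feeds back into the earlier steps, making the framework
# a continuous cycle rather than a one-shot checklist.
RMF_STEPS = (
    "1. Categorize information and information systems",
    "2. Select security controls",          # SP 800-53's primary focus
    "3. Implement the security controls",
    "4. Assess control effectiveness and efficiency",
    "5. Authorize information system operation",
    "6. Monitor and report the ongoing security state",
)

for step in RMF_STEPS:
    print(step)
```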

These other standards are Federal Information Processing Standard (FIPS) 199, Standards for Security Categorization of Federal Information and Information Systems, and FIPS 200, Minimum Security Requirements for Federal Information and Information Systems. One standard helps to determine the security category of each of an organization's information and information systems. The other is used to designate each information system's impact level (low-impact, moderate-impact or high-impact); the impact level identifies the significance that a breach of the system has on the organization's mission. Companion guidelines in another NIST recommendation, SP 800-60, Guide for Mapping Types of Information and Information Systems to Security Categories, Rev. 1, facilitate mapping information and information systems into categories and impact levels. SP 800-53 summarizes the categorization activities in Section 3.2, and details the security control selection activities in Section 3.3. In brief, a minimum set of broadly applicable baseline security controls (SP 800-53, Appendix D) is chosen as a starting point for security controls applicable to the information and information system. Each organization then chooses security controls commensurate with its specific information and its specific information system's risk exposure, using typical factors such as identifying vital threats to systems, establishing the likelihood a threat will affect the system and assessing the impact of a successful threat event.

SP 800-53 specifies three groups of baseline security controls that correspond to the low-impact, moderate-impact and high-impact information system categories defined in FIPS 200. The baseline security controls are selected by an organization based on the organization's approach to managing risk, as well as on security category and worst-case impact analyses in accordance with FIPS 199 and FIPS 200. The intent of establishing different target impacts is to facilitate the use of appropriate and sufficient security controls that effectively mitigate most risks encountered by a target with a specific level of impact. Then, as needed based on an organization's specific risk assessment, local conditions and environments, or specific security requirements or objectives, these minimal baseline security controls can be tailored, expanded or supplemented to meet all of the organization's security needs. SP 800-53 gives guidance to organizations on the scope of applicability of each security control to the organization's specific situation, including, for example, the organization's applicable policies and regulations, physical facilities, operational environment, IT components, technologies, and/or exposure to public access interfaces. Tailoring activities include selecting organization-specific parameters in security controls, assigning organization-specific values to parameters in security controls and assigning or selecting appropriate, organization-specific control actions. If the tailored security control baseline is not sufficient to provide adequate protection for an organization's information and information system, additional security controls or control enhancements can be selected to meet specific threats, vulnerabilities, and/or additional requirements in applicable regulations. Augmentation activities include adding appropriate, organization-specific control functionality or increasing control strength.
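The selection logic described above is mechanical enough to sketch in code. The rule below follows the FIPS 199/200 "high-water mark" practice, under which a system's overall impact level is the worst of its confidentiality, integrity and availability impacts; the code is a simplified illustration, not a compliance tool:

```python
# Simplified illustration of FIPS 199/200 impact categorization and
# SP 800-53 baseline selection. Real categorization is done per
# information type and involves far more judgment than shown here.

LEVELS = {"low": 0, "moderate": 1, "high": 2}

def system_impact(confidentiality, integrity, availability):
    """High-water mark: overall impact is the worst of the three."""
    return max(confidentiality, integrity, availability, key=LEVELS.get)

def baseline_for(impact):
    """Pick the SP 800-53 baseline matching the impact level; tailoring
    and supplementing (per the article) would then adjust it."""
    return f"SP 800-53 {impact}-impact baseline (Appendix D)"

impact = system_impact("moderate", "low", "high")
print(impact)                # -> high
print(baseline_for(impact))  # starting point before tailoring
```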

As a last resort, an organization can select security controls from a source other than SP 800-53. This option is possible if suitable security controls do not exist in SP 800-53, if appropriate rationale is established for going to another source and if the organization assesses and accepts the risk associated with use of that source. An organization-specific security plan is then developed. The plan documents the rationale for selecting and tailoring each security control. Such rationale is used to provide evidence that the security controls adequately protect organizational operations and assets, individuals, other organizations and, ultimately, the nation. Subsequent analyses of the risk management decisions documented in the security plan become the bases for authorizing operation of the organization's information system. A designated senior official grants such authorization.

After authorizing operation, the organization begins continuous monitoring of the effectiveness of all security controls. Such monitoring facilitates potential future decisions to modify or update the organization's security plan and the deployed security controls. Modification and update may be necessary to handle information system changes and/or updates, new configurations, operational environment changes, new types of security incidents, new threats and the like. Depending on the severity of adverse impacts on the organization, the revised security plan may need to be used to re-authorize operation of the information system.

SP 800-53 also defines 11 organization-level, program management security controls (Appendix G) for managing and protecting information security programs. Organizations document selected program management controls in an Information Security Program Plan. This plan is implemented, assessed for effectiveness via assessment procedures documented in NIST document SP 800-53A, Guide for Assessing the Security Controls in Federal Information Systems - Building Effective Security Assessment Plans, and subsequently authorized and continuously monitored. In the next part of this four-part series, Brusil discusses the comprehensive repository of security controls presented in SP 800-53 Rev. 3.

* * *

Small and midsize businesses are confident in their disaster recovery capabilities, but their actual performance preventing outages shows they are "remarkably unprepared," according to survey results released Monday by Symantec. Four out of five SMBs are satisfied with their disaster-recovery plans, and two-thirds believe their customers would be willing to "wait patiently until our systems were back in place" in the event of an outage, Symantec found. But that confidence is unwarranted.

The report is a follow-up to Symantec's annual Disaster Recovery Research Report released last summer, which found that the average cost of executing and implementing a recovery plan amounted to $287,600 for each downtime incident. Survey respondents included 1,657 companies worldwide, including both SMBs (companies with 10 to 499 employees) and their customers. Three out of four SMBs report that they are based in a region susceptible to natural disasters, and the average respondent suffered three outages in the past 12 months, whether from natural disasters, power outages, or virus and hacker attacks. "With this kind of exposure, and with the confidence SMBs display about their disaster preparedness, one would think SMBs have solid disaster-recovery plans in place," Symantec writes in the SMB Disaster Preparedness report. "However this is not universally the case - almost half (47 percent) report they do not yet have a plan to deal with such disruptions." This week's SMB study found that in some areas, respondents showed "an alarming lack of readiness," according to Symantec. "First, the average SMB backs up only 60 percent of its company and customer data," Symantec writes. "Second, they do so infrequently." Only 23 percent back up on a daily basis, and 40 percent back up monthly or less. "This inattention to data backup is echoed by the fact that more than half (55 percent) of the SMBs feel they would lose 40 percent of their company data if their computing systems were wiped out in a fire." This lack of preparedness puts SMBs at risk of losing customers.

More than a quarter of customers had suffered outages, many of which were significant. Forty-two percent of outages reported by SMB customers lasted eight hours or more, and 26 percent of customers reported losing data because of a vendor's outage. Customers said the estimated cost of outages averaged $15,000 per day. Two out of five SMB customers surveyed by Symantec have switched vendors because they decided their vendor's technology was unreliable.

Symantec offered several recommendations to SMBs looking to bolster their disaster-recovery preparedness. First, SMBs should determine what critical information should be secured and protected, giving priority to customer, financial and business information, and trade secrets. SMBs should also automate the backup process to minimize human error, and test systems annually to ensure that data can be recovered and downtime minimized during a disaster.
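Automation can be as simple as a scheduled script. A minimal, hypothetical sketch of a daily backup job follows; the paths are placeholders, and a real plan would add offsite copies, retention and restore testing:

```python
# Minimal daily-backup sketch in the spirit of the advice above:
# automate to avoid human error. Paths are placeholders; schedule the
# script via cron or Task Scheduler to run once a day.
import tarfile
from datetime import date
from pathlib import Path

SOURCES = [Path("/srv/customer-data"), Path("/srv/finance")]  # critical data first
DEST = Path("/backups")

def run_backup():
    DEST.mkdir(parents=True, exist_ok=True)
    archive = DEST / f"backup-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        for src in SOURCES:
            if src.exists():
                tar.add(src, arcname=src.name)
    # Periodic restore tests (listing the archive, extracting a sample
    # file) belong here too -- backups you never test aren't backups.
    return archive

if __name__ == "__main__":
    print("wrote", run_backup())
```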

Until yesterday, signing up for a Google Voice account required you to pick a new phone number - not a pleasant option for those who have kept the same digits for years. Now Google has enabled users to keep their existing phone numbers and get (most of) the features Google Voice offers, including Google's excellent voicemail service. When you sign up for Google Voice - which is still not widely available to the public (you need to get an invite or request one) - you can either choose a Google one-stop phone number or keep your own for a more pared-down experience. Keeping your old digits gives you:

- Online, searchable voicemail
- Free automated voicemail transcription
- Custom voicemail greetings for different callers
- Email and SMS notifications
- Low-priced international calling

Going for the full-throttle Google experience gives you all of the above plus:

- One number that reaches you on all your phones
- SMS via email
- Call screening
- Listen In
- Call recording
- Conference calling
- Call blocking

If you already have a Google Voice number, you can add the voicemail option to any mobile phone associated with the account.

Some of the benefits are explained in Google's YouTube video. Since voicemails are transcribed and placed online, even made publicly available for sharing purposes, there has been some danger of said voicemails appearing in search results. Happily, Google circumvented this problem earlier this month. These new features are both freeing and limiting: you can keep your number but sacrifice some of the goodies that make Google Voice a powerful contender in the telephony business. Full number portability is likely coming in the future, after, of course, Google deals with AT&T, Apple, and the FCC. But some have high hopes that eventually the opposition will grow to accept and embrace Google Voice.

The biggest issue facing Nortel enterprise customers on the heels of Avaya's $900 million purchase of that business is product overlap, consolidation and subsequent support, analysts say. Avaya emerged as the winning bidder for Nortel's enterprise business Monday, beating out Siemens Enterprise Communications. Now comes the uneasy task of sifting through the product portfolio and eliminating redundancies - an ordeal that could leave Nortel and even Avaya users with a shortened life span on their investments. "There may be some surprises there," says Bob Hafner, an analyst with Gartner. "These are going to be two large companies coming together. It's not the easiest thing to do. These things never go without issues, problems or concerns."

Significant overlap is expected in the IP telephony/unified communications portfolios of both companies - IP PBXs, handsets and call management software. Avaya is the leading revenue market share vendor in enterprise telephony, according to Dell'Oro Group, while Nortel is No. 4. Less overlap will be found in routers, switches and other infrastructure products, where Nortel has a larger market share and installed base than Avaya. "The biggest issue for users is, 'Show me the [product] road map,'" says Henry Dewing of Forrester Research. "They want to see hardcore product plans [and] how they are going to actually consolidate product lines." Avaya has pledged near-term support for the Nortel enterprise products, including those serviced by Verizon, a Nortel reseller. Verizon filed motions last week seeking assurances that Avaya would continue to support the Verizon accounts, which the carrier says include many federal law enforcement agencies. "I'd be surprised if that issue doesn't work itself out," says IDC analyst Abner Germanow of the Verizon/Avaya impasse. "I'd have a hard time believing they'd leave the U.S. government out to dry." Nonetheless, Germanow is advising Nortel customers to accelerate any assessment or planning activities in light of the Avaya takeover. "They should figure out where their own needs lie and how to most effectively migrate," he says. "They should hold companies to their multi-vendor visions - that open means open." Gartner's Hafner agrees. "Customers need to pay attention to what's going on in the [merged] organization" to detect any potential distractions or turf battles that may adversely affect them, he says.

Cisco this week unveiled a blade addition to its Nexus line of data center switches that's designed to aggregate multiple physical x86 blade servers from various vendors into a 10G Ethernet fabric. The Nexus 4000 is the first blade switch in the Nexus line, which also includes the Nexus 7000 core switch, the 5000 top-of-rack switch, the 2000 fabric extender and the 1000V software-based virtual switch. The 4000 is intended to fit inside a blade server system enclosure and aggregate multiple 1G server NIC connections into a 10G pipe for connection to and from the Nexus 5000 and 7000 top-of-rack and core switches. The Nexus 4000 supports the same NX-OS converged LAN/SAN operating system as the rest of the Nexus family and Cisco's MDS SAN switch line.

The Nexus 4000 supports Fibre Channel and Fibre Channel over Ethernet (FCoE), as well as IP-based iSCSI and network-attached storage, over Ethernet Data Center Bridging specifications for converged LAN and storage access from the server. This is intended to provide consistency across the data center as well as scale, high availability, fault tolerance and uniform management, Cisco says. The switch features a specialized ASIC for low latency and lossless operation in a virtualized environment, Cisco says. It can work in conjunction with Cisco's Nexus 1000V virtual switch, which resides on blade servers running VMware's ESX 4.0 virtualization software. The 1000V aggregates virtual machine images from a single server, while the 4000 aggregates multiple physical blade servers, Cisco says. The Nexus 4000 will be sold to Cisco's OEM customers, who will rebrand it and then sell it to end users.

Since it is being developed for blade server vendors, Cisco says it will leave product details, availability and pricing up to those particular vendors. Cisco expects its existing base of Catalyst blade switch OEMs to purchase the new Nexus blade switch. The new switch will compete with HP's new 6120XG and 6120G/XG blade switches, and 6- and 10-port BNT switches from Blade Network Technologies, which are resold by IBM. The Nexus 4000 is a small piece of a broader strategy outlined by Cisco for its data center and FCoE initiatives. As part of that strategy, Cisco is positioning its MDS Fibre Channel SAN switches as evolutionary elements in the transition to unified data center switching fabrics. In that vein, Cisco says it plans to unveil FCoE modules for both the MDS and Nexus 7000 switches; a 16Gbps Fibre Channel MDS switch; and an 8Gbps Fibre Channel expansion module for the Nexus 5000 FCoE switch. These will likely come in the first half of 2010, Cisco officials said.

These will be piece parts in Cisco's plan to incrementally evolve data centers to FCoE by starting at the server edge/access point and deepening the immersion into the aggregation and core areas of the data center network. The University of Arizona is embarking on the transformation right now. Eighteen months ago, the school commenced an "enterprise system replacement" project to upgrade its data center networking facilities to better support its HR, student information, financial management, grants management, business management and data warehousing applications. The school needed an infrastructure to support 300 to 400 physical servers, several hundred virtual servers running ESX, and 300 terabytes of storage across 13 different EMC arrays for 55,000 users, says Derek Masseth, senior director for infrastructure services at the University of Arizona. "Our architecture was not going to meet our needs," Masseth says, referring to the school's infrastructure of Cisco Catalyst 6500 switches and MDS 9500 directors in the SAN. With that, the university installed three Nexus 7000s in the core and several Nexus 5010s at the top of server racks. The school also deployed FCoE converged network adapters on the servers, with plans to employ FCoE up to the core Nexus switches, Masseth says. The University of Arizona realized a 50% reduction in capital expenditures and a 30% reduction in power consumption per port with the Nexus deployment, Masseth says.

The school is currently a Dell shop for its blade servers. It is not yet evaluating Cisco's Unified Computing System to further tighten its server, storage, networking and virtualization environments, but plans to give it a close look over the next year. In the meantime, the school plans to decommission its Catalyst 6500 switches in the data center. "We'd like to get to a pure Nexus data center," Masseth says. "We have a very strong desire to be on a single platform."

Sprint is making it clear to software developers that it wants to help them make new applications for its devices, no matter what platform they run on. During its Open Developer Conference, to be held in Santa Clara, Calif., later this month, Sprint says it plans to focus on a wide array of mobile operating systems, including Google's Android, Microsoft's Windows Mobile, Palm's webOS and Research in Motion's BlackBerry operating system. The conference will feature not only Sprint executives but also representatives from HTC, Palm and Google, who will try to teach attendees about best practices for developing software for multiple mobile operating systems. Sprint also says that the conference will address enterprise M2M business solutions and 4G technical development and resources. "Our objective with the [conference] is to create a forum for mobile app developers and those who develop apps for the desktop to get a taste of what's possible for them using the tools that Sprint's developer program provides," says Len Barlik, Sprint's vice president of wireless and wireline services. "With today's operating system platforms, powerful new devices… and the strength of the Sprint Now Network and 4G, the opportunities are limitless." Sprint has become more aggressive over the last year in trying to bring more high-end smartphones onto its network. The carrier scored a modest hit with the Palm Pre this past summer, and the company is slated to release its first smartphone based on Google's mobile Android platform on Oct. 11. The carrier has also expanded its smartphone roster this year to include the BlackBerry Tour and the Palm Centro, although the company shares the rights to sell those phones with Verizon Wireless.

In its search for water on the moon, NASA slammed not one, but two, spacecraft into a deep, dark crater on the lunar south pole this morning. It was a precision operation: NASA successfully nailed a target about 230,000 miles from Earth - twice.

The Lunar Crater Observation and Sensing Satellite, known as LCROSS, separated into two sections last night. Its empty rocket hull, weighing in at more than 2 tons, was the first of the two pieces to slam into the lunar surface, at 7:31 a.m. EDT today. Four minutes later, the rest of the space probe shot through the miles-high plume of debris kicked up by the first impact, gathering data on the matter, and then it too crashed into the lunar surface. Effectively, it was a one-two punch designed to kick up what scientists believe is water ice hiding in the bottom of a permanently dark crater. NASA said it will issue a report on its initial analysis of the probe at 10 a.m. EDT today.

The LCROSS spacecraft, which blasted off from Cape Canaveral Air Force Station in Florida on June 18, went aloft with its companion satellite, the Lunar Reconnaissance Orbiter. As the Atlas V rocket carrying them lifted off, a NASA spokesman called the mission "NASA's first step in a lasting return to the moon." With NASA still hopeful to one day create a viable human outpost on the moon, it would be helpful for anyone there to find water rather than haul it up from Earth. NASA had been promising live images of the impact and resulting debris plume, but the live feed on NASA TV disappeared moments before impact. The Lunar Reconnaissance Orbiter, which has been in orbit around the moon since late June, was 50 kilometers above the moon's surface during this morning's impact. The orbiter is expected to send its own analysis of the debris plume back to Earth later this morning. The LCROSS spacecraft was heavily loaded with scientific gear.

According to NASA, its payload consisted of two near-infrared spectrometers, a visible light spectrometer, two mid-infrared cameras, two near-infrared cameras, a visible camera and a visible radiometer. The instruments were selected to provide mission scientists with multiple views of the debris created by the hull's initial impact. Before it crashed into the moon, LCROSS was transmitting data back to NASA mission control at 1.5 Mbps, NASA noted this morning.

Anyone can record ho-hum video on an iPhone 3GS and shoot it to YouTube. Yet it's possible to record excellent video that looks crisp and colorful, without the typical jerky-camera look. For example, you might want to make a business video of a "talking head" explaining a concept, a sales demo video showing a product up close and personal, or even just a video blog about a news item of the day. The following tips and tricks can make your iPhone videos really pop and garner more attention from your audience of YouTube viewers.

The three main ingredients of a great video are stability, lighting, and sound quality. So, first off, it's important to keep the iPhone perfectly still during video recording. The iPhone is meant for portable video, so to stabilize a shoot I use the portable Joby Gorillapod Flexible Tripod ($22) and the Zacuto Zgrip iPhone Jr. mount ($70), which holds the iPhone and attaches to the tripod. I use the Zgrip because I can re-position the iPhone for the best viewing angle, and it works with my much-more-stable camera tripod as well. Lighting is also critical for the best video. I use the Litepanels Micro (around $285) and attach it to another camera tripod.

The LitePanel is expensive, but it casts a uniform, video-friendly glow across the faces of my video participants, a glow that looks more like sunlight. Of course, if you're not ready to fork over that much money, try using a house lamp to boost illumination, and remove the shade for the brightest light. While most light bulbs will cast a yellowish light (a good reason to use a photo or video light), you can use your video editor to color-correct lighting problems. Sound is the third ingredient. With the iPhone, it's hard to get a good viewing angle for video and record sound close enough at the same time. You could try the included iPhone ear bud set for sound, but the cord is not long enough.

Another option is the Shure SE210 ear buds ($180) with the Shure Music Phone Adapter ($40), which has a long cord. However, your best approach is to use a Samson Zoom H2 Handy Recorder ($200): record audio separately, then sync the video and audio in iMovie on your Mac. Just be sure to record a hand clap, with the iPhone video and Samson audio recorder going at the same time, so you can sync up the video and audio later. Other video tips: if you are making a business video, you can run a teleprompter on your Mac using NovaStorm AquaPrompt ($15) so your subject can read from a script. Also, be sure to include some pre-roll before and after the video so your cuts in iMovie, or on the iPhone 3GS itself, which supports simple editing cuts, don't look too choppy and awkward. And last, don't forget to smile. [John Brandon is a 20-year veteran Mac user who used to run an all-Mac graphics department.]
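That hand clap also lets you line the tracks up by machine rather than by eye: the clap is a sharp spike in both recordings, and the lag that best aligns the two waveforms tells you how much to trim. Here is a minimal Python sketch, assuming you export short WAV clips of both tracks around the clap at the same sample rate; the file names are invented for illustration, and nothing like this is built into iMovie, where you would align by hand.

    # Estimate the offset between the iPhone's scratch audio and the
    # separate recorder's track by cross-correlating around the clap.
    import numpy as np
    from scipy.io import wavfile

    def clap_offset_seconds(iphone_wav, recorder_wav):
        rate_a, a = wavfile.read(iphone_wav)    # iPhone scratch audio
        rate_b, b = wavfile.read(recorder_wav)  # external recorder track
        assert rate_a == rate_b, "resample one clip so the rates match"
        # Mix any stereo down to mono and normalize the amplitudes.
        a = a.mean(axis=1) if a.ndim > 1 else a.astype(float)
        b = b.mean(axis=1) if b.ndim > 1 else b.astype(float)
        a, b = a / np.abs(a).max(), b / np.abs(b).max()
        # Full cross-correlation is slow but fine for clips of a few
        # seconds; its peak marks the shift where the two claps coincide.
        corr = np.correlate(a, b, mode="full")
        lag = corr.argmax() - (len(b) - 1)
        # Positive result: the clap happens that many seconds later in
        # the iPhone clip, so trim that much from its start.
        return lag / rate_a

    print(clap_offset_seconds("iphone_clip.wav", "recorder_clip.wav"))

Trim the indicated amount from whichever track the clap appears in later, and the picture and sound should fall into sync in iMovie.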

The iPhone GPS app market unleashed by the release of the iPhone 3.0 software update is getting more interesting by the day, with several developers in an arms race to add new features to their initial offerings. Taking the lead in the GPS app race is Navigon MobileNavigator, which recently added support for spoken street names, a major failing in the three apps that I previewed in a Macworld Video last month. My own in-car navigation box doesn't even speak street names (other than numbered freeways), and the feature sure makes a big difference.

Last week, I got to spend a little bit of time with Navigon's Johan-Till Broer, who showed me the next version of MobileNavigator, due as a free App Store update sometime in October. It adds live traffic to the party, downloading traffic updates over the digital cell network and rerouting you around slow spots. The update also does a better job of estimating the speeds of various roads when live traffic data isn't available. The end result should be that MobileNavigator will do a better job of suggesting the fastest route to your destination, based on both current conditions and the time of day you're traveling. Sygic, maker of the Sygic Mobile Maps GPS navigation app, recently updated its app to support spoken street names, as well as catching up with the other apps by integrating the addresses of the contacts in your iPhone's address book. I've found Sygic Mobile Maps to be a solid app, although it feels more like a port of a standalone GPS device than a native iPhone app.
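The fastest-route behavior Navigon describes above, weighing both live conditions and the hour you set out, is at heart a shortest-path search in which each road's cost depends on when you reach it. Navigon hasn't published MobileNavigator's internals, so the Python toy below, with an invented two-road map, illustrates only the general idea.

    # Time-dependent Dijkstra: a road's travel time is a function of the
    # moment you enter it, so rush hour changes which route wins.
    import heapq

    def fastest_route(graph, start, goal, depart):
        # graph: {node: [(neighbor, travel_time_fn), ...]}, where
        # travel_time_fn(t) gives minutes to traverse if entered at time t.
        best = {start: depart}
        queue = [(depart, start, [start])]
        while queue:
            t, node, path = heapq.heappop(queue)
            if node == goal:
                return t, path
            for nxt, travel_time in graph[node]:
                arrive = t + travel_time(t)
                if arrive < best.get(nxt, float("inf")):
                    best[nxt] = arrive
                    heapq.heappush(queue, (arrive, nxt, path + [nxt]))
        return None

    # Toy map: the highway crawls during the morning rush; side streets don't.
    rush = lambda t: 30 if 480 <= t <= 570 else 10  # t in minutes after midnight
    graph = {"home": [("highway", rush), ("sidestreet", lambda t: 18)],
             "highway": [("office", lambda t: 5)],
             "sidestreet": [("office", lambda t: 5)],
             "office": []}
    print(fastest_route(graph, "home", "office", depart=8 * 60 + 30))

Departing at 8:30 a.m., the side streets win in this toy; leave at 10 a.m. and the highway does. Live traffic data refines exactly this kind of time-of-day estimate.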

Sygic, however, can't be beat on price: the company is trying to drive sales of its updated app by reducing the price (temporarily, at least) to $40 for an app containing only United States maps and $60 for the app containing maps of all of North America. While Navigon and Sygic are not familiar names to most Americans, TomTom is a strong brand and its iPhone app has sparked a lot of interest, although the iTunes charts would suggest that it may have fallen behind Navigon in terms of sales. TomTom's long-promised car kit for the iPhone, which includes a mount, speaker, and improved GPS reception, has yet to arrive here in the States. (Our friends at Macworld UK are reporting that the car kit is available for order on that side of the Atlantic, with shipping times listed as "two to three weeks.") As for the TomTom app, the company promises "several updates by the end of 2009," but hasn't given details. Presumably spoken street names and live traffic are high on the agenda. Look for a comprehensive comparison of iPhone GPS apps from Macworld in the near future.

Reviewing these apps is hard, requiring a lot of driving (and a dedicated driver so the reviewer doesn't cause an accident!), and the apps themselves keep updating at a rapid pace. In the meantime, check out my video above if you'd like to see the apps in action. From my perspective, right now Navigon MobileNavigator is the best choice available, but this game is far from over.

After a kick in the pants from the leader of the Linux driver project, Microsoft has resumed work on its historic driver code submission to the Linux kernel and avoided having the code pulled from the open source operating system. Microsoft's submission includes 20,000 lines of code that, once added to the Linux kernel, will provide the hooks for any distribution of Linux to run on Windows Server 2008 and its Hyper-V hypervisor technology. The submission was greeted with astonishment in July when Microsoft made the announcement, which included releasing the code under a GPLv2 license Microsoft had criticized in the past. Greg Kroah-Hartman, the Linux driver project lead who accepted the code from Microsoft in July, on Wednesday called out Microsoft on the linux-kernel and driver-devel mailing lists, saying the company was not actively developing its hv drivers. (HV refers to Microsoft Hyper-V.)

"Unfortunately the Microsoft developers seem to have disappeared, and no one is answering my emails. So sad...," he wrote on the mailing lists and on his blog. "If they do not show back up to claim this driver soon, it will be removed in the 2.6.33 [kernel] release." Kroah-Hartman said calling out specific projects on the mailing list is a technique he uses all the time to jump-start those that are falling behind. Thursday, however, in an interview with Network World, Kroah-Hartman said Microsoft got the message. "They have responded since I posted," he said, and Microsoft is now back at work on the code it pledged to maintain. "This is a normal part of the development process. They are not the only company."

In all, Kroah-Hartman specifically mentioned 25 driver projects that were not being actively developed and faced being dropped from the main kernel release 2.6.33, which is due in March. He said the driver project was not a "dumping ground for dead code." However, the nearly 40 projects Kroah-Hartman detailed in his mailing list posting, including the Microsoft drivers, will all be included in the 2.6.32 main kernel release slated for December. On top of chiding Microsoft for not keeping up with code development, Kroah-Hartman took the company to task for the state of its original code submission. "Over 200 patches make up the massive cleanup effort needed to just get this code into a semi-sane kernel coding style (someone owes me a big bottle of rum for that work!)," he wrote. Kroah-Hartman says the kernel has coding style guidelines and that Microsoft's code did not match them. "That's normal and not a big deal. It happens with a lot of companies," he said. But the large number of patches did turn out to be quite a bit of work, he noted.

He said Thursday that Microsoft still has not contributed any patches around the drivers. "They say they are going to contribute, but all they have submitted is changes to update the to-do list." Kroah-Hartman says he has seen this all before and seemed to chalk it up to the ebbs and flows of the development process.

IBM on Monday will announce a BI (business intelligence) and planning suite aimed at midsized companies that need more insight into their business than a spreadsheet can provide, but not the complexity of an enterprise-level product. Dubbed Cognos Express, the applications are meant for businesses with between 100 and 999 workers, said Ben Plummer, general manager of the IBM Cognos midmarket business unit. The suite consists of three modules that can be bought separately or as a unit: a reporting tool offers drag-and-drop-style report creation and ad hoc querying; an "Advisor" module is used for forecasting and "what-if" scenario analysis; and a product called Xcelerator allows users to crunch and visualize data with an in-memory analytics engine through the Microsoft Excel interface.

Enterprise-class BI systems are much more complex, given factors such as the need to juggle and integrate various data assets from acquired companies, Plummer said. "It's not that they want it to be complicated, it's complicated by default." Meanwhile, BI vendors have historically sold medium-size companies repackaged versions of enterprise-grade software, he said. "They try to cram an enterprise-size package into a midmarket-size box," he said. "It's like a jack-in-the-box. When [midsize customers] open it up, it pops out and scares them." Pricing for Cognos Express starts as low as US$12,000 for one module and five users. IBM claims the package can be deployed in "a matter of hours." It is managed through a Web-based console that reduces dependency on help from IT staffers, according to IBM. IBM is not ruling out delivering Cognos Express as a SaaS (software as a service) offering in the future, but for now is sticking with on-premises deployments, according to Plummer. "Companies of this size, they say, 'can you just bring it in and put it on the server I have?'" he said. An initial Cognos Express customer is Wood Ranch BBQ & Grill, a California restaurant chain with 13 locations. Wood Ranch is investing in about 25 Cognos Express licenses due to business growth and a desire to gain better insight into its finances amid a bleak economy, according to Mark Quandt, vice president of finance. Margins in restaurants are thin to begin with, and eating out is a discretionary expense for consumers, he said. "To the extent you've got more visibility into the business, you're going to come out ahead," Quandt added. "We felt this is the way we needed to go." Wood Ranch has been working with all three modules for about two weeks, and will use the software to analyze data from its point-of-sale system.

The POS system generates reports that do "a good job of overall sales, and your sales mix, but you're stuck into their templates. If you wanted to work with the data, the only option was to export into Excel," Quandt said. "Excel is a great personal tool ... [but] with 13 restaurants we got to a point where we wanted to get a little more sophisticated in [our] analysis." However, IBM's decision to employ the Excel interface in the Xcelerator module is a plus, given the familiarity Wood Ranch's workers have with it, Quandt said. Under Wood Ranch's corporate structure, each restaurant is considered an independent company from a legal standpoint, and consequently each has its own database, Quandt said. The Cognos Express suite will help the chain pull together all the data and conduct modeling and forecasting, he said. Wood Ranch has only two IT workers, but Quandt does not expect he'll need to bring on additional resources to run Cognos Express.
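For a sense of what pulling the data together looks like mechanically, here is a hedged sketch in Python with pandas rather than Cognos Express itself: one CSV export per restaurant is merged, a per-location margin is computed, and a single what-if scenario is applied. The folder, the column names and the 5% figure are all invented for illustration.

    # Merge one POS export per restaurant, then run a simple what-if.
    import pandas as pd
    from pathlib import Path

    # Each restaurant keeps its own database, so assume one CSV export
    # apiece, with columns: date, item, revenue, food_cost (hypothetical).
    frames = [pd.read_csv(p).assign(location=p.stem)
              for p in Path("pos_exports").glob("*.csv")]
    sales = pd.concat(frames, ignore_index=True)

    # Baseline margin by location.
    by_loc = sales.groupby("location")[["revenue", "food_cost"]].sum()
    by_loc["margin_pct"] = 100 * (by_loc["revenue"] - by_loc["food_cost"]) / by_loc["revenue"]

    # What-if scenario: food costs rise 5% while revenue stays flat.
    scenario = by_loc.copy()
    scenario["food_cost"] *= 1.05
    scenario["margin_pct"] = 100 * (scenario["revenue"] - scenario["food_cost"]) / scenario["revenue"]

    print(by_loc["margin_pct"].round(1))
    print(scenario["margin_pct"].round(1))

A dedicated BI tool adds what this sketch lacks, shared dashboards, drill-down, and an interface finance staff can use without writing code, which is the gap IBM says Cognos Express fills.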

Along with e-learning and content management software, midmarket companies are expected to invest heavily in BI as the economy improves and IT budgets recover, said Forrester Research analyst Tim Harmon. He praised Cognos Express on its technical merits, but said competing midmarket offerings from Tibco and Qliktech are of comparable quality. But IBM may have an edge given its strong channel, according to Harmon. "The additional margins and incentives that are available [to partners] with this product are not insignificant," he said. Partners will also be able to fine-tune the product for verticals, a job that becomes more granular as companies get smaller, according to Harmon.

IBM Thursday announced upgrades to and a roadmap for its 15-month-old Lotus Symphony suite of productivity tools, emphasizing it indeed offers an alternative to Microsoft Office.

The move comes after Microsoft recently said that a court order to remove Office from store shelves next month could leave consumers and businesses "stranded without an alternative set of software."

Microsoft is battling a patent infringement case brought by Toronto-based i4i over XML file formats. The 2007 case resulted in a $290 million judgment against Microsoft and an injunction that bars it from selling Word 2003 and Word 2007 after Oct. 10 unless the offending technology is removed.

"What we are trying to do with Symphony is establish that there is an option in the market and companies don't have to spend the money they spend for productivity suites," says Ed Brill, director of product management for Lotus Software.

Along with Symphony, Google Docs and OpenOffice are other productivity suite alternatives to Microsoft's Office, which dominates market share and is a revenue gold mine for the company.

"Symphony is not a product that we just threw out there," said Brill. "We have been investing in an on-going basis."

IBM plans to release Symphony 2.0 in 2010, the same timeframe in which Microsoft plans to ship the next version of Office. Code-named Vienna, Symphony 2.0 will be based on the most recent version of OpenOffice.

But for now IBM, which offers Symphony as a free download and as the default productivity software in Notes/Domino 8, is adding a new set of drag-and-drop widgets that include integration with popular Microsoft back-end software such as SharePoint Server. The software also integrates with Google Gadgets and Lotus's own Sametime and Connections platforms. Part of the widget package is the OrgChart Widget, which integrates with profiles in Lotus Connections so users can be added into meetings that convene online with a single click.

Other widgets include the Learning Widget, which combines local and Web-based information; the Team Workspace Widget, which provides access to documents stored in Lotus Quickr or Microsoft SharePoint; the Symphony 2 Wiki Widgets, which convert documents for publishing on wikis; the Treasure Box Widget, which keeps a "favorites list" inside Symphony of frequently used documents, graphics and applications; and the Export Graphic Widget, which exports common formats such as .gif, .jpeg, .png and .bmp.

In addition, the ChartShare Widget provides screen sharing for up to 20 people, with support for co-creation and editing of presentations, integration with Lotus Sametime Unyte Live's meeting capability, presence information on every contributor to the presentation, and a link to instant messaging.

The widgets work with Symphony 1.3, which features support for Microsoft Office 2007 file formats such as .docx, .xlsx and .pptx. The .docx format is part of the ongoing i4i patent infringement suit against Microsoft.

Symphony is available as a free download from the IBM Web site for Mac, Windows, Ubuntu Linux, Red Hat Linux and Suse Linux.

IBM offers flat-fee support contracts to large corporate users for $26,000 per year.

Catering to the growing need for parallel programming, Microsoft is building capabilities for it into both the existing Visual Studio 2008 development platform and its planned successor, Visual Studio 2010.

With parallel programming, developers must accommodate multiple CPU cores instead of programming for a single core.  In a blog post late last Friday evening, S. "Soma" Somasegar, group vice president for the Microsoft Developer Division, outlined capabilities including SOA debugging and an add-in for Visual Studio 2008 for debugging MPI (Message Passing Interface) programs.

An SOA debugger is planned for Visual Studio 2010, geared to the Cluster SOA capability introduced in Windows HPC (High-Performance Computing) Server 2008. Cluster SOA features a parallel programming model.

"Up until now, debugging Cluster SOA was limited to basic WCF/.Net style debugging with no cluster integration.  In Visual Studio 2010, an add-in for Cluster SOA enables the SOA Settings tab, allowing you to choose a head node, debug nodes and services, deploy runtime libraries, and clean up automatically," Somasegar said.

Multiple MPI capabilities are eyed for developers as well.

"Although Visual Studio 2005 had a simple built-in debugger for MPI programs, it did not provide a full 'F5' experience.  The new add-in for Visual Studio 2008, which is also integrated into Visual Studio 2010, allows you to select a cluster head node, how many cores you want, and hit F5 to debug your MPI program," Somasegar said. Also available for MPI debugging is an add-in from Allinea offering rank-based context switching and other capabilities.

Visual Studio 2010 also will analyze the behavior of a particular MPI rank or node via an MPI profiler, which works through integration with HPC job scheduling and shows line-level profile information.

Meanwhile, HLRS/ZIH, of Stuttgart, Germany, has ported its Marmot MPI analysis tool for Visual Studio, Somasegar said. "Marmot can be used to check the validity of parameters passed to MPI calls and detect irreproducibility, deadlocks, and incorrect management of resources," Somasegar said.
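The deadlock detection such checkers perform targets bugs like mismatched send/receive ordering. Sticking with the illustrative mpi4py notation (Marmot itself checks conventional MPI codes rather than Python), the following two-rank program hangs forever:

    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    peer = 1 - rank  # intended to run with exactly two ranks

    # Both ranks block in recv() waiting for a message the other rank has
    # not sent yet, so neither ever reaches send(): a guaranteed deadlock.
    data = comm.recv(source=peer, tag=0)
    comm.send(f"greetings from rank {rank}", dest=peer, tag=0)

Reversing the order on one rank, sending first and receiving second, breaks the cycle; flagging exactly this kind of mutual wait automatically is the sort of analysis a correctness checker contributes.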

Microsoft previously has touted its Concurrency Runtime technology for Visual Studio 2010 as a boost for multicore programming. It offers  more control over application resources. Language extensions for parallelism also have been eyed. The company has not announced a formal release date for Visual Studio 2010.

Experts say Microsoft's submission Monday of virtualization driver source code to the Linux kernel marks a watershed event in the vendor's understanding of open source's future.

"This is another sign of Microsoft's maturation with respect to open source," says Jeffrey Hammond, an analyst with Forrester Research. "There has been a real set of stepping stones toward a pragmatic and practical embrace of open source. This is like the final capstone."

Microsoft has made code submissions to PHP, given significant financial support to the Apache Foundation and added open source code to its product portfolio in such places as Windows HPC Server, System Center management software and its Visual Studio development tools.

And the vendor operates a Linux/Windows integration lab with partner Novell.

"There is no going back now on their attitude with the GPL. They can no longer say Linux is a cancer when they have 22,000 lines of code in the kernel," Hammond said.

Microsoft says its goal is to become a platform Linux users can turn to.

"As open source is adopted on a range of platforms we need to understand it really clearly to make ourselves one of the best platforms to adopt it on," said Sam Ramji, who runs the Open Source Software Lab for Microsoft and is the company's director of open source technology strategy.

Forrester's Hammond said winners on the Linux side could be all the alternative distributions of the open source operating system. The new Linux kernel drivers give them the tools to run on Microsoft's hypervisor technology - Hyper-V. 

"With Ubuntu, I see a lot of developers picking it up but they have not had the [virtualization] support like Novell and Red Hat," he said.

The driver code that Microsoft open sourced and submitted to the Linux kernel was first developed and certified specifically for Novell's Suse Linux and Red Hat Linux.

"This gives the Red Hat and Debian guys equal access to support and the capability to run in a mixed environment where there is a Windows Server," Hammond said.

Chris Wolf, an analyst with the Burton Group, said Microsoft is deftly positioning itself for the future of mixed corporate networks and emerging cloud infrastructures.

"Microsoft gets where the industry is and they know they can't fragment themselves off from the industry," Wolf said. "Getting in the Linux kernel and broadening support for Linux definitely has to be a key part of their virtualization and cloud strategy."

Microsoft has been building its cloud infrastructure strategy since the introduction of its Azure cloud OS last year. The software is slated to be available in November, a month before the open source virtualization drivers submitted Monday appear for the first time in the Linux kernel.

"It is good stuff from Microsoft and it will be interesting to see how VMware responds," Wolf said.

The Federal Trade Commission today announced a wide-ranging attack on cyber-vultures looking to feast on the current moribund economic situation.

Dubbed "Operation Short Change," the law enforcement sweep announced today includes 15 FTC cases, 44 law enforcement actions by the Department of Justice, and actions by at least 13 states against those looking to bilk consumers through a variety of schemes, such as promising non-existent jobs; promoting overhyped get-rich-quick plans, bogus government grants, and phony debt-reduction services; or putting unauthorized charges on consumers' credit or debit cards.

"Thousands of people have been swindled out of millions of dollars by scammers who are exploiting the economic downturn," said David Vladeck, Director of the FTC's Bureau of Consumer Protection during a press conference today. "Their scams may promise job placement, access to free government grant money, or the chance to work at home. In fact, the scams have one thing in common-they raise people's hopes and then drive them deeper into a hole."

At the heart of Operation Short Change are new FTC cases against companies the agency says have conned consumers out of millions of dollars. In each case, the FTC alleged that the defendants' practices were deceptive or unfair, that they made illegal electronic funds transfers, or that they violated the Telemarketing Sales Rule.

The law enforcement actions include:

-John Beck/Mentoring of America, two principals, and three purported "inventors" marketed three get-rich-quick schemes, duping hundreds of thousands of consumers into paying approximately $300 million. The defendants marketed "John Beck's Free & Clear Real Estate System," "John Alexander's Real Estate Riches in 14 Days," and "Jeff Paul's Shortcuts to Internet Millions." The defendants allegedly made false and unsubstantiated claims about potential earnings for users of these systems. They used frequently aired infomercials to sell the systems for $39.95 and then contacted the purchasers via telemarketing to offer "personal coaching services," which cost several thousand dollars and purportedly would enhance their ability to earn money quickly and easily using the systems. Some consumers also continued receiving unwanted sales calls after they told the defendants' telemarketers to stop calling.

-Wagner Ramos Borges, through a host of front companies, including "Job Safety USA," allegedly systematically targeted people seeking maintenance and cleaning work. Luring job seekers with print and online classified advertisements in newspapers throughout the country, Borges allegedly tricked them into paying $98 for a worthless and needless credential called a "certificate registration number" supposedly so that the consumers could get maintenance or cleaning jobs–jobs that Borges did not provide.

-Grants For You Now and its affiliates and principals operated Web sites such as grantsforyounow.com, grantoneday.org, and easygrantaccess.com that deceived consumers by promising them free government grant money to use for personal expenses or to pay off debt. According to the FTC complaint, after obtaining consumers' credit or debit account information to process a $1.99 fee for grant information, the defendants failed to adequately disclose that consumers would be enrolled in a membership program that cost as much as $94.89 a month. Some consumers also were charged a one-time fee of $19.12 for a third-party "Google Profit" program. All the defendants' Web sites falsely offered a "100% No Hassle Money Back Guarantee."

-Cash Grant Institute and its principals allegedly waged an automated robocall campaign promoting bogus claims that consumers were qualified for grant money from the government, private foundations, and wealthy individuals that they could use to overcome their financial problems. They made similar misleading claims about "free grant money" on their Web sites, cashgrantsearch.com and requestagrant.com.

-Mutual Consolidated Savings, its affiliates, and principals used telemarketing robocalls and the Internet to push a phony "Rapid Debt Reduction" program to consumers in the United States and Canada, according to the FTC complaint. The defendants allegedly convinced consumers to pay them $690 to $899 for the program by misrepresenting that the program would reduce credit card interest rates, save thousands of dollars and enable consumers to pay off their debt three to five times faster than they could under their current payment schedule. The defendants also failed to make good on promises that they would refund the fees paid if consumers' credit card interest rates were not reduced.

-Google Money Tree, its principals, and related entities allegedly misrepresented that they were affiliated with Google and lured consumers into divulging their financial account information by advertising a low-cost kit that they said would enable consumers to earn $100,000 in six months. They then failed to adequately disclose that the fee for the kit would trigger monthly charges of $72.21, the FTC complaint states.

-Penbrook Productions, run by Michael Allen Brooks, promoted a work-at-home scheme online that used spokesperson "Angela Penbrook," and charged $197 for the opportunity to become a "certified" rebate processor, earning as much as $225 per hour. According to the FTC complaint, after purchasing, consumers discovered that the work-at-home "opportunity" had nothing to do with processing rebates, but merely instructed the consumers about becoming an affiliate marketer. Despite Penbrook's "100% Ironclad, 3-month 'Make Money Or It's Free,' Triple Satisfaction Guarantee," consumers then found that they could not get a refund. The defendants thus misrepresented that consumers would be hired as rebate processors, made false earnings claims, and misrepresented the refund guarantee.

-Classic Closeouts illegally made unauthorized charges and debits to consumers' accounts months or years after they bought low-cost clothing or household goods from classiccloseouts.com, the FTC charged. The charges usually ranged from $59.99 to $79.99, and Classic Closeouts charged some consumers' accounts multiple times. Consumers' efforts to contact the defendants to contest the charges were unsuccessful. Many consumers also disputed the charges with their financial institutions. After the financial institutions reversed the unauthorized charges, the defendants contested these disputes, falsely claiming that consumers had chosen to join the Classic Closeouts "frequent shopper club."

The FTC recently issued a warning that economic-oriented phishing activities were growing.

Specifically, the FTC said it was urging caution regarding e-mails that look as if they come from a financial institution that recently acquired a consumer's bank, savings and loan, or mortgage. In many cases, such e-mails are only looking to obtain personal information - account numbers, passwords, Social Security numbers - to run up bills or commit other crimes in a consumer's name, the FTC stated.

The FBI Cyber Crime Task Force added it had arrested the first person under a new computer fraud law that makes it a federal crime to commit extortion relating to unauthorized access of, or damage to, a protected computer system or to impair the confidentiality of information obtained from that computer. In this case the person was trying to exploit customers of flailing insurance giant AIG.

In addition, ScanSafe and other Web security watchers are reporting a big uptick in the number of hackers using the Bank of America brand in a phishing attack that uses the bad economy as a lure.

Perhaps the concern is unfounded, a recent PC World article notes. The article states that more than half of us delete messages from banks and financial institutions without even thinking twice; experts say recipients assume all such messages are part of phishing e-mail scams.

While the Federal Trade Commission does a lot of posturing about how it helps consumers protect their valuable personal information through laws and education, the agency has for the second time in less than a year delayed enforcement of its key identity theft rules, this time until August.

The reasons for the delays are an old tune by now: banks and financial institutions can't get ready for the program, which was originally set to go into effect Nov. 1, 2008. Other groups, such as hospitals and physicians' offices, have complained about the Red Flag requirements, saying they will cost too much to implement. A survey of 100 hospitals by MedPage Today found that they would have to spend over $10,000 to comply with the Red Flag Rule.

Mozilla on Thursday patched 11 vulnerabilities in Firefox, more than half of them labeled "critical."

The update was the first since late April, when Mozilla rushed out a refresh to plug a hole that the company's developers had inadvertently introduced in the Windows version of the browser, and it came just days after the launch of a "tweener" build of the upcoming Firefox 3.5.

Of the 11 flaws fixed in Firefox 3.0.11, six were rated critical, one "high," two "moderate" and two "low" in Mozilla's four-step system.

Three of the six critical bugs were in the browser's rendering and JavaScript engines, a frequent target of Mozilla's patching. "Some of these crashes showed evidence of memory corruption under certain circumstances and we presume that with enough effort at least some of these could be exploited to run arbitrary code," Mozilla said in the advisory for the engine patches, using its now-standard boilerplate language.

One of the patched flaws, an SSL tampering vulnerability that Mozilla ranked "high," was reported by three researchers working for browser rival Microsoft and a fourth at Purdue University. The four - Shuo Chen, Ziqing Mao, Yi-Min Wang and Ming Zhang - co-wrote a paper titled "Pretty-Bad-Proxy: An Overlooked Adversary in Browsers' HTTPS Deployments," which they published May 1 (download PDF).

Other patches prevent hackers from pinching browser cookies, executing JavaScript attack code and spoofing Web addresses.

Thursday's update was the fifth this year for Firefox 3.x, but not the first for Mozilla's browsers this week.

On Monday, Mozilla rolled out Firefox 3.5 Preview, a build the company said is a near-finished version of the official Release Candidate, or RC. Although new-found bugs had delayed the RC's release yet again, Mozilla wanted to get something in testers' hands, and so took the unusual step of delivering the Preview.

At this point, Mozilla has not set a schedule for posting Firefox 3.5 RC, once slated to appear the first week of June. In notes on a status meeting the company held Wednesday, Mozilla simply said it would release the RC "you know, when it's ready."

Firefox 3.5 Preview is being offered only to users who have already installed Beta 4 of the browser upgrade.

Firefox 3.0.11 can be downloaded for Windows, Mac OS X and Linux, but current users can also call up their browser's built-in updater or wait for the automatic update notification, which should pop up in the next 48 hours.