Microsoft's CodePlex Foundation leader soaks in stinging critique

After a stinging critique from a noted expert in establishing consortia, the leader of Microsoft's new CodePlex Foundation says such frank evaluation is welcome because the open source group's structure is a work in progress. The CodePlex Foundation's aim is to get open source and proprietary software companies working together. Sam Ramji, who is interim president of the CodePlex Foundation, was responding to last week's blog by Andy Updegrove, who said the group has a poorly crafted governance structure and looks like a sort of "alternative universe" of open source development.

Updegrove, a lawyer, noted expert on standards, and founder of ConsortiumInfo.org, laid out in a blog post five things Microsoft must change if it wants CodePlex to succeed: create a board with no fewer than 11 members; allow companies to have no more than one representative on the Board of Directors or Board of Advisors; organize board seats by category; establish membership classes with rights to nominate and elect directors; and commit to an open membership policy. Despite the stinging tone of Updegrove's assessment, Ramji says he is thankful for the feedback. "Andy's been incredibly generous with his expertise and recommendations," Ramji says. "It is the kind of input and participation we were hoping to get by doing what is probably non-traditional for Microsoft but not necessarily non-traditional for non-profit foundations, which is to basically launch as a beta. And basically it is re-writable." For instance, Ramji says that the decision to go with only five people on the board came from Microsoft's experience that larger groups often have difficulty with decision making. He adds, however, "There are some best practices [for running the boards of non-profits] that we are not as familiar with as we would want to be." Stephanie Davies Boesch, the foundation's secretary and treasurer, is the only board member with experience sitting on a non-profit's board. Ramji says Updegrove's suggestion to have academic representation on the board was "outstanding. We did not think of that."

And to Updegrove's point on becoming an open membership organization, Ramji says, "our goal is to become a membership organization and Andy has some excellent recommendations for that." He says the fact that Updegrove took the time to respond "in the format that he did is more proof that there is something worth doing here." Ramji compares the Foundation's formation to the early days of a software development project. "We have said in these first 100 days we are looking at everything as a beta. Obviously, there are some areas like contributions and licensing agreements we put a lot of time into but even those can be modified." Microsoft announced the foundation Sept. 10 with a stated goal "to enable the exchange of code and understanding among software companies and open source communities." The company seeded the group with $1 million, and Microsoft employees dominated the interim board of directors and board of advisors. Ramji says the foundation has spent the past couple of weeks listening to feedback in "Twitter messages, email, and phone calls in order to understand what people hope this can be." Within that feedback two patterns have emerged, Ramji says. One is a call for a broad independent organization that can bridge cultural and licensing gaps in order to help commercial developers participate in open source. The other focuses on creating a place where open source .Net developers can gain strong backing. "Look at projects related to Mono, you also can look at NUnit, NHibernate. They feel they have been lacking that strong moral support," Ramji says. "We really feel optimistic that the Foundation could help them gain a higher level of credibility in the open source community." Miguel de Icaza, the founder of the Mono project and the creator of the Gnome desktop, is a member of the Foundation's interim board of directors.

From a high level, Ramji says the Foundation stands as a sort of enabler that helps independent developers, companies and developers working for those companies navigate the nuances and practices of open source development so they can either contribute source code to projects or open source their own technologies. "One suggestion has been that the Foundation should house all the best practices we have seen software companies and open source communities use," said Ramji. "We want to have a place where everyone interested in how to participate can come and read and if they choose they can use our license agreements or can use the legal structure of the Foundation to grant patent licenses and copyrights for developers and derivative works." Those licensing agreements have a distinct focus, Ramji said, on the rights that are related to code that is being contributed and on how to contribute the patent rights on that code. Once those issues are settled, code would be submitted using existing open source licenses. Ramji says the goal is to service multiple projects, multiple technologies and multiple platforms rather than having one specific technology base, which is how most current open source foundations are structured. "It's early days and we have received a lot of good ideas from experts in a variety of fields from law to code to policy; that is what we had hoped for," says Ramji. "Someone wrote it is nice to see Microsoft engaging early on without all the answers and to have the community solve what they would like to see. That is satisfying for me and refreshing to others. This is the right way to proceed."

The six greatest threats to US cybersecurity

It's not a very good day when a security report concludes: Disruptive cyber activities expected to become the norm in future political and military conflicts. But such was the case today as the Government Accountability Office took yet another critical look at U.S. federal security systems and found most of them lacking. From the GAO: "The growing connectivity between information systems, the Internet, and other infrastructures creates opportunities for attackers to disrupt telecommunications, electrical power, and other critical services.

As government, private sector, and personal activities continue to move to networked operations, as digital systems add ever more capabilities, as wireless systems become more ubiquitous, and as the design, manufacture, and service of information technology have moved overseas, the threat will continue to grow."

Within today's report, the GAO broadly outlined the groups and types of individuals considered to be what it called key sources of cyber threats to our nation's information systems and cyber infrastructures. According to the Director of National Intelligence, a growing array of state and nonstate adversaries are increasingly targeting—for exploitation and potential disruption or destruction—information infrastructure, including the Internet, telecommunications networks, computer systems, and embedded processors and controllers in critical industries. From the GAO:

Foreign nations: Foreign intelligence services use cyber tools as part of their information gathering and espionage activities.

Criminal groups: There is an increased use of cyber intrusions by criminal groups that attack systems for monetary gain.

Hackers: Hackers sometimes crack into networks for the thrill of the challenge or for bragging rights in the hacker community. While remote cracking once required a fair amount of skill or computer knowledge, hackers can now download attack scripts and protocols from the Internet and launch them against victim sites. Thus, attack tools have become more sophisticated and easier to use.

Hacktivists: Hacktivism refers to politically motivated attacks on publicly accessible Web pages or e-mail servers. These groups and individuals overload e-mail servers and hack into Web sites to send a political message.

Disgruntled insiders: The disgruntled insider, working from within an organization, is a principal source of computer crimes. The insider threat also includes contractor personnel. Insiders may not need a great deal of knowledge about computer intrusions because their knowledge of a victim system often allows them to gain unrestricted access to cause damage to the system or to steal system data.

Terrorists: Terrorists seek to destroy, incapacitate, or exploit critical infrastructures to threaten national security, cause mass casualties, weaken the U.S. economy, and damage public morale and confidence. The Central Intelligence Agency believes terrorists will stay focused on traditional attack methods, but it anticipates growing cyber threats as a more technically competent generation enters the ranks. However, traditional terrorist adversaries of the United States have been less developed in their computer network capabilities than other adversaries.

Testifying before the Senate Judiciary Committee's Subcommittee on Terrorism and Homeland Security today, FBI Cyber Division Deputy Assistant Director Steven Chabinsky said that while the FBI has not yet seen a high level of end-to-end cyber sophistication within terrorist organizations, it is aware of and investigating individuals who are affiliated with or sympathetic to al Qaeda who have recognized and discussed the vulnerabilities of the U.S. infrastructure to cyber attack; who have demonstrated an interest in elevating their computer hacking skills; and who are seeking more sophisticated capabilities from outside of their close-knit circles. "In addition, it is always worth remaining mindful that terrorists do not require long term, persistent network access to accomplish some or all of their goals.

Rather, a compelling act of terror in cyberspace could take advantage of a limited window of opportunity to access and then destroy portions of our networked infrastructure. The likelihood that such an opportunity will present itself to terrorists is increased by the fact that we, as a nation, continue to deploy new technologies without having in place sufficient hardware or software assurance schemes, or sufficient security processes that extend through the entire lifecycle of our networks," Chabinsky said.

Cyberattacks on U.S. military jump sharply in 2009

Cyberattacks on the U.S. Department of Defense - many of them coming from China - have jumped sharply in 2009, a U.S. congressional committee reported Thursday. Citing data provided by the U.S. Strategic Command, the U.S.-China Economic and Security Review Commission said that there were 43,785 malicious cyber incidents targeting Defense systems in the first half of the year.

In all of 2008, there were 54,640 such incidents. If cyber attacks maintain this pace, they will jump 60 percent this year. That's a big jump. The committee, which is looking into the security implications of the U.S.' trade relationship with China, released its annual report to Congress Thursday, concluding that a "large body of both circumstantial and forensic evidence strongly indicates Chinese state involvement in such activities." "The quantity of malicious computer activities against the United States increased in 2008 and is rising sharply in 2009," the report states. "Much of this activity appears to originate in China." "The cost of such attacks is significant," the report notes.
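For context, the committee's 60 percent figure appears to follow from simple extrapolation: doubling the first-half count and comparing it with the 2008 total. The short Python sketch below works through that arithmetic; the straight-line assumption that the second half of 2009 matches the first half is mine, used only to illustrate how such a projection is reached, not the commission's stated methodology.

```python
# Back-of-the-envelope check of the "60 percent" projection.
# The incident counts come from the article; the straight-line
# extrapolation (second half of 2009 matches the first half) is an
# illustrative assumption, not the commission's stated method.

incidents_h1_2009 = 43_785   # malicious incidents, first half of 2009
incidents_2008 = 54_640      # malicious incidents, all of 2008

projected_2009 = incidents_h1_2009 * 2          # assume the same pace in H2
increase = (projected_2009 / incidents_2008) - 1

print(f"Projected 2009 incidents: {projected_2009:,}")   # 87,570
print(f"Increase over 2008: {increase:.0%}")             # ~60%
```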

Citing data from the Joint Task Force-Global Network Operations, the report says that the military spent $100 million to fend off these attacks between September 2008 and March 2009. Attacks on department systems have been rising steadily for years. In 2000, for example, only 1,415 incidents were reported. The increase is in part due to the fact that the U.S. military is simply better at identifying cyberthreats than it used to be, said Chris Poulin, the chief security officer of Q1 Labs and formerly a manager of intelligence networks within the U.S. Air Force. The department figures are "probably more accurate now" than they were nine years ago, he said. A Defense Department spokesman did not have any immediate comment on the report's numbers Thursday. Security experts have long known that many computer attacks originate from Chinese IP (Internet Protocol) addresses, but due to the decentralized nature of the Internet, it is very difficult to tell when an attack is actually generated in China, instead of simply using Chinese servers as a steppingstone.

Q1's Poulin says that his company's corporate clients in the U.S. are seeing attacks that come from China, North Korea, and the Middle East. "We do definitely see patterns coming from specific nation states." He said that because China's government has taken steps to control Internet usage in the country, it could probably throttle attacks if it wanted to. "China's definitely initiating attacks," he said. "State-sponsored? Who knows. But they're certainly not state-choked."

HP's history of billion-dollar technology buys

HP's news that it would lay down $2.7 billion to acquire network switch maker 3Com not only causes industry watchers to look ahead at what could come of such a deal, but also reminds many of the IT vendor's long history of billion-dollar acquisitions.

3Com: HP announced on Nov. 11 it would pay big bucks to add 3Com's Ethernet network switches, routers and security products to its ProCurve business. The deal also strengthens HP's converged data center product portfolio vs. that of Cisco and its partners, and it means HP will be able to run its next-generation data centers on 3Com networking equipment. "It gives HP a core switch - a brand-new core switch," said Steve Schuchart of Current Analysis of 3Com's H3C 12500, which the company is pitting against Cisco's Nexus 7000. "It gives them a real platform to move forward with," Schuchart said in an interview with Network World Senior Editor Jim Duffy, adding that the HP ProCurve 8212 and 5400 series switches didn't really cut the mustard for core applications. "This is newer, bigger and a much more purpose-built switch."

EDS: About 18 months ago, in spring 2008, HP announced it would invest $13.9 billion to dramatically expand its global IT services business via the acquisition of EDS. Aiming squarely at IBM, HP's EDS buy pushed the IT vendor quickly up the list of services providers to land behind IBM as the second-largest outsourcing company worldwide.

At the time, industry watchers speculated that HP not only wanted to enhance its services business but also potentially sell more data center equipment via outsourcing deals. "IT services are a big and strategic part of the marketplace and they influence technology purchases downstream," said Ben Pring, research vice president at Gartner, at the time of the deal. He explained that if IBM Global Technology Services is working with a client at the services level, there is more of a chance the customer will buy IBM technology. Likewise, if HP can get its foot in the door with more services customers, hardware and software sales could follow. "If HP had a bigger professional services umbrella and footprint, they would get greater access to a very strategic marketplace," Pring said.

Opsware: In 2007, HP paid what some industry watchers said was too much money for data center automation darling - and Marc Andreessen offspring - Opsware. HP's net gain included automation technology that could be applied to configuring and provisioning physical and virtual components across network, system, storage and application components in a data center.

The acquisition was one of the first significant moves by one of the four market-leading management software makers to incorporate broad automation technologies across their product portfolios. "The next big step for the big four management vendors [BMC, CA, HP and IBM] is a move into automation in the areas of active configuration management and dynamic resource allocation. It will be a big disruptive play and a defining technology when they move into automation technologies," said Will Cappelli, a research vice president at Gartner, in an interview with Network World at the time of the deal. "It will be more of a challenge for BMC and CA than for HP and IBM because the latter have server and storage technologies from which they can incrementally grow. BMC and CA will have to almost spring into the market with a fully shaped technology through acquisition." In fact, HP spending $1.6 billion for the automation software company had the indirect effect of upping the price for Opsware competitor BladeLogic, which BMC later acquired for $800 million.

Mercury Interactive: One of HP's initial moves to broaden its niche network management software, known as OpenView at the time, into a larger IT management software suite involved paying $4.5 billion to buy application management vendor Mercury Interactive. The technology Mercury offered addressed applications from development to quality testing to performance on production networks and would boost HP's management play beyond its OpenView Network Node Manager and Operations products. HP had been on a buying binge of sorts, snapping up smaller management software makers such as Peregrine Systems, Novadigm and Consera Software, but those vendors didn't promise the revenue increase that Mercury could offer, analysts said at the time. "None of those deals have been large enough to significantly impact HP's software revenue.

The Mercury acquisition really bumps up HP's software business to where a significant portion of their revenue will now come from software," said Rich Ptak, co-founder and principal analyst at Ptak, Noel & Associates, at the time the acquisition was made public. In 2005, HP reported net revenue of $1 billion from its software business. The Mercury buy was expected to increase that to more than $2 billion annually, according to HP. By 2008, HP's software revenue had reached more than $3 billion.

Compaq: HP's bid to acquire Compaq in 2002 garnered much industry speculation and concern from customers, but ultimately the two companies came together with their separate computer, printer and server businesses for about $25 billion. With regulatory approval concerns, product support worries and what was tagged a "sour PC market" at the time, HP received much negative press surrounding its bid. At the time, Gartner suggested HP faced many challenges in terms of the respective companies' businesses and how they might be spun out or eliminated to ensure success going forward. "Both HP and Compaq depend on tactical partnerships with outside vendors to meet their customers' software infrastructure requirements," Gartner concluded.

Gartner also said at the time that HP and Compaq didn't have a strong track record in the software infrastructure arena, contrary to CEO Carly Fiorina's assertion that the new HP would set the standard for innovation, and that HP would have to spin off its software businesses to take any kind of lead in the software arena.

VeriFone: In 1997, HP paid about $1.2 billion to acquire e-commerce and smart-card technology maker VeriFone to help customers in the financial services and other industries advance Internet-based business. The deal soon soured for HP, which less than five years later reported losing $48 million in its VeriFone software business. In 2001, HP sold its VeriFone assets to Gores Technology Group.

Pantone releases iPhone App

If you're a designer whose inspiration strikes while you're on the go, Pantone has a new iPhone app for you: myPantone. The app gives graphic, multimedia, fashion, interior, and industrial designers the tools to capture, create, and share Pantone color palettes while they're riding the bus to work, waiting on line at the supermarket checkout, or anywhere they happen to be. "MyPantone gives designers the freedom to access Pantone colors anywhere, without the need to be in their office or carry around cumbersome guides," said Andy Hatkoff, vice president of technology licensing for Pantone. "Now with myPantone's Portable Color Memory in their pocket, designers no longer need to agonize trying to recall an exact color." MyPantone gives designers access to all the Pantone color libraries, including the Pantone Matching System for coated, uncoated, and matte stock; the Pantone Goe System for coated and uncoated stock; Pantone Pastels for coated and uncoated stock; and the Pantone Fashion + Home Smart Color system. The app provides the sRGB, HTML, and LAB values on each color swatch, and its cross-referencing system lets users identify colors across color libraries. In addition, myPantone facilitates creation of harmonious color palettes by finding complementary, analogous, and triadic combinations for selected colors.
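Color-harmony tools of this kind are typically built on simple hue geometry. The Python sketch below shows one common way to derive complementary, analogous, and triadic companions for a starting color by rotating its hue around the color wheel; it is a generic illustration of the idea, not Pantone's own matching logic, and the teal starting color is an arbitrary example.

```python
import colorsys

def harmonies(rgb):
    """Return complementary, analogous, and triadic companions for an
    sRGB color (r, g, b in 0-255) by rotating its hue. This is a generic
    illustration of color-wheel harmony, not myPantone's actual algorithm."""
    r, g, b = (c / 255.0 for c in rgb)
    h, l, s = colorsys.rgb_to_hls(r, g, b)

    def rotate(degrees):
        nh = (h + degrees / 360.0) % 1.0
        return tuple(round(c * 255) for c in colorsys.hls_to_rgb(nh, l, s))

    return {
        "complementary": [rotate(180)],           # opposite side of the wheel
        "analogous": [rotate(-30), rotate(30)],   # near neighbors on the wheel
        "triadic": [rotate(120), rotate(240)],    # evenly spaced thirds
    }

# Example: companions for an arbitrary teal swatch.
print(harmonies((0, 128, 128)))
```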

Once you create a color palette, you can view it or share it with others. The app can also extract colors from any image stored in your iPhone's camera roll, or let you choose individual colors from an iPhone photo and match them to specific Pantone colors. For viewing color chips, you can use Pantone's slate of built-in backgrounds or one of your own photos as a background. You can attach text notes or voice annotations as well. Sharing options include sending color palettes via e-mail, sending palettes to other iPhone users, and sharing via Facebook or Twitter.
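Matching a sampled photo pixel to the closest entry in a color library is, at heart, a nearest-neighbor lookup. The Python sketch below illustrates the idea using straight-line distance in RGB over a tiny made-up library; the swatch names and values are hypothetical placeholders rather than real Pantone data, and a production tool would more likely measure distance in a perceptual space such as CIELAB.

```python
import math

# Hypothetical mini "library": swatch name -> sRGB value. These entries are
# placeholders for illustration only, not real Pantone library data.
LIBRARY = {
    "Swatch A": (237, 28, 36),
    "Swatch B": (0, 174, 239),
    "Swatch C": (255, 222, 23),
    "Swatch D": (46, 49, 146),
}

def closest_swatch(rgb):
    """Return the library swatch nearest to the sampled pixel, using
    Euclidean distance in RGB space."""
    name, value = min(LIBRARY.items(), key=lambda item: math.dist(rgb, item[1]))
    return name, value

# Example: a pixel sampled from a photo.
print(closest_swatch((250, 40, 50)))   # -> ('Swatch A', (237, 28, 36))
```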

You can e-mail palettes as color patches, or as application swatch files for use in Adobe Creative Suite, CorelDraw, and QuarkXPress. Designers can also share their color palettes with other designers by sending them to Pantone's hosted Web site. MyPantone is available for $10 at the iPhone App Store. It is compatible with iPhone OS 3.0 or higher and can also be used with the iPod Touch.

SANS official talks security

This is the second of two parts of an interview of Stephen Northcutt by technologist David Greer. Everything that follows is by Messrs. Greer and Northcutt with minor edits. (See part 1.) * * * DG: It seems like many of the current security issues are problems that we have been dealing with for decades. How do you see the evolution of the problem space of information security?

SN: Twelve years ago, we were standing up a cyber capability for the United States. All the things we are saying today about the stuff we are doing with our cyber capability, I heard 12 years ago. We do make progress; for instance, we now have the Cyber Guardian program and have already graduated the first class. But the attack surface just continues to get larger and larger. There are a lot more vulnerability points because we are increasingly connected, and more code is exposed to potential attacks. So we're dealing with more lines and more kinds of code.

We are not dealing with that many fundamental problems. The specifics are changing, but the classes of the problems haven't changed very much. There is an ever-greater need for security people who can integrate with the business. We've done survey after survey after survey. I was just trying to explain to someone that the No. 1 thing a manager wants out of a security person is communication skills. Our challenge is to develop people's communication skills.

You can't do business without communication. I would also say that my personal observation is that people often think complexity is its own reward. This is true on the security level, the technology level and the organization-process level. If we don't put a tremendous amount of attention on it and simplify, simplify, simplify, we end up with things we cannot manage. DG: How do you see evaluating and managing risk in the security environment today? SN: A couple of years back I spent some time with the trade organization that represents the 100 largest banks in the U.S. We were trying to do some work around information security risk.

More than once I heard the finance guys say, "You information security folks have no idea what you're doing in terms of risk management. You are using qualitative methods when you need quantitative. In finance we know, for any set of financial transactions, within a few dollars what our risk is." The finance folks have an advanced terminology and methodology. One of those quants was in the risk management department at Bear Stearns, which is gone now. We need to make sure in information security we are never arrogant and that we make every effort to present risk to senior management in such a way that they can govern wisely.

I am sure senior management were briefed on the risks, but because house prices and stock prices kept going up, they thought this incredible risk of bubble deflation was an acceptable risk, and they found out they were wrong. I think there are three parts to that. 1. Start using metrics to measure and quantify risk. There are several books, such as Andrew Jaquith's "Security Metrics: Replacing Fear, Uncertainty, and Doubt" and W. Krag Brotby's "Information Security Management Metrics: A Definitive Guide to Effective Security Monitoring and Measurement," and there are tools such as security information and event management (SIEM) and vulnerability management products that are internally consistent and provide a quantitative score. 2. We need to describe risk in terms of the business objectives. Instead of just saying "We might get hacked," we should explain the financial cost of a data breach or the destruction or manipulation of our data. I know that is a strength of the MSIA program at Norwich. 3. Finally, we need to present the information well and at the management level. I think every security person needs to read "The Exceptional Presenter: A Proven Formula to Open Up and Own the Room" by Timothy J. Koegel and "The Cognitive Style of PowerPoint: Pitching Out Corrupts Within" by Edward R. Tufte once every 18 months or so and struggle to apply that information to our lives.
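One standard way to turn "we might get hacked" into the kind of dollar figure Northcutt is describing is an annualized loss expectancy (ALE) calculation. The Python sketch below is a minimal illustration of that textbook formula; the breach scenario and every number in it are hypothetical and are not taken from the interview.

```python
# Minimal sketch of a common quantitative risk metric: annualized loss
# expectancy (ALE). All figures below are hypothetical, chosen only to
# illustrate the kind of calculation being argued for.

asset_value = 4_000_000          # value of the customer database, in dollars
exposure_factor = 0.30           # fraction of value lost in a single breach
annual_rate_of_occurrence = 0.5  # expected breaches per year (one every two years)

single_loss_expectancy = asset_value * exposure_factor
annualized_loss_expectancy = single_loss_expectancy * annual_rate_of_occurrence

print(f"Single loss expectancy:     ${single_loss_expectancy:,.0f}")      # $1,200,000
print(f"Annualized loss expectancy: ${annualized_loss_expectancy:,.0f}")  # $600,000
```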

DG: As we move toward cloud computing, do you see these risks increasing?

MS won't punish users for switching to hosted software

Microsoft's licensing of on-premises software vs. its online counterparts won't penalize users for buying on-premises licenses and then switching to hosted software, according to CEO Steve Ballmer. Ballmer, in a meeting with Network World at the annual SharePoint Conference, said moving between enterprise applications like SharePoint and Exchange deployed internally and versions of that software operated in the cloud by Microsoft will be "seamless." "Customers are saying give me some credit here, this is more like an upgrade than it is like a new buy, give us a little credit," he said. Ballmer also said the Sidekick episode was "not good," but that Microsoft is ensuring its online services won't make the same error. Users have been questioning whether they can move licenses online without having to take a credit and renegotiate with Microsoft on licensing terms. "I know it will take them time to get it straight; it is really complicated," said Guy Creese, an analyst with the Burton Group. "They claim software plus services as a mantra and if that is true they need to make it so these two environments [cloud and on-premises] are seamless [from a licensing perspective]." Ballmer said users need to break it down by separating Internet and intranet deployments from cloud and on-premises. "Internet stuff we do is all priced basically per application or per server and it will be priced that way whether it is offered in the cloud, as a service or on-premises," he said. "I think that is pretty clean and I think that is the way that people would like to see things licensed." He said intranet applications are essentially priced by the number of users, and that is true whether they run in the cloud or on-premises. "So one is user-based and one is application-based." But Ballmer said Microsoft will be flexible in the way the company prices cloud versus on-premises.

For example, if a user has a client access license for SharePoint running internally but decides he wants Microsoft to run SharePoint in the cloud, the customer only pays to have Microsoft operate the SharePoint service. "You don't need to convert [the license], you can use your on-premise license and just buy the service capability; that you can do. If you want to transition you can do that too, but most of our customers say just let me use the license that I already bought and have you operate this thing for me." He said users that want to come to the cloud can buy the service and use the license they own, or they can start in the cloud and buy an integrated license that pays for both the service Microsoft operates and the license. "We designed it to be seamless; in a sense it looks more complicated now because you have two choices." "We have a big enough install base of people that bought licenses that say, 'Hey, when we buy your service we don't want to be re-buying what we have already paid you for in terms of software.' We have to recognize that our customers expect a transition step where we give them credit for the software that they already own," he said.
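To make the two paths Ballmer describes concrete, here is a small Python sketch comparing them for a hypothetical customer. Every price in it is an invented placeholder, since Microsoft's actual fees are negotiated and were not given in the interview; the point is only that an existing on-premises license is not re-bought when the customer adds the hosted service.

```python
# Hypothetical comparison of the two licensing paths described in the article.
# All prices are invented placeholders (per user, per year); real Microsoft
# pricing is negotiated and is not stated in the article.

users = 500

service_operation_fee = 60   # hypothetical fee to have Microsoft run the service
integrated_license_fee = 90  # hypothetical all-in fee (license plus operated service)

# Path 1: keep the client access licenses already paid for and buy only the
# service capability, so no license cost is paid a second time.
keep_licenses_and_buy_service = users * service_operation_fee

# Path 2: start in the cloud and buy an integrated license covering both the
# software license and the Microsoft-operated service.
buy_integrated_license = users * integrated_license_fee

print(f"Use existing CALs + buy service: ${keep_licenses_and_buy_service:,}/year")
print(f"Integrated cloud license:        ${buy_integrated_license:,}/year")
```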