Monday, March 30, 2009

Updated Recommendations From NIST For Protecting Wireless, Remote Access Data


ScienceDaily (Feb. 25, 2009) — Telecommuting has freed many to work far from the confines of the office via laptop, but the price of working while sipping a latte at that sunny café is the danger that a public network will not keep the data that passes through it safe. Now, to combat the risk inherent in remote access, the National Institute of Standards and Technology (NIST) has updated its guide on maintaining data security while teleworking.


The revised guide offers advice for protecting the wide variety of private and mobile devices from threats that have emerged since the first edition appeared in August 2002. Together with the preponderance of dangerous malware on the Web, the vulnerability of wireless transmissions from mobile devices has created dramatic new security challenges.

“In terms of remote access security, everything has changed in the last few years. Many Web sites plant malware and spyware onto computers, and most networks used for remote access contain threats but aren’t secured against them,” says Karen Scarfone of NIST’s Computer Security Division. “However, even if teleworkers are using unsecured networks, the guide shows the steps organizations can take to protect their data.”

Among these steps is the recommendation that an organization’s remote access servers—the computers that allow outside hosts to gain access to internal data—be located and configured in ways that protect the organization. Another is to ensure that all mobile and home-based devices used for telework be configured with security measures so that exchanged data will maintain its confidentiality and integrity. Above all, Scarfone says, an organization’s policy should be to expect trouble and plan for it.
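
In practice, the confidentiality and integrity requirement means tunneling telework traffic over an encrypted, authenticated channel such as TLS or a VPN. As a minimal sketch (not taken from the NIST guide, and with a hypothetical server name), here is how a Python client can open a certificate-verified TLS connection using only the standard library:

    import socket
    import ssl

    HOST = "remote-access.example.org"  # hypothetical remote access server

    # create_default_context() turns on certificate validation and hostname
    # checking against the system's trusted CAs by default.
    context = ssl.create_default_context()

    with socket.create_connection((HOST, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            print("negotiated:", tls.version())  # e.g. 'TLSv1.3'
            # Anything sent through 'tls' is now encrypted and tamper-evident.

A remote access server configured along the guide's lines would sit behind such a channel, so that even traffic crossing an unsecured café network keeps its confidentiality and integrity.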

“You should assume external environments contain hostile threats,” she says. “This is a real philosophy shift from several years ago, when the attitude was essentially that you could trust the home networks and public networks used for telework.”

The new guide provides recommendations for organizations. A companion publication* offers advice for individual users on securing their own mobile devices.

While intended primarily for U.S. federal government agencies, the guide has been written in broad language so that it can be helpful to any group that engages in telework. It is formally titled Special Publication 800-46 Revision 1, Guide to Enterprise Telework and Remote Access Security.

* SP 800-114, User’s Guide to Securing External Devices for Telework and Remote Access.


Adapted from materials provided by National Institute of Standards and Technology.

Does Humor On The Internet Mold Political Thinking?


ScienceDaily (Mar. 11, 2009) — Jokes are not merely a source of popular enjoyment and creativity; they also provide insights into how societies work and what people think. Humor is so powerful it can help shape geopolitical views worldwide, according to Professor Darren Purcell and his team from the University of Oklahoma in the US.


Their study of humor, including an analysis of two Achmed the Dead Terrorist skits, has recently been published online in Springer's GeoJournal.

Humor is a powerful communications tool with potential political implications at various levels of society, as the recent Danish political cartoon representations of the Prophet Mohammad and the political repercussions and resulting economic boycotts demonstrated. Purcell and colleagues' paper looks at humor as an important form of popular culture in the creation of geopolitical worldviews.

The authors use 'disposition theory' (a framework that allows them to understand who will regard which content as funny, and how derisive humor can be seen as amusing) to examine particular types of humor in texts which reflect society's concerns, developments and relationships, and by extension, the geopolitical implications of these texts. With an emphasis on social context, the theory suggests that the appreciation of humor is dependent, in part, on whether one holds a positive or negative attitude, or disposition, toward the object of humor.

Purcell and colleagues analyze two stand-up comedy routines performed by American ventriloquist Jeff Dunham. The skits center on the character of Achmed the Dead Terrorist, an unsuccessful suicide bomber. The humor plays on anti-Arab/Muslim sentiment. Dunham uses his audiences' disposition towards terrorists to get laughs, while at the same time challenging his audience members to look at their own views of terrorism, Islam, and American efforts in Iraq.

Purcell and colleagues show that disposition theory is useful to help place humor as a fluid, global phenomenon shared through various social networks via the Internet. Thanks to new communication technologies including YouTube.com, audiences around the world are engaged and can participate. The technology takes participants seriously by providing a point of entry where they can put forward their views of the world. This amplifies the potential impact of any geopolitical text.

They conclude that "the diffusion of humor with geopolitical content to a global viewing audience, via personal networks spanning multiple scales, forces us to consider the role of individuals (via forwarding and dissemination) as producers and reproducers of geopolitical codes and active participants in constructing enemies and threats, even in the guise of a two-foot tall puppet."

Internet Can Warn Of Ecological Changes


ScienceDaily (Mar. 23, 2009) — The Internet could be used as an early warning system for potential ecological disasters, according to researchers from Stockholm Resilience Centre at Stockholm University and the University of East Anglia.


Ecosystem services such as water purification and food production are of fundamental importance for all planetary life. However, these are threatened by sudden changes in ecosystems caused by various pressures like climate change and global markets. Collapsing fisheries and the irreversible degradation of freshwater ecosystems and coral reefs are examples that have already been observed. Averting such ecosystem changes is of vital importance.

Despite improved ecosystem monitoring, early warnings of ecological crisis are still limited by insufficient data and gaps in official monitoring systems. In an article for the journal Frontiers in Ecology and the Environment, centre researchers Victor Galaz, Beatrice Crona, Örjan Bodin, Magnus Nyström and Per Olsson, and Tim Daw from the School of International Development at UEA, explore the possibilities of using information posted on the Internet to detect ecosystems on the brink of change.

List servers fundamental for coral bleaching monitoring

“Information and communications technology is revolutionizing the generation of and access to information. Systematic ‘data mining’ of such information through the Internet can provide important early warnings about pending losses of ecosystem services,” said lead author Dr Galaz.

In 1997-98, unusually warm seas caused unprecedented levels of mass 'coral bleaching'. Field observations of the global phenomenon were shared instantly through an email network, demonstrating how communication technologies can allow rapid assessment of emerging threats from informal sources. In their article Can Web Crawlers Revolutionize Ecological Monitoring?, published online this week, the authors explore the untapped potential of web crawlers (software programs that browse the World Wide Web in a methodical, automated manner) to mine informal web-based sources such as email lists, blogs and news articles, as well as online reports and databases, for ecosystem monitoring.
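
The paper does not reproduce the crawlers themselves; the sketch below (Python, with made-up source URLs and keywords) is only meant to show the basic pattern of a keyword scan over such informal sources:

    import urllib.request

    # Hypothetical informal sources: mailing-list archives, blogs, news feeds.
    SOURCES = [
        "http://coral-list.example.org/archive/latest.html",
        "http://reef-watch.example.net/feed",
    ]
    KEYWORDS = ["coral bleaching", "mass bleaching", "bleached reef"]

    def scan(url):
        """Fetch one page and return the warning keywords it mentions."""
        with urllib.request.urlopen(url, timeout=10) as resp:
            text = resp.read().decode("utf-8", errors="replace").lower()
        return [kw for kw in KEYWORDS if kw in text]

    for url in SOURCES:
        try:
            hits = scan(url)
        except OSError as exc:
            print(url, "-> fetch failed:", exc)
            continue
        if hits:
            print(url, "-> possible early-warning signal:", hits)

A production system would crawl link graphs, de-duplicate reports and weigh source credibility, but the core loop (fetch, scan, flag) is the same.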

“If we look at coral reefs, for example, the Internet may contain information that describes not only changes in the ecosystem, but also in drivers of change, such as global seafood markets,” Dr Daw explained.

Why use web crawlers?

The article highlights the fact that analysis and response are not necessarily organized around a single government actor, but might take place as the result of collaborations between different state and non-state stakeholders.

“If the outputs are available more widely, analysis and responses could even be the result of autonomous actions, assumed by independent organizations and individuals,” said Dr Galaz.

The authors focus on three potential approaches in using web crawlers to forewarn ecological shifts.

Firstly, web crawlers can collect information on the drivers of ecosystem change, rather than the resultant ecological response. For example, if rapidly emerging markets for high value species lead to overexploitation and collapse of a fishery, web crawlers can be designed to collect information on rapid changes in prices of key species, landings or investments in particular regions.

Secondly, and less certainly, future early warning systems can make use of the recent insight that ecosystems sometimes ‘signal’ a pending collapse. The variability of fish populations, for example, has been shown to increase in response to over-exploitation (a sketch of this kind of signal appears below).

Thirdly, web crawlers may find information that describes ecological changes at small scales, which may warn of similar shifts in other locations. This includes early warnings of invasive species, as well as reduced resilience of ecosystems at larger scales due to the small-scale loss of interconnected populations or ecosystems.
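
The second approach is the easiest to make concrete. In the Python sketch below, a rolling variance is computed over a fabricated annual catch index; a steady climb in that variance is the kind of statistical 'signal' of a pending collapse the authors describe:

    import statistics

    # Fabricated annual catch index for one fish stock; the swings widen over
    # time, mimicking the increased variability seen under over-exploitation.
    catch = [52, 50, 53, 51, 49, 52, 47, 55, 43, 58, 39, 61, 34, 63, 28]

    WINDOW = 5  # years per rolling window
    for end in range(WINDOW, len(catch) + 1):
        window = catch[end - WINDOW:end]
        var = statistics.variance(window)
        print(f"years {end - WINDOW + 1}-{end}: variance {var:6.1f}")
    # A sustained rise in the printed variance would be flagged for analysts.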

Further development needed

Although a promising start, Galaz and his co-authors stress the need for further research into the use of eco-monitoring web crawlers.

“We recognize that crucial challenges need to be addressed before a web crawler-based early warning system can contribute to the avoidance of abrupt ecosystem change,” added Dr Crona.

However, the authors conclude that existing successes in early detection of epidemic outbreaks with similar tools prove the untapped potential and importance of making smarter uses of information posted on the World Wide Web.

Sending Out Internet Warnings For Outages, Viruses


ScienceDaily (Mar. 17, 2009) — A long-overdue internet early warning system for Europe could help the region avoid deliberate or inadvertent outages, reduce the spread of new computer viruses, and ensure continuity of services.


Malte Hesse and Norbert Pohlmann of the Institute for Internet Security at the University of Applied Sciences Gelsenkirchen, Germany, point out that there is a growing need to improve the stability and trustworthiness of the internet, whether one is referring to web access, email, instant messaging or file transfer systems.

They add that raising awareness of critical processes and components on the internet among those responsible for their operation is essential. Moreover, there is a need to learn about internet use so that needs and service demands can be best catered for.

The internet is an incredibly diffuse network with no single, centralised control hub. Its complexity is not bounded by geographical, political, administrative or cultural borders, which means it presents an amazing challenge to the global society hoping to make best use of it and avoid criminal and terrorist activity that might occur online.

The internet's strength lies in this decentralised structure, but that also represents a problem in that it is not governed and consists of almost 30,000 autonomous systems each managed by individual organisations mostly within the private sector. The researchers obtained this figure using their AiconViewer tool developed by colleague Stefan Dierichs in 2006. Unfortunately, private organisations are exposed to a high level of competition, especially in times of recession, and this precludes the open exchange of important management information.

Nevertheless, if a European early warning system is to be built there is a need for a shift in attitude. "The cooperation of companies, organisations and governments is important to create a global view of the internet. By that we will be able to detect attacks in time and answer interesting research questions about the internet," the researchers say.

Early warning systems are already a crucial component of effective risk management in enterprises and of national homeland security systems. In order to create a European early warning system, funding would have to come mainly from public sources, combined with income generated through added value for the private partners, the researchers conclude.


Journal reference:

  1. Malte Hesse and Norbert Pohlmann. European internet early warning system. International Journal of Electronic Security and Digital Forensics, 2009, 2, 1-17

Safer, Better, Faster: Addressing Cryptography’s Big Challenges


ScienceDaily (Dec. 15, 2008) — Every time you use a credit card, access your bank account online or send secure email, cryptography comes into play. But as computers become more powerful, network speeds increase and data storage grows, the current methods of protecting information are being challenged.


Once shrouded in secrecy, cryptography (using mathematical algorithms to secure, hide and authenticate data) has come out into the light in the current digital era. No longer restricted – in Western countries at least – by tight usage and export controls, cryptographers are now collaborating more extensively than ever before to create better algorithms and faster encryption methods to protect the vast volumes of data being generated by governments, businesses and citizens.

In many ways, European researchers are leading the way in addressing the big challenges facing the future of information and data security. “There are three big issues facing cryptographers,” says Bart Preneel, a professor at Katholieke Universiteit Leuven in Belgium and president of the International Association for Cryptologic Research. “Cost, speed and long-term security.”

The first two problems are closely interconnected, a consequence of the trend towards storing more information in more distributed systems, from the flash drives and smart cards in your pocket, to the computer in your home or the network at your office. Cost, in this sense, refers not only to the cost of hardware capable of robust encryption, but also the energy cost of running cryptographic processes on increasingly tiny, low-power devices. Cryptographic programmes also need to be faster if they are to secure the vast amount of information now being stored.

The 10 terabyte question

“In a few years we will have devices in our pockets with 10 terabytes of storage capacity – current methods are far too slow to encrypt that amount of data practically,” Preneel notes. (At a software encryption rate of roughly 100 megabytes per second, plausible for the hardware of the day, encrypting 10 terabytes would take well over a day.)

Time is also a problem in another sense. A lot of data being generated today will need to be kept secure for decades or even centuries to come, but history has shown that gains in computer processing power make it easier to crack cryptographic codes. Algorithms developed in the 1970s, for example, can now be readily broken by researchers: DES, standardized in 1977 with a 56-bit key, fell to a purpose-built brute-force machine as early as 1998.

“We may want to store medical information securely for a long time, not just for the duration of someone’s life, but in the case of DNA data for the lifetime of their children and grandchildren as well,” Preneel says.

Those challenges and others were addressed by an international network of researchers led by Preneel. With funding from the EU, the ECRYPT network of excellence brought together 32 leading research institutes, universities and companies to produce some of the most valuable contemporary research on cryptography, generating 10 percent of all papers and research articles in the information security field published worldwide over the last four years.

Structured into five core research areas, dubbed “virtual laboratories,” the researchers developed improved cryptographic algorithms, ciphers and hash functions, studied cryptographic protocols and implementation methods, and worked on more robust algorithms for digital watermarking.

Among their main achievements are eight new algorithms with the capacity to outperform AES, the Advanced Encryption Standard developed by two Belgian researchers in the 1990s and subsequently adopted by the US government to protect classified information. They also developed a new and improved method for creating cryptographic protocols based on game theory, and created lightweight cryptographic algorithms for use in low-power, low-computing-capacity devices such as smart cards and Radio Frequency Identification (RFID) tags.
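
The eight new algorithms are not detailed in this article, but the incumbent they are measured against is easy to demonstrate. This Python sketch (an illustration, not ECRYPT code) uses the third-party cryptography package to encrypt and authenticate a message with AES in GCM mode:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=128)  # 128-bit AES key
    nonce = os.urandom(12)                     # GCM's standard 96-bit nonce
    aad = b"header-v1"                         # authenticated but not encrypted

    aes = AESGCM(key)
    ciphertext = aes.encrypt(nonce, b"attack at dawn", aad)
    plaintext = aes.decrypt(nonce, ciphertext, aad)  # raises if tampered with
    assert plaintext == b"attack at dawn"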

Three competitions of the kind that sparked innovation in digital cryptography in the 1970s and 80s were also organised to find winning applications in the fields of stream ciphers, cryptographic software benchmarking and digital watermarking.

Towards real-world applications

The researchers’ work will all but certainly feed into commercial cryptographic applications over the coming years. A block cipher, for example, is due to be used on commercial RFID technology, while another application has been developed by Danish project partner Aarhus University for secure auctions in the agricultural sector.

Many of the researchers are continuing their work in a second project, ECRYPT II, which began in August 2008. Whereas ECRYPT received funding under the EU’s Sixth Framework Programme for Research (FP6, 2002-2006), the follow-up initiative is being funded under FP7 (2007-2013). The new project will deepen research in core areas that were addressed more broadly by the first initiative.

“We know that our studies have been read by banks, businesses and governments around the world, but because we made the information publicly available we don’t know how they are using it,” Preneel says.

Cryptography has not, therefore, shed its veil of secrecy entirely.

This is part one of a two-part series on ECRYPT.

Apple slates dates for WWDC, touts Snow Leopard, iPhone 3.0


Apple announced Thursday that its Worldwide Developers Conference will begin June 8, a date that reports had previously pegged as the most likely launch window for the company's next operating system, dubbed Snow Leopard.

WWDC, which will again be held in San Francisco at the Moscone Center, will run from June 8-12, according to an invitation Apple sent out today and the conference's Web site, which went live Thursday morning.


Those dates were predicted earlier this month by David Zeller, a writer for The Baltimore Sun. Zeller based his guess on an opening marked only as a generic "corporate meeting" on the Moscone Center's online calendar. He also speculated that Apple would hold its keynote on June 8 and use that time to unveil Snow Leopard, a.k.a. Mac OS X 10.6, a performance and stability upgrade that Apple promised last June would be available in about a year.

Apple has used the WWDC keynote as a platform to show off major news and new products. Last year, for example, CEO Steve Jobs revealed the new iPhone 3G, which Apple started selling in July.

It's not known whether Jobs will attend WWDC this year; when he announced in January that he was taking time off for health reasons, Jobs said his leave of absence would run through the end of June.

Although the WWDC Web site still lacks a schedule that details developer session topics, Snow Leopard will obviously play a central role. "WWDC's detailed technical sessions teach you how to take full advantage of new foundation technologies to ensure your application is ready and completely optimized for Mac OS X Snow Leopard," said the page dedicated to the new operating system.

"Turbocharge your application by using the new performance oriented frameworks, Grand Central Dispatch and OpenCL, to efficiently tap the vast gigaflops of computing power in the CPU and GPU," the page continued, referring to some of the already-touted improvements to Mac OS X, including one that will let software "steal" computing power from the graphics processor.

Apple's also planning to highlight iPhone 3.0, the just-previewed smartphone upgrade, at WWDC. "See how to add new features such as in-app purchases, peer-to-peer connection via Bluetooth, communication with hardware accessories, and alerts using Apple Push Notification service," said the conference site.

WWDC admission is priced at $1,295 through April 24, at which point it will climb to $1,595.


Intel boosts speed of low-power processors


Intel is expected to refresh its line of laptop chips Monday with new ultra-low-voltage processors that should make ultraportable laptops operate faster without sacrificing battery life.

Intel currently offers ultra-low-voltage processors for fully functional thin and light laptops, such as Apple's expensive MacBook Air and Lenovo's ThinkPad X300. These chips fit into small spaces and draw less power than conventional laptop chips. The chips are about the size of a dime, or 60 percent smaller than mainstream laptop chips.


Laptops with the new chips should run applications faster while drawing the same power as earlier models. The chips will run at clock speeds of up to 1.60GHz, a bump from earlier chips that topped out at 1.40GHz. Depending on the applications, existing ultraportable laptops with ULV chips run for anywhere from four to seven hours.

"There should be no difference to battery life with these speed bumps," said a source familiar with Intel's plans.

The chips will be available on Monday, according to the source. They will be part of Intel's Montevina mobile platform, which pairs laptop processors with wireless capabilities.

The refresh comes ahead of Intel's planned release of new ultra-low-voltage processors for inexpensive laptops, which are due in the second quarter as part of the updated Montevina Plus platform.

The new chips include the dual-core Intel Core 2 Duo SU9600 chip, which will run at 1.60GHz, include 3MB of cache and draw 10 watts of power, according to an Intel document seen by IDG News Service. It will be priced at $289 for 1,000 units. The single-core Intel Core 2 Solo SU3500 chip will run at 1.40GHz, with 3MB of cache. It will be priced at $262.

Intel is also releasing faster low-voltage (LV) processors as part of the laptop chip refresh. The Intel Core 2 Duo SP9600 dual-core chip will run at 2.53GHz, an improvement from previous LV chips that ran at up to 2.40GHz speed, according to the document. The chip will include 6MB of cache, draw 25 watts of power and be priced at $316.

Also being launched is the SL9600 Intel Core 2 Duo, which will run at 2.13GHz, include 6MB of cache and draw 17 watts of power. It will be priced at $316, according to the document.

IBM leading 'Open Cloud Manifesto' charge


Let's put the speculation about who's behind the Open Cloud Manifesto to rest right now: InfoWorld has learned that IBM is the driving force. That's according to two cloud vendors who said they signed the document.

"We are part of the manifesto," said Pankaj Malviya, CEO of LongJump, which provides an on-demand platform for business applications. "We worked with IBM."


Big Blue, of course, is not working alone. Malviya did not know all the companies involved but mentioned Cisco and possibly some apps vendors. Amazon and Hewlett-Packard are also rumored to be involved, and the list is certainly longer.

"IBM is who reached out to us, too," said Bob Moul, CEO of on-demand integration provider Boomi. "We read it and it's kind of innocuous, like motherhood and apple pie, hard to argue with." And so Boomi, like LongJump, signed it.

Microsoft, however, took some issue with the document in a middle-of-the-night blog post written by Steve Martin, a Microsoft developer program product manager. Martin wrote that Microsoft was disappointed by the lack of openness in the manifesto's development.

That reaction surprised Reuven Cohen, founder and CTO of Enomaly and one of the manifesto's authors, who fired back that the drafters have been in "active discussions with Microsoft" and that the document "has literally come together in the last couple of weeks."

When asked about the confusion, Boomi's Moul explained, "I don't know about all that. IBM approached us about a week before it was to be released."

LongJump's Malviya, meanwhile, said he did not find the process or resulting document to be final or closed to contribution. "This is a declaration that we want portability, so let's start working toward it."

IBM's PR firm did not immediately respond to a request for comment. But Big Blue on its Web site has an "architectural manifesto" that of course begins with "cloud computing."

The Open Cloud Manifesto is slated to be released on Monday.

Inside HP Labs: Eight cool projects


Innovation is both an exciting, revolutionary event and a mundane step-by-step process. For every remarkable, headline-making discovery -- flash memory! high-def movies! quad-core processors! -- there are more iterative research projects that move technology forward inch by inch.

At Hewlett-Packard Labs, projects such as a new substrate for flexible displays might make headlines one day, but will finally emerge as a shipping product only years later. Another example: When supercomputers finally run at petascale speeds -- a quadrillion operations per second -- researchers go back to the drawing board and figure out how they will run at exascale (a quintillion operations per second).


HP Labs is a bit different from some labs in computing: It has 600 researchers on staff, but only about 50 large-scale projects that each have smaller, related projects. Microsoft, by contrast, is working on several hundred projects in 55 research areas and employs about 800 researchers.

Much of the HP research is directly tied to printing, imaging, and server technology. This year, several ongoing projects reveal what's ahead for the 60-year-old company. All of the projects described below were developed in HP Labs. Some are exposed externally -- meaning they are available publicly -- but they were all birthed from HP Labs.

Flexible displays
Imagine a computer display that is made almost entirely of plastic, can be discarded or rolled up and placed into a satchel, and yet has all the brightness and color properties of the LCD on your desk. HP Labs has already invented the technology to make this happen: self-aligned imprint lithography (SAIL). Although the flexible display as a concept is not new, HP just recently worked with Arizona State University's Flexible Display Center to create a first prototype, with the first full-scale rollout with the U.S. Army planned in the next few years.

"The patterning information is imprinted on the substrate in such a way that perfect alignment is maintained regardless of process-induced distortion," says Carl Taussig, director of the Information Surfaces Lab at HP Labs. This allows for more cost-effective continuous production on a flexible plastic material, in a low-cost, roll-to-roll manufacturing process. "The critical problem for roll-to-roll electronics fabrication is patterning and alignment of micron scale features," Taussig explains. "Imprint lithography is a high-speed, high-resolution process."

Color Thesaurus
HP Labs developed the online Color Thesaurus as a way to choose a color by entering the name of a better-known color and seeing slight variations. There are roughly 600 common color names, such as cyan or lime green, but thousands of actual colors that designers can pick from.

The thesaurus is also a printed book that shows all of the available colors and the name. (In an interesting twist, the color book was printed using another HP Labs research project called MagCloud.com, which allows you to create a magazine or booklet and request a printed version.)
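
HP has not published the thesaurus internals, so the following Python sketch is purely hypothetical: it finds the named colors nearest a query color in RGB space, with a toy palette standing in for the roughly 600 surveyed names:

    import math

    # Toy palette; the real Color Thesaurus draws on about 600 color names.
    PALETTE = {
        "cyan": (0, 255, 255),
        "teal": (0, 128, 128),
        "sky blue": (135, 206, 235),
        "lime green": (50, 205, 50),
        "forest green": (34, 139, 34),
        "navy": (0, 0, 128),
    }

    def variations(name, k=3):
        """Return the k named colors closest in RGB space to the given name."""
        target = PALETTE[name]
        others = (c for c in PALETTE if c != name)
        return sorted(others, key=lambda c: math.dist(target, PALETTE[c]))[:k]

    print(variations("cyan"))  # ['sky blue', 'teal', 'lime green']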

Firefox likely to win race to fix PWN2OWN contest bug


Unless its two biggest rivals take extraordinary steps, Mozilla will be the first browser maker to patch a critical vulnerability used a week ago to win $5,000 in a hacking contest.

At the PWN2OWN competition last Thursday, a computer science student from Germany who would only give his first name as Nils cracked a Sony laptop running Windows 7 by exploiting a previously unknown bug in Microsoft's new Internet Explorer 8 (IE8). Nils quickly followed that hack with two more, of Apple's Safari and Mozilla's Firefox, both running on a MacBook.


For each successful exploit, Nils was paid $5,000 -- a total of $15,000 -- by contest sponsor 3Com's TippingPoint. He also was awarded the Sony Vaio notebook for being the first researcher to hack the machine.

By the rules of PWN2OWN, researchers are not allowed to divulge details of their vulnerabilities and exploits, but instead sign over the rights to both to TippingPoint, which in turn reports the bugs to the appropriate vendor.

Earlier today, Mozilla's director of security engineering, Lucas Adamski, told IDG News reporter Robert McMillan that Firefox would be patched against a critical vulnerability that had been disclosed on the milw0rm.com site yesterday.

Later in the day, Mozilla added that it would also fix the bug Nils revealed. "Both issues have been investigated and fixes have been developed which are now undergoing quality assurance testing," the company announced in a post to its security blog this afternoon. "These fixes will be included in the upcoming Firefox 3.0.8 release, due to be released by April 1."

Mozilla has labeled the 3.0.8 update a "high-priority fire drill security update"; in other words, an emergency patch. Before declaring the fire drill, Mozilla had slated 3.0.8 for a mid-April release.

It's unlikely that either Microsoft or Apple will patch their browsers' bugs before Mozilla. Apple, for example, never generates Safari patches within such a short time span. For that matter, neither does Microsoft.

The IE flaw, however, may already be fixed -- at least in IE8. The PWN2OWN contest featured the release candidate of IE8, which is the version Nils hacked. Several hours before the contest kicked off, Microsoft released the final edition of IE8, which some have speculated included a fix for Nils' bug.

They point to ZDNet security blogger Ryan Naraine's interview with Nils, during which he said he "really appreciated" the work of Mark Dowd and Alex Sotirov, two researchers who announced last summer that they were able to circumvent Windows Vista-specific security measures designed to hamper attacks. Yesterday, Dowd confirmed that IE8's final build addresses his and Sotirov's tactics "completely."

It's not known, however, if versions of IE prior to IE8 are also vulnerable to attack. In his interview, Nils admitted only that he had not been able to trigger the bug in IE7.

If Mozilla follows past practice, it will release Firefox 3.0.8 next Wednesday in the afternoon Pacific time, at which point users will be able to download it directly from the Mozilla site or use the browser's built-in updater to download and install the patched version.

Mozilla last patched Firefox March 4.


Ruby on Rails 2.3 arrives


Ruby on Rails 2.3, the latest version of the popular open source Web development framework, is now available, according to the Rails Web site.

A posting on the site earlier this month announced the release.


"Rails 2.3 is finally done and out the door. This is one of the most substantial upgrades to Rails in a very long time," according to a blog post on the Rails site.

Key features of version 2.3 include: templates, which allow a new skeleton Rails application to be built with a developer's own default stack of configurations; engines, for sharing reusable application pieces; Rack, for accessing middleware; Metal, for writing optimized logic; and nested forms, to make complex forms easier.
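
Rack deserves a note: it gives Rails the same pluggable middleware stack that WSGI gives Python. Since Rack itself is Ruby, here is the idea expressed as a minimal, runnable WSGI sketch in Python; the app and middleware are illustrative, not part of Rails:

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        """Innermost application: answers every request with plain text."""
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"hello from the app\n"]

    class LoggingMiddleware:
        """Wraps any app and logs the request path before delegating to it."""
        def __init__(self, inner):
            self.inner = inner

        def __call__(self, environ, start_response):
            print("request:", environ.get("PATH_INFO", "/"))
            return self.inner(environ, start_response)

    if __name__ == "__main__":
        # Serve a single request on http://localhost:8000/ and exit.
        make_server("", 8000, LoggingMiddleware(app)).handle_request()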

The Rails team in May is expected to offer up an early version of Rails 3, which will serve as a merger of the Rails and Merb frameworks.

Rails was the brainchild of developer David Heinemeier Hansson.

Paul Krill is an editor at large at InfoWorld, specializing in news and features related to application development, Java, and .Net. He can be reached at paul_krill@infoworld.com.

Gartner: Four ways businesses are using Twitter


Businesses can make use of Twitter as a public relations channel, but they need to be aware of security when sharing ideas, Gartner has said.

The social networking site allows users to post short messages, or microblogs. Gartner predicts that by 2011, enterprise microblogging will be a standard feature of 80 percent of social software platforms, even though most are currently consumer orientated.


Many businesses are using Twitter as a public relations and marketing channel, Gartner said. Businesses are "tweeting" about corporate accomplishments, distributing links to press releases, and responding to other Twitterers' comments about their brand.

Gartner said this approach should be used with caution, because uninteresting tweets could hurt the brand image as much as good ones help it.

Other businesses use Twitter in an "indirect way", to enhance and extend employees' personal reputations, thereby enhancing the company's reputation. This approach relies on them saying clever, interesting things that attract followers to read their blogs, Gartner said.

Some firms use Twitter as an internal platform, to communicate about what they are doing, projects they are working on and ideas that occur to them. But Gartner said it "does not recommend" using Twitter in this way, "because there is no guarantee of security".

"Inbound signalling" is a final way firms are using Twitter, as a resource showing information about what customers, competitors and others are saying about a company. "Savvy companies use these signals to get early warnings of problems and collect feedback about product issues and new product ideas," Gartner said.

Jeffrey Mann, research VP at Gartner and author of the 'Four ways in which enterprises are using Twitter' report, said: "In general, Twitter usage by employees should be covered by existing web participation guidelines."

"If organisations have not defined a public Web participation policy, they should do so as quickly as possible."


Report: Skype for iPhone may launch next week


Skype may launch a version of its mobile Voice-over-IP (VoIP) and instant messaging service as early as next week, according to a report from Om Malik of GigaOM. If true, the application could prove popular with cost-conscious iPhone users who'd like to save a few bucks by routing calls over AT&T's data network. Currently, iPhone users must use third-party services like Fring to access Skype.

Skype for iPhone could debut at the CTIA Wireless trade show, which begins April 1 in Las Vegas, Malik speculates. Industry watchers have anticipated an iPhone version for some time, particularly since the popular service already runs on other mobile devices. In addition to Skype for Windows Mobile and Skype Lite for Java phones, there's a version for Google Android phones like the T-Mobile G1.


While Skype's mobile ambitions are good for iPhone users, the VoIP service won't bring the end of standard voice-call service anytime soon. As PC World's Liane Cassavoy reports in a recent review of four mobile VoIP apps -- EQO, Skuku, Skype for Windows Mobile, and Truphone -- mobile VoIP call quality isn't quite there yet, and the cost savings aren't that great.

"Most notably, the call quality remains iffy at best, and in some instances it's absolutely abominable. Also, depending on your calling habits, you may not see any savings at all. Most services still charge a per-minute rate, so you'll save on domestic calls only if you've exceeded your regular voice plan's allotment (in which case you might still be better off upgrading your voice plan)."

Cassavoy does point out, however, that international callers can save big bucks by using a mobile VoIP service.

While Skype for iPhone may not offer immediate benefits for everyone, the service has plenty of potential in the near future. If the next-gen iPhone, which may appear as early as this summer, includes video capture, Skype could turn the iPhone into a portable video conferencing device. That may not happen immediately, of course, but AT&T's upcoming 4G Long-Term Evolution (LTE) broadband network, slated to debut in 2011, could very well have the bandwidth necessary to make two-way, real-time video a popular app, particularly among business users.

Deep computer-spying network touched 103 countries


A 10-month investigation has found that politically motivated hackers have infected computers with malicious software to steal sensitive documents, control Webcams, and completely control infected computers, including some belonging to 'high-value' targets


A 10-month cyberespionage investigation has found that 1,295 computers in 103 countries, many belonging to international institutions, have been spied on, with some circumstantial evidence suggesting China may be to blame.

The 53-page report, released on Sunday, provides some of the most compelling evidence and detail of the efforts of politically motivated hackers, while raising questions about their ties to government-sanctioned cyberspying operations.


It describes a network, which the researchers have dubbed GhostNet, that primarily uses a malicious software program called gh0st RAT (Remote Access Tool) to steal sensitive documents, control Web cams and completely take over infected computers.

"GhostNet represents a network of compromised computers resident in high-value political, economic and media locations spread across numerous countries worldwide," said the report, written by analysts with the Information Warfare Monitor, a research project of the SecDev Group, a think tank, and the Munk Center for International Studies at the University of Toronto. "At the time of writing, these organizations are almost certainly oblivious to the compromised situation in which they find themselves."

The analysts did say, however, that they have no confirmation of whether the information obtained has proved valuable to the hackers, or whether it has been commercially sold or passed on as intelligence.

Although evidence shows that servers in China were collecting some of the sensitive data, the analysts were cautious about linking the spying to the Chinese government, noting that China has a fifth of the world's Internet users, which may include hackers whose goals align with official Chinese political positions.

"Attributing all Chinese malware to deliberate or targeted intelligence gathering operations by the Chinese state is wrong and misleading," the report said.

However, China has made a concerted effort since the 1990s to use cyberspace for military advantage. "The Chinese focus on cyber capabilities as part of its strategy of national asymmetric warfare involves deliberately developing capabilities that circumvent U.S. superiority in command-and-control warfare," it said.

The analysts' research started after they were granted access to computers belonging to Tibet's government in exile, Tibetan nongovernmental organizations and the private office of the Dalai Lama, which was concerned about the leak of confidential information, according to the report.

They found computers infected with malicious software that allowed remote hackers to steal information. The computers became infected after users opened malicious attachments or clicked on links leading to harmful Web sites.

The Web sites or malicious attachments would then try to exploit software vulnerabilities in order to take control of the machine. In one example, a malicious e-mail was sent to a Tibet-affiliated organization with a return address of "campaign@freetibet.org" with an infected Microsoft Word attachment.

Nehalem Mac Pro: The Mac reborn

This isn't merely the ultimate Mac, but an impossibly idealistic concept for a fast, green, silent, rugged, expandable, and affordable top-end workstation, made real

You can't tell from the outside that Apple's new two-socket, eight-core Mac Pro, based on Intel's new Nehalem Xeon CPU, is much changed from the two-socket, quad-core Mac Pro that preceded it. The only giveaway? One front panel FireWire port has been upped from 400Mbps to 800Mbps.

If Apple hewed to PC tradition, that port, and the swapped-in Nehalem guts, would be the headline changes to the platform. Nehalem Mac Pro could get my attention, and the attention of the top echelon of Mac users, with that alone. What completely blows me away is that Nehalem Mac Pro is a reengineering of the entire Mac Pro platform, the 2006 edition of which set a bar for build quality that nothing in its price class can touch.


Apple used Nehalem as an occasion to build the ideally fast and modern Mac, but it didn't stop there. In the new Mac Pro, Apple also created a computing platform that satisfies a combination of criteria that buyers only dream of demanding: Toxin-free, recyclable, quiet, low power, rugged, transportable, field-repairable, upgradable without tools, broadly configurable, internally and externally expandable, and the kicker, affordable.

Nehalem certainly deserves its due. It is thoroughly modernized with on-chip memory controllers, three-level cache, and a point-to-point bus design. The 1066MHz DDR3 RAM is the fastest memory yet made. Based on Apple's numbers, it looks like Nehalem packs 50 to 90 percent more firepower into the Mac Pro chassis compared to prior and current top configurations. The arrival of Intel's world-class architecture couldn't be more timely. Nehalem Mac Pro is a hand-in-glove fit for the full 64-bit Snow Leopard (Mac OS X 10.6) that will put Mac Pro on par with two-processor RISC Unix workstations.

I'm embarking on a full review, with performance testing, of the top-end Nehalem Mac Pro now, but I got an early look at a more basic Mac Pro config expressly so that I could share some of the more remarkable aspects of the platform. Some of the enhancements are new, and some continue along the path set by the original Mac Pro, but in combination, they afford owners a unique level of flexibility and investment protection. And they mark the new Mac Pro as wildly different.

Like the Mac Pro before it, Nehalem Mac Pro is loaded with I/O. The front panel has a headphone jack, along with two USB 2.0 ports and two 800Mbps FireWire ports. There are three more USB 2.0 ports, two more FireWire 800 ports, stereo line in and out, TOSLINK optical digital audio in and out, and two gigabit Ethernet ports around back. One day these 800Mbps FireWire ports will be killer conduits to external storage; in the meantime, cables and adapters for 400Mbps peripherals are available.

Mac Pro has four internal, side-facing 3.5-inch SATA drive bays. Empty bays are filled with aluminum drive trays in which you can mount raw SATA drives. Hard drives and PCI Express 2 expansion cards plug into Mac Pro the same way, by being inserted into backplane sockets. There are no loose hard drive cables in the system or in the drive trays, just SATA plug headers stuck right onto the logic board that mate directly with the drives. Inserting and removing drive trays requires no tools, and is so easy that you may, as I do, treat these as removable storage.

In addition to the SATA hard drive bays, there is a front-facing bay for a second 5.25-inch, half-height optical drive. I have not tested this assumption, but I suppose you could mount another SATA hard drive in there using a standard mounting bracket. That, by the way, is the only expansion operation that might require you to look at a cable, much less handle one. Connections between system logic boards are made via short headers, none of which you need to mess with. Getting rid of all that cable helped Apple get the toxins out of Mac Pro's recipe.


The bit that took my breath away, not only for its elegance but for its implications, is the processor tray. One lightweight tray holding the CPUs and RAM is the most easily removed module in this fully modular system. With this arrangement, it takes Apple no time at all to custom-build a Mac Pro to your specifications. It takes you no time at all to reprovision (i.e., swap trays among machines according to need) or effect repairs on-site without moving machines or pulling cables.

As I see it, the tray also allows Apple to track Intel's tick-tock architecture updates without subjecting the entire system to another redesign, or subjecting Mac Pro buyers to requirements for unique spare parts. Anything that Intel changes, even the size of the socket or the speed of the RAM, should be limited to the processor tray. This is the sort of forward-looking, longevity-focused engineering invested in very high end systems. If my take is right, then the 2009 Nehalem Mac Pro hardware platform, once purchased, is one that should stay stable and upgradable until, say, PCI Express 3 becomes an imperative.

Posted by Tom Yager on March 26, 2009 03:00 AM