Monday, March 30, 2009

Updated Recommendations From NIST For Protecting Wireless, Remote Access Data


ScienceDaily (Feb. 25, 2009) — Telecommuting has freed many to work far from the confines of the office via laptop, but the price of working while sipping a latte at that sunny café is the danger that a public network will not keep the data that passes through it safe. Now, to combat the risk inherent in remote access, the National Institute of Standards and Technology (NIST) has updated its guide on maintaining data security while teleworking.


The revised guide offers advice for protecting the wide variety of private and mobile devices from threats that have emerged since the first edition appeared in August 2002. The prevalence of dangerous malware on the Web, together with the vulnerability of wireless transmissions from mobile devices, has created dramatic new security challenges.

“In terms of remote access security, everything has changed in the last few years. Many Web sites plant malware and spyware onto computers, and most networks used for remote access contain threats but aren’t secured against them,” says Karen Scarfone of NIST’s Computer Security Division. “However, even if teleworkers are using unsecured networks, the guide shows the steps organizations can take to protect their data.”

Among these steps is the recommendation that an organization’s remote access servers—the computers that allow outside hosts to gain access to internal data—be located and configured in ways that protect the organization. Another is to ensure that all mobile and home-based devices used for telework be configured with security measures so that exchanged data will maintain its confidentiality and integrity. Above all, Scarfone says, an organization’s policy should be to expect trouble and plan for it.

“You should assume external environments contain hostile threats,” she says. “This is a real philosophy shift from several years ago, when the attitude was essentially that you could trust the home networks and public networks used for telework.”
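The guide's core requirement — that data exchanged from untrusted networks keep its confidentiality and integrity — is typically met by tunneling all telework traffic over an authenticated, encrypted channel. A minimal sketch of that idea in Python, using the standard library's TLS support (this is an illustration of the general principle, not code from the NIST guide; `open_secure_channel` is a hypothetical helper):

```python
import socket
import ssl

# Default context: loads the system's trusted CAs, requires certificate
# verification, and enables hostname checking.
context = ssl.create_default_context()

def open_secure_channel(host: str, port: int = 443) -> ssl.SSLSocket:
    """Connect only if the server's certificate chain and hostname verify."""
    raw = socket.create_connection((host, port), timeout=10)
    return context.wrap_socket(raw, server_hostname=host)

# With these defaults there is no cleartext fallback: a verification
# failure raises ssl.SSLCertVerificationError instead of connecting.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True
```

The point of the defaults is exactly Scarfone's "assume hostile" posture: the client refuses to exchange data at all unless the remote access server proves its identity first.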

The new guide provides recommendations for organizations. A companion publication* offers advice for individual users on securing their own mobile devices.

While intended primarily for U.S. federal government agencies, the guide has been written in broad language in order to be helpful to any group that engages in telework. It is formally titled Special Publication 800-46 Revision 1, Guide to Enterprise Telework and Remote Access Security.

* SP 800-114, User’s Guide to Securing External Devices for Telework and Remote Access.


Adapted from materials provided by National Institute of Standards and Technology.

Does Humor On The Internet Mold Political Thinking?


ScienceDaily (Mar. 11, 2009) — Jokes are not merely a source of popular enjoyment and creativity; they also provide insights into how societies work and what people think. Humor is so powerful it can help shape geopolitical views worldwide, according to Professor Darren Purcell and his team from the University of Oklahoma in the US.


Their study of humor, including the analysis of two Achmed the Dead Terrorist skits, has recently been published online in Springer's GeoJournal.

Humor is a powerful communications tool with potential political implications at various levels of society, as the Danish political cartoons of the Prophet Mohammad, with their political repercussions and resulting economic boycotts, demonstrated. Purcell and colleagues' paper looks at humor as an important form of popular culture in the creation of geopolitical worldviews.

The authors use 'disposition theory' – a framework that allows them to understand who will regard which content as funny, and how derisive humor can be seen as amusing – to examine particular types of humor in texts which reflect society's concerns, developments and relationships, and by extension, the geopolitical implications of these texts. With an emphasis on social context, the theory suggests that the appreciation of humor is dependent, in part, on whether one holds a positive or negative attitude, or disposition, toward the object of humor.

Purcell and colleagues analyze two stand-up comedy routines performed by American ventriloquist Jeff Dunham. The skits center on the character of Achmed the Dead Terrorist, an unsuccessful suicide bomber. The humor plays on anti-Arab/Muslim sentiment. Dunham uses his audiences' disposition towards terrorists to get laughs, while at the same time challenging his audience members to look at their own views of terrorism, Islam, and American efforts in Iraq.

Purcell and colleagues show that disposition theory is useful to help place humor as a fluid, global phenomenon shared through various social networks via the Internet. Thanks to new communication technologies including YouTube.com, audiences around the world are engaged and can participate. The technology takes participants seriously by providing a point of entry where they can put forward their views of the world. This amplifies the potential impact of any geopolitical text.

They conclude that "the diffusion of humor with geopolitical content to a global viewing audience, via personal networks spanning multiple scales, forces us to consider the role of individuals (via forwarding and dissemination) as producers and reproducers of geopolitical codes and active participants in constructing enemies and threats, even in the guise of a two-foot tall puppet."

Internet Can Warn Of Ecological Changes


ScienceDaily (Mar. 23, 2009) — The Internet could be used as an early warning system for potential ecological disasters, according to researchers from Stockholm Resilience Centre at Stockholm University and the University of East Anglia.


Ecosystem services such as water purification and food production are of fundamental importance for all planetary life. However, these are threatened by sudden changes in ecosystems caused by various pressures like climate change and global markets. Collapsing fisheries and the irreversible degradation of freshwater ecosystems and coral reefs are examples that have already been observed. Averting such ecosystem changes is of vital importance.

Despite improved ecosystem monitoring, early warnings of ecological crisis are still limited by insufficient data and gaps in official monitoring systems. In an article for the journal Frontiers in Ecology and the Environment, centre researchers Victor Galaz, Beatrice Crona, Örjan Bodin, Magnus Nyström and Per Olsson, and Tim Daw from the School of International Development at UEA, explore the possibilities of using information posted on the Internet to detect ecosystems on the brink of change.

List servers fundamental for coral bleaching monitoring

“Information and communications technology is revolutionizing the generation of and access to information. Systematic ‘data mining’ of such information through the Internet can provide important early warnings about pending losses of ecosystem services,” said lead author Dr Galaz.

In 1997-98, unusually warm seas caused unprecedented levels of mass ‘coral bleaching’. Field observations of the global phenomenon were shared instantly through an email network, demonstrating how communication technologies can allow rapid assessment of emerging threats from informal sources. In their article Can Web Crawlers Revolutionize Ecological Monitoring?, published online this week, the authors explore the untapped potential of web crawlers – software programs that browse the World Wide Web in a methodical, automated manner – to mine informal web-based sources such as email lists, blogs and news articles, as well as online reports and databases, for ecosystem monitoring.

“If we look at coral reefs, for example, the Internet may contain information that describes not only changes in the ecosystem, but also in drivers of change, such as global seafood markets,” Dr Daw explained.

Why use web crawlers?

The article highlights the fact that analysis and response are not necessarily organized around a single government actor, but might take place as the result of collaborations between different state and non-state stakeholders.

“If the outputs are available more widely, analysis and responses could even be the result of autonomous actions, assumed by independent organizations and individuals,” said Dr Galaz.

The authors focus on three potential approaches in using web crawlers to forewarn ecological shifts.

Firstly, web crawlers can collect information on the drivers of ecosystem change, rather than the resultant ecological response. For example, if rapidly emerging markets for high value species lead to overexploitation and collapse of a fishery, web crawlers can be designed to collect information on rapid changes in prices of key species, landings or investments in particular regions.

Secondly, and less certainly, future early warning systems could exploit the recent insight that ecosystems sometimes ‘signal’ a pending collapse. The variability of fish populations, for example, has been shown to increase in response to over-exploitation.

Thirdly, web crawlers may find information that describes ecological changes at small scales, which may warn of similar shifts in other locations. This includes early warnings of invasive species, as well as reduced resilience of ecosystems at larger scales due to the small-scale loss of interconnected populations or ecosystems.
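The first two approaches can be sketched together: once a crawler has extracted a time series (say, market prices for a key species), simple detectors can flag both a rapid price change (a driver of change) and rising variance (the "ecosystem signalling collapse" insight). A minimal Python sketch, where the function name, thresholds and data are all illustrative, not values from the article:

```python
from statistics import pvariance

def warning_signals(series, window=5, jump_threshold=0.3, var_ratio=2.0):
    """Flag two kinds of early-warning signal in a crawled time series:
    (1) a rapid relative change between consecutive observations, e.g. a
        price spike driven by an emerging market for a high-value species;
    (2) an increase in variance between consecutive windows, the kind of
        statistical 'signal' that can precede a collapse.
    Thresholds here are illustrative, not calibrated values."""
    alerts = []
    for i in range(window * 2, len(series)):
        prev, cur = series[i - 1], series[i]
        if prev and abs(cur - prev) / prev > jump_threshold:
            alerts.append((i, "rapid price change"))
        old = pvariance(series[i - 2 * window:i - window])
        new = pvariance(series[i - window:i])
        if old > 0 and new / old > var_ratio:
            alerts.append((i, "variance increase"))
    return alerts

# A stable series that becomes volatile, then jumps.
series = [10, 10.2, 9.9, 10.1, 10, 10.1, 9.8, 10, 14, 13, 15, 11, 16]
print(warning_signals(series))
```

A real system would sit downstream of the crawler itself, feeding detectors like these from many informal sources at once; the hard part the authors highlight is the data mining, not the statistics.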

Further development needed

Although a promising start, Galaz and his co-authors stress the need for further research into the use of eco-monitoring web crawlers.

“We recognize that crucial challenges need to be addressed before a web crawler-based early warning system can contribute to the avoidance of abrupt ecosystem change,” added Dr Crona.

However, the authors conclude that existing successes in early detection of epidemic outbreaks with similar tools demonstrate the untapped potential and importance of making smarter uses of information posted on the World Wide Web.

Sending Out Internet Warnings For Outages, Viruses


ScienceDaily (Mar. 17, 2009) — A long-overdue internet early warning system for Europe could help the region avoid deliberate or inadvertent outages, reduce the spread of new computer viruses, and ensure continuity of services.


Malte Hesse and Norbert Pohlmann of the Institute for Internet Security at the University of Applied Sciences Gelsenkirchen, Germany, point out that there is a growing need to improve the stability and trustworthiness of the internet, whether one is referring to web access, email, instant messaging or file transfer systems.

They add that raising awareness of critical processes and components on the internet among those responsible for their operation is essential. Moreover, there is a need to learn about internet use so that needs and service demands can be best catered for.

The internet is an incredibly diffuse network with no single, centralised control hub. Its complexity is not bounded by geographical, political, administrative or cultural borders, which means it presents an amazing challenge to the global society hoping to make best use of it and avoid criminal and terrorist activity that might occur online.

The internet's strength lies in this decentralised structure, but that also represents a problem in that it is not governed and consists of almost 30,000 autonomous systems, each managed by an individual organisation, mostly within the private sector. The researchers obtained this figure using their AiconViewer tool developed by colleague Stefan Dierichs in 2006. Unfortunately, private organisations are exposed to a high level of competition, especially in times of recession, and this precludes the open exchange of important management information.

Nevertheless, if a European early warning system is to be built there is a need for a shift in attitude. "The cooperation of companies, organisations and governments is important to create a global view of the internet. By that we will be able to detect attacks in time and answer interesting research questions about the internet," the researchers say.

Early warning systems are a crucial component of effective risk management in enterprises and of national homeland security systems. In order to create a European early warning system, funding has to be provided mainly by public sources in combination with income which can be generated through added value for the private partners, the researchers conclude.


Journal reference:

  1. Malte Hesse and Norbert Pohlmann. European internet early warning system. International Journal of Electronic Security and Digital Forensics, 2009, 2, 1-17

Safer, Better, Faster: Addressing Cryptography’s Big Challenges


ScienceDaily (Dec. 15, 2008) — Every time you use a credit card, access your bank account online or send secure email, cryptography comes into play. But as computers become more powerful, network speeds increase and data storage grows, the current methods of protecting information are being challenged.


Once shrouded in secrecy, cryptography (using mathematical algorithms to secure, hide and authenticate data) has come out into the light in the current digital era. No longer restricted – in Western countries at least – by tight usage and export controls, cryptographers are now collaborating more extensively than ever before to create better algorithms and faster encryption methods to protect the vast volumes of data being generated by governments, businesses and citizens.

In many ways, European researchers are leading the way in addressing the big challenges facing the future of information and data security. “There are three big issues facing cryptographers,” says Bart Preneel, a professor at Katholieke Universiteit Leuven in Belgium and president of the International Association for Cryptologic Research. “Cost, speed and long-term security.”

The first two problems are closely interconnected, a consequence of the trend towards storing more information in more distributed systems, from the flash drives and smart cards in your pocket, to the computer in your home or the network at your office. Cost, in this sense, refers not only to the cost of hardware capable of robust encryption, but also the energy cost of running cryptographic processes on increasingly tiny, low-power devices. Cryptographic programmes also need to be faster if they are to secure the vast amount of information now being stored.

The 10 terabyte question

“In a few years we will have devices in our pockets with 10 terabytes of storage capacity – current methods are far too slow to encrypt that amount of data practically,” Preneel notes.
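Preneel's point survives a back-of-envelope check. Assuming an illustrative single-core software encryption throughput of 100 MB/s (an assumed round figure, not one quoted in the article), encrypting a full 10-terabyte device would take more than a day:

```python
# Rough arithmetic behind the "far too slow" claim.
TEN_TB = 10 * 10**12           # bytes in 10 TB (decimal terabytes)
THROUGHPUT = 100 * 10**6       # bytes per second (assumed figure)

hours = TEN_TB / THROUGHPUT / 3600
print(f"{hours:.1f} hours")    # roughly a full day of continuous encryption
```

Even generous hardware acceleration only shaves this to hours, which is why faster algorithms, not just faster chips, are on the research agenda.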

Time is also a problem in another sense. A lot of data being generated today will need to be kept secure for decades or even centuries to come, but history has shown that gains in computer processing power make it easier to crack cryptographic codes. Algorithms developed in the 1970s, for example, can now be readily broken by researchers.

“We may want to store medical information securely for a long time, not just for the duration of someone’s life, but in the case of DNA data for the lifetime of their children and grandchildren as well,” Preneel says.

Those challenges and others were addressed by an international network of researchers led by Preneel. With funding from the EU, the ECRYPT network of excellence brought together 32 leading research institutes, universities and companies to produce some of the most valuable contemporary research on cryptography, generating 10 percent of all papers and research articles in the information security field published worldwide over the last four years.

Structured into five core research areas, dubbed “virtual laboratories,” the researchers developed improved cryptographic algorithms, ciphers and hash functions, studied cryptographic protocols and implementation methods, and worked on more robust algorithms for digital watermarking.
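The hash functions studied in those virtual laboratories serve a simple purpose that a standard example makes concrete: a fixed-size fingerprint of data such that any change to the input changes the digest. This sketch uses SHA-256 from Python's standard library as a stand-in; it is not one of the ECRYPT designs:

```python
import hashlib

# A cryptographic hash yields a fixed-size digest of arbitrary input.
doc = b"attack at dawn"
digest = hashlib.sha256(doc).hexdigest()

# Even a one-byte change produces a completely different digest,
# which is what makes hashes useful for integrity checking.
tampered = hashlib.sha256(doc + b".").hexdigest()

assert len(digest) == 64        # 256 bits, written as 64 hex characters
assert digest != tampered       # the modification is detected
```

Much of the research effort goes into ensuring it stays computationally infeasible to find two inputs with the same digest, the property that gains in processing power keep eroding for older designs.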

Among their main achievements are eight new algorithms with the capacity to outperform AES, the Advanced Encryption Standard developed by two Belgian researchers in the 1990s and subsequently adopted by the US government to protect classified information. They also developed a new and improved method for creating cryptographic protocols based on game theory, and created lightweight cryptographic algorithms for use in low-power, low-computing-capacity devices such as smart cards and Radio Frequency Identification (RFID) tags.

Three competitions of the kind that sparked innovation in digital cryptography in the 1970s and 80s were also organised to find winning applications in the fields of stream ciphers, cryptographic software benchmarking and digital watermarking.

Towards real-world applications

The researchers’ work will all but certainly feed into commercial cryptographic applications over the coming years. A block cipher, for example, is due to be used on commercial RFID technology, while another application has been developed by Danish project partner Aarhus University for secure auctions in the agricultural sector.

Many of the researchers are continuing their work in a second project, ECRYPT II, which began in August 2008. Whereas ECRYPT received funding under the EU’s Sixth Framework Programme for Research (FP6, 2002-2006), the follow-up initiative is being funded under FP7 (2007-2013). The new project will deepen research in core areas that were addressed more broadly by the first initiative.

“We know that our studies have been read by banks, businesses and governments around the world, but because we made the information publicly available we don’t know how they are using it,” Preneel says.

Cryptography has not, therefore, shed its veil of secrecy entirely.

This is part one of a two-part series on ECRYPT.

Apple slates dates for WWDC, touts Snow Leopard, iPhone 3.0


Apple announced Thursday that its Worldwide Developers Conference will begin June 8, a date that reports had previously pegged as the most likely launch date for the company's next operating system, dubbed Snow Leopard.

WWDC, which will again be held in San Francisco at the Moscone Center, will run from June 8-12, according to an invitation Apple sent out today and the conference's Web site, which went live Thursday morning.


Those dates were predicted earlier this month by David Zeller, a writer for The Baltimore Sun. Zeller based his guess on an opening marked only as a generic "corporate meeting" on the Moscone Center's online calendar. He also speculated that Apple would hold its keynote on June 8 and use that time to unveil Snow Leopard, a.k.a. Mac OS X 10.6, a performance and stability upgrade that Apple promised last June would be available in about a year.

Apple has used the WWDC keynote as a platform to strut major news and new products. Last year, for example, CEO Steve Jobs revealed the new iPhone 3G, which Apple started selling in July.

It's not known whether Jobs will attend WWDC this year; when he announced in January that he was taking time off for health reasons, Jobs said his leave of absence would run through the end of June.

Although the WWDC Web site still lacks a schedule that details developer session topics, Snow Leopard will obviously play a central role. "WWDC's detailed technical sessions teach you how to take full advantage of new foundation technologies to ensure your application is ready and completely optimized for Mac OS X Snow Leopard," said the page dedicated to the new operating system.

"Turbocharge your application by using the new performance oriented frameworks, Grand Central Dispatch and OpenCL, to efficiently tap the vast gigaflops of computing power in the CPU and GPU," the page continued, referring to some of the already-touted improvements to Mac OS X, including one that will let software "steal" computing power from the graphics processor.

Apple's also planning to highlight iPhone 3.0, the just-previewed smartphone upgrade, at WWDC. "See how to add new features such as in-app purchases, peer-to-peer connection via Bluetooth, communication with hardware accessories, and alerts using Apple Push Notification service," said the conference site.

WWDC admission is priced at $1,295 through April 24, at which point it will climb to $1,595.

Computerworld is an InfoWorld affiliate.

Intel boosts speed of low-power processors


Intel is expected to refresh its line of laptop chips Monday with new ultra-low-voltage processors that should make ultraportable laptops operate faster without sacrificing battery life.

Intel currently offers ultra-low-voltage processors for fully functional thin and light laptops, such as Apple's expensive MacBook Air and Lenovo's ThinkPad X300. These chips fit into small spaces and draw less power than conventional laptop chips. The chips are about the size of a dime, or 60 percent smaller than mainstream laptop chips.


Laptops with the new chips should run applications faster while drawing the same power as their predecessors. The chips will run at clock speeds of up to 1.6GHz, a speed bump from earlier chips that topped out at 1.4GHz. Depending on the applications, existing ultraportable laptops with ULV chips run anywhere from four to seven hours.

"There should be no difference to battery life with these speed bumps," said a source familiar with Intel's plans.

The chips will be available on Monday, according to the source. They will be a part of Intel's Montevina mobile platform, which bundles mobile laptop chips and wireless capabilities into laptops.

The refresh comes ahead of Intel's planned release of new ultra-low-voltage processors for inexpensive laptops, which are due in the second quarter as part of the updated Montevina Plus platform.

The new chips include the dual-core Intel Core 2 Duo SU9600 chip, which will run at 1.60GHz, include 3MB of cache and draw 10 watts of power, according to an Intel document seen by IDG News Service. It will be priced at $289 for 1,000 units. The single-core Intel Core 2 Solo SU3500 chip will run at 1.40GHz, with 3MB of cache. It will be priced at $262.

Intel is also releasing faster low-voltage (LV) processors as part of the laptop chip refresh. The Intel Core 2 Duo SP9600 dual-core chip will run at 2.53GHz, an improvement from previous LV chips that ran at up to 2.40GHz speed, according to the document. The chip will include 6MB of cache, draw 25 watts of power and be priced at $316.

Also being launched is the SL9600 Intel Core 2 Duo, which will run at 2.13GHz, include 6MB of cache and draw 17 watts of power. It will be priced at $316, according to the document.