Category Archives: Security Content Management (SCM)

Research confirms that Facebook, Twitter and Google are spying on users


big brother (Photo credit: Vince_Lamb)

Social networks: can robots violate user privacy?

ECM Plus +++ Recent news in the international media has revealed numerous Internet privacy concerns that deserve attention and further investigation, which is why Geneva-based High-Tech Bridge decided to conduct a simple technical experiment to verify how the 50 largest social networks, web services and free email systems respect – or indeed abuse – the privacy of their users.

According to High-Tech Bridge, the experiment and its results can be reproduced by anyone, as the company says it tried to be as neutral and objective as possible. Continue reading

Leave a comment

Filed under Analysis, Analytics & Metrics, Business Risk, Compliance, Content Curation, Content Governance, Content Management, Content management system (CMS), Content Protection, Content Security, Corporate Civic Responsibility (CCR), Corporate Governance, Customer Relations Management (CRM), Data mining, Data privacy, Data protection, Data storage, Digital asset management, Enterprise Content Management, GRC (Governance, Risk & Compliance), Industry News, Information Governance, Information Management, Information security, Intellectual Property (IP), Intelligent Search, Legal, Policy Management, Regulatory Compliance, Reporting, Risk Analysis, Risk Assessment, Risk Management, Security Content Management (SCM), Social Content Management, Trusted Cloud, UGC - User-generated content, WCM

Mobile Content Revenues To Hit US$65 billion by 2016 – report


A variety of laptops, smartphones, tablets and ebook readers arranged. (Photo credit: Wikipedia)

Juniper Research says tablet purchases, direct carrier billing implementation to fuel growth

By ECM Plus staff

ECM Plus +++ Annual revenue generated from content delivered to mobile handsets and tablets is set to rise by nearly US$25bn over the next three years, reaching US$65bn by 2016, according to a new report from Juniper Research. Continue reading

Leave a comment

Filed under 3GPP LTE, 4G, Analysis, Billing & Customer Care Systems (BCCS), Content Delivery, Content Monetization, Content Syndication, LTE, Mobile Content, Security Content Management (SCM), Telecommunications

Encrypted web moves step closer with HTTPS Everywhere 3.0


The Electronic Frontier Foundation backing HTTPS Everywhere 3.0 web site encryption

EFF partners boost HTTPS Everywhere 3.0, now protecting 1,500 more sites

By ECM Plus staff

ECM Plus /London/ +++ The Electronic Frontier Foundation has stated its long-term mission to ‘encrypt as much of the Web as possible’.

The Foundation said it now hopes to encrypt all of it.

According to the EFF, HTTPS Everywhere, the browser extension it produces with the ‘Tor’ Project and a community of volunteers, is now used by more than 2.5 million people around the world. Continue reading
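The idea behind HTTPS Everywhere is simple: per-site rulesets rewrite plain `http://` requests to their `https://` equivalents before they leave the browser. The real extension expresses its rules in XML; the sketch below is a minimal, illustrative Python version of the same rewrite idea, with made-up example rules rather than the extension's actual ruleset.

```python
import re

# Illustrative rules only – the real HTTPS Everywhere rulesets are
# XML files maintained by the EFF and community volunteers.
RULES = [
    (re.compile(r"^http://(www\.)?example\.com/"), r"https://\1example.com/"),
    (re.compile(r"^http://mail\.example\.org/"), r"https://mail.example.org/"),
]

def upgrade(url: str) -> str:
    """Return an HTTPS version of the URL if a rule matches, else unchanged."""
    for pattern, replacement in RULES:
        if pattern.match(url):
            return pattern.sub(replacement, url, count=1)
    return url
```

A URL matching a rule is upgraded; anything without a rule passes through untouched, which is why coverage grows site by site as volunteers contribute rulesets.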

Leave a comment

Filed under Content Security, Enterprise Content Management, Industry News, Information security, Open Source, Security Content Management (SCM)

Hearing to probe cellular data legal status


A cell phone tower in Palatine, Illinois, USA.

Foundation fights for cellular phone users’ privacy in Tuesday hearing

By ECM Plus staff

ECM Plus +++ The Electronic Frontier Foundation will urge a federal appeals court on Tuesday to recognize cellular phone users’ privacy rights and require that the government obtain a warrant before collecting cell phone location information.

According to the Foundation, the oral argument is set for 9 a.m. on October 2nd in New Orleans.

At issue in Tuesday’s hearing are government requests for judicial orders authorizing the disclosure of 60 days of location data from two separate cell phone companies as part of a routine law enforcement investigation. Continue reading

Leave a comment

Filed under 3GPP LTE, 4G, Content Protection, Content Security, Data privacy, Data protection, Industry News, Legal, LTE-Advanced, M-DMC (Mobile Digital Media Controllers), M-DMD (Mobile Digital Media Downloaders), M-DMP (Mobile Digital Media Players), M-DMS (Mobile Digital Media Servers), M-DMU (Mobile Digital Media Uploaders), M2M (Mobile-to-Mobile), Mobile Analytics, Mobile communication, Mobile Content, Security Content Management (SCM), Software, WirelessMAN-Advanced

Foundation sues over email and phone hacking allegations


Electronic Frontier Foundation

EFF says government ‘withholding information’ about ‘unconstitutional spying’

Washington, D.C. – The Electronic Frontier Foundation (EFF) sued the Department of Justice (DOJ) today, demanding answers about illegal email and telephone call surveillance at the National Security Agency (NSA). Continue reading

Leave a comment

Filed under Collaboration, Compliance, Content Governance, Content Protection, Content Security, Corporate Civic Responsibility (CCR), Corporate Governance, Customer Relations Management (CRM), Data Governance, Data mining, Data privacy, Data protection, Data storage, GRC (Governance, Risk & Compliance), Industry News, Information Governance, Information Management, Information security, Internal Controls, Policy Management, Regulatory Compliance, Risk Management, Security Content Management (SCM), Segregation of Duties (SoD), Software, Telecommunications, Web governance

Free DDoS survival guide on how to keep content alive


DDoS – Denial of Service (Photo credit: -= Treviño =-)

New guide from EFF – Keeping Your Site Alive – outlines how to keep content online in case of a Denial of Service Attack

By ECM Plus staff

ECM Plus /London/ +++ A new online guide from the Electronic Frontier Foundation outlines how website operators can fend off DDoS attacks and keep their sites alive and accessible. DDoS – Distributed Denial of Service – attacks, which flood websites with traffic in order to make them unavailable, have become an increasingly popular way to take down or block Internet content, EFF said. Continue reading

Leave a comment

Filed under Content Management, Content Security, Enterprise Content Management, Industry News, Security Content Management (SCM), WCM, Web accessibility, Web Performance Optimization (WPO), Web publishing management

Hampshire home for ‘Cyber Kill Chain’ centre


Howarth opens new intelligence centre in Farnborough, England. Image: stumayhew

“Minister For War” opens Lockheed Martin’s new cybersecurity intelligence centre in Hampshire

By ECM Plus staff

ECM Plus +++ Lockheed Martin has just opened its first Security Intelligence Centre in Farnborough, England, which the company said would extend its ‘global reach’ and would be ‘augmenting facilities in the United States’.

The new Farnborough centre was opened in the presence of Conservative Party Member of Parliament The Right Honourable Gerald Howarth representing the constituency of Aldershot, also in Hampshire, England. Continue reading

Leave a comment

Filed under Analytics & Metrics, Content Security, Industry News, Information Management, Information security, Intelligent Search, Knowledge Management, Security Content Management (SCM)

Flaws identified in AES encryption


AES: Drowning or waving?

Researchers identify first flaws in the Advanced Encryption Standard

By ECM Plus staff

ECM Plus +++ Researchers have found a weakness in the AES algorithm.

The cryptanalysts came up with a clever new attack that can recover the secret key four times faster than experts had anticipated. The attack is the result of a long-term cryptanalysis project carried out by Andrey Bogdanov (K.U.Leuven, visiting Microsoft Research at the time of obtaining the results), Dmitry Khovratovich (Microsoft Research), and Christian Rechberger (ENS Paris, visiting Microsoft Research). Continue reading
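To put "four times faster" in perspective: a fourfold speed-up shaves about two bits off the effective key length, taking AES-128 key recovery from roughly 2^128 operations down to roughly 2^126 – figures consistent with the researchers' published results. The back-of-envelope arithmetic below shows why this remains far from a practical break:

```python
# Rough arithmetic behind "four times faster" – an illustrative sketch,
# using round figures (2**128 brute force vs ~2**126 for the new attack).
brute_force = 2 ** 128
new_attack = 2 ** 126

speedup = brute_force / new_attack   # fourfold improvement
bits_lost = 128 - 126                # effective key length shrinks by 2 bits

# Even assuming a (very generous) 10**12 key trials per second,
# recovery would still take an astronomically long time:
seconds = new_attack / 10 ** 12
years = seconds / (60 * 60 * 24 * 365)
```

In other words, the result is academically significant but does not threaten AES in practice, which is why the researchers themselves did not call for the standard to be replaced.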

Leave a comment

Filed under Business Risk, Content Protection, Content Security, Industry News, Information security, Regulatory Compliance, Risk Analysis, Risk Assessment, Risk Management, Security Content Management (SCM)

Building and contents protected by insurance after riots

Protesters fighting police in Nørrebro, Copenhagen

Insurers: claim period to be extended

Association of British Insurers provide guidance on legal position of riot compensation

By ECM Plus staff

ECM Plus +++ According to the Association of British Insurers, on current estimates insured losses and damage suffered by individuals and British businesses are likely to be well over £100 million. Continue reading

Leave a comment

Filed under Asset management, Business Continuity, Business Risk, Content Protection, Content Security, Contingency Planning, GRC (Governance, Risk & Compliance), Industry News, Risk Analysis, Risk Assessment, Risk Management, Security Content Management (SCM)

Joomla releases 1.7, revs up open source cycle


Joomla! unveils CMS 1.7 with security tools

By ECM Plus staff

ECM Plus +++ Open source content management software specialist Joomla has just taken the wraps off the latest version 1.7 of its CMS offering.

The main updates in this version include search options, a defence against form manipulation and language-specific font settings. Continue reading

Leave a comment

Filed under Content management system (CMS), Enterprise Content Management, Open Source, Security Content Management (SCM), Vendor News, Web Content Management

Fourth amendment adds encryption protection for PDFs


WebSearch claims to be easier to use than other search engines. Image: Edgeworks Ltd.

New search and retrieval capabilities for encrypted PDF documents

By ECM Plus staff


ECM Plus +++ New protection capabilities in the latest WebSearch v4.0 component of DocuLex’s Archive Studio Content Management Suite can now search documents and retrieve content in real time from password-protected PDF files. Continue reading

Leave a comment

Filed under Content Management, Content Security, Enterprise Search, Search, Security Content Management (SCM), Vendor News

Skype hack: Breach could have been prevented – StarForce


US$8.5 billion VoIP service hacked

Hack could have been delayed or prevented says StarForce

By ECM Plus staff

ECM Plus +++ The recent high-profile hacking of Skype could have been prevented – or at least delayed – by better software protection, according to copy protection software specialist StarForce Technologies.

According to the company, Russian hacker Efim Bushmanov has cracked Skype’s protocol and data encryption mechanisms and has made his findings publicly available. Continue reading

Leave a comment

Filed under Content Protection, Content Security, Data privacy, Data protection, Industry News, Security Content Management (SCM)

RFID hacked: TV station finds chips are toast


RFID chip pulled from a credit card

Television group reveals serious security flaws with RFID-equipped credit and debit cards

By ECM Plus staff

ECM Plus +++ A consumer television report into the insecurity of radio frequency ID (‘RFID’)-equipped credit and debit card chips has shocked the security industry.

One vendor, SecurEnvoy, said the apparent ease with which RFID has been hacked to create a ‘magic wand’ able to read cards at a distance clearly demonstrates that RFID is no longer fit for purpose, observers say. Continue reading

1 Comment

Filed under Content Security, Industry News, Mobile Analytics, Mobile Apps, Mobile communication, Mobile Content, Risk Assessment, Risk Management, Scanning, Security Content Management (SCM), Telecommunications

Upcoming UK Census data offshoring destined for more debacles


ONS offshoring Britons’ personal data for £150m

Report reveals 2011 UK Census of private data set to go offshore to foreign agencies, increasing probability of yet more Government-sanctioned personal data breaches

ECM Plus – A new Guardian report indicates that the British Government’s Office for National Statistics is to pay a foreign agency £150 million in taxpayer money to collect and process data on Britain’s families and personal incomes. New questions in this year’s 32-page Census questionnaire demand details of sources of income as well as additional personal data, all now destined to go offshore under the government’s cutbacks and ‘efficiency-savings’ outsourcing deal. Continue reading

Leave a comment

Filed under Business Intelligence (BI), Content Categorisation, Content Fingerprinting, Content Management, Content Monetization, Content Protection, Content Security, Data centres, Data Governance, Data privacy, Data protection, Data storage, Document archiving & retrieval, Document Management, Document scanning & imaging, Forms management, processing, eForms, Information Governance, Information security, Records Management, Scanning, Search, Security Content Management (SCM)

FEATURE: Key considerations for ‘cloudsourcing’ contracts – CAMM


Cloud computing sample architecture

BY JOHN WALKER

Economics dictates that the CEO and CFO must balance delivering security and quality operational services with reducing the organisational and operational costs of delivering the business’s mission.

One opportunity in focus is ‘CloudSourcing’, where, depending on its size and type, a business may consider a contract that places part or all of its operations into the hands of a Cloud Provider, be this SaaS, PaaS, IaaS or any other ‘Anything-as-a-Service’ that accommodates the operational model. Continue reading

Leave a comment

Filed under Business continuity, Cloud Computing, Compliance, Consultancy/Consulting/Systems Integration, Content Security, Data centres, Disaster Recovery, Features, High Availability, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), Information Governance, Information Management, Information security, PaaS (Platform-as-a-Service), Private Cloud, Public Cloud, Regulatory Compliance, Risk Management, SaaS (Software-as-a-Service), Security Content Management (SCM), SIP, Telecommunications, Virtualization, VPS Cloud

FEATURE: Cloud computing – the calm before the storm


Sourcefire. Picture: joelesler

Enterprises across the world are hunting down the best way to scale their computing capability. Finding ways to work smarter has become increasingly important in today’s cost-controlled market. IT departments searching for a solution often demand that the infrastructure has to be quick, cheap and dynamic and this is one of the reasons that cloud computing is being touted as a potential corporate game changer.

BY LEON WARD

Cloud Computing has been described as, arguably, the third revolution of IT, following the Personal Computer and Internet revolutions. But like most revolutions, progress towards widespread acceptance of the new regime is likely to take some time, amidst suspicion, a lack of confidence, wise skepticism and some false starts.

Many CIOs are in the process of moving applications and services into the Cloud. Some are considering Cloud-based computing for economic reasons, while others are looking to create new dynamic IT services. Regardless of the reasons, many organisations contemplating a move to a Cloud environment are forgetting a potentially fatal element: security. Before an IT director can make a clear, sensible decision about a future Cloud strategy, let’s investigate where some risks lie and work out where responsibility and accountability fall.

Ensuring a security evaluation is undertaken is a ‘must do’. Never simply assume that a service provider’s security is up to scratch; it must be checked. Matt Watchinski, director of Sourcefire’s Vulnerability Research Team, endorses this view. He says that as more and more enterprises and organisations move their applications to SaaS platforms, some provider is bound to fail miserably: we haven’t yet seen the major compromise, but the risk has to be on the horizon. So with storm clouds ahead, who is going to be in the dock when there is a failure? Accountability needs to be clearly understood. Businesses using these types of services need to make sure they understand who is responsible for fixing problems when they crop up, and who is legally accountable for data loss. Outsourcing your data to the Cloud does not equate to outsourcing the risk: even if your Cloud provider was responsible for the loss of your customers’ data, you could still find yourself accountable.

The impact of failure

Serious failures within a cloud infrastructure can have repercussions that reach much further than a single enterprise. Last year, after a major server outage, thousands of users of the Sidekick mobile phone and messaging service were warned that their personal data and photos had “almost certainly been lost”. Over a week later Microsoft, owner of cloud-computing provider Danger, confirmed that it had managed to recover “most, if not all” of the customer data. This example publicly highlights the potential danger of entrusting personal data to the cloud. It doesn’t mean there is a major design flaw in the Cloud-Computing concept – the failure was implementation-specific – but it negatively impacts confidence in the whole market.

On the positive side, Cloud service providers typically have more resources to put into security and reliability than most businesses, and far more than a small business. Where would you rather your sensitive client and internal data was stored? Public clouds advertise robust, highly physically secure data centres, and there should additionally be a team of on-site security experts focused on protecting the stored information. Compare this to the alternative of the data sitting on a laptop that is continually moving around and being accessed in different locations. The data centre now seems the smart choice, but don’t forget you are handing your information over to someone else, and therefore losing direct control over it.

Compliance matters

Those considering a move to the Cloud need to consider how their market is regulated. Strict codes of conduct apply to many businesses and in some cases, regulations might stipulate that personal data has to remain within a specific country thus ruling out the use of certain providers who distribute data globally. In some situations the storage and processing of information away from a user or the enterprise is seen as a real advantage, a good example of this would be in a government, military or other high-security environment. Because of this advantage I expect to see some near-term implementations of Government controlled and designed community Cloud infrastructures. If those who are accountable for potential data loss are in control of the Cloud constructed to protect it, many of my concerns dissipate and central responsibility can be re-established around critical information that has traditionally been distributed. Imagine a world where DVDs of sensitive data are no longer lost in the post; they are simply re-referenced within the Cloud.

Make sure your house is in order

If the idea of storing and working with your critical data in a shared external infrastructure looks attractive in terms of cost, it is clear that some research needs to be undertaken before looking for a provider.

Firstly, you need to prepare a list of mandatory security controls that you demand around the data you consider most sensitive, and then come up with suggestions for how a provider could demonstrate these controls to you in action. Only then start to research the providers that believe they can meet the demands you place on your data. This should be part of any due diligence process. As the service consumer you should be in control of your data wherever it is, and you should be able to demand that any provider prove its security capability, as it is likely that you will ultimately be accountable for a breach. Find out whom you call if there is a problem, and what level of service you can expect; in times of crisis you need guarantees that the response will be prompt. The Cloud provider needs to be transparent.

If you have performed in-depth research before looking at service offerings you should understand the problems that face Cloud providers. Never be scared to call foul when you see a complex problem with an over simplified solution. It’s a cliché, but if it sounds too good to be true, it probably is. Always make sure you keep the horror show that is accountability in mind. Out of sight should never mean out of mind.

Leon Ward is Senior Security Engineer, Sourcefire – www.sourcefire.com

 

1 Comment

Filed under Cloud Computing, Compliance, Content Security, Data Governance, Data storage, Features, IaaS (Infrastructure-as-a-Service), Information Governance, Information security, PaaS (Platform-as-a-Service), Security Content Management (SCM)

Open Text’s ECM and SharePoint are a DoD cert


Open Text's records management is a DoD cert

OpenText obtains DoD 5015.02-STD certification for ECM and SharePoint

BY PAUL QUIGLEY

ECM Plus – OpenText said components of its ECM Suite 2010, together with SharePoint Server 2010, have met U.S. Department of Defense 5015.02-STD certification requirements for records management.

According to Open Text, this encompasses unstructured content including physical, electronic and email records.

The company said it had certified against every version of the DoD standard since its inception more than a decade ago and had completed the test process over twenty times on various platforms and versions.

Managed by the Joint Interoperability Test Command (JITC), DoD 5015.02-STD certification involves a rigorous testing process. OpenText had its Records Management and its Application Governance and Archiving for SharePoint offerings tested in an environment that included SharePoint 2010 and a full set of Microsoft operating systems, office applications and database servers.

Through the test, OpenText said it met requirements for applying records management policies and securely managing the entire lifecycle of the information.

Leave a comment

Filed under Content Security, Enterprise Content Management, Records & Information Management (RIM), Records Management, Security Content Management (SCM), Vendor News

JavaScript code cuts bloggers off from the Web

EFF calls for immediate action to defend Tunisian activists against government cyberattacks

Commentary by Eva Galperin

Demonstrations and protests over unemployment and poor living conditions have been ongoing in Tunisia since the beginning of December, but last week the Tunisian government turned up the heat on bloggers, activists, and dissidents by launching a JavaScript injection attack that siphoned off the usernames and passwords of Tunisians logging in to Google, Yahoo, and Facebook.

The Tunisian government has used these stolen credentials to log in to Tunisians’ email and Facebook accounts, presumably downloading their messages, emails, and social graphs for further analysis, and then deleting the accounts entirely. Among the compromised accounts are Facebook pages administered by a reporter with Al-Tariq ad-Jadid, Sofiene Chourabi, video journalist Haythem El Mekki, and activist Lina Ben Khenni.

Unsatisfied with merely quelling online freedom of expression, the Tunisian government has used the information it obtained to locate bloggers and their networks of contacts.

By late last week, the Tunisian government had started arresting and detaining bloggers, including blogger Hamadi Kaloutcha and cyberactivist Slim Amamou, who alerted the world to his whereabouts at the Tunisian Ministry of the Interior using Google Latitude.

This weekend, Tunisian citizens began to report on Twitter and in blogs that troops were using live ammunition on unarmed citizens and started communicating with one another to establish the numbers of dead and injured.

Most notably, Tunisians have been posting videos of the protests, including of the dead and wounded, on Facebook – the only video-sharing site not currently blocked by the Tunisian government – which makes access to Facebook especially important for the protest movement.

Because of the Tunisian government’s attacks on citizens’ login credentials, Tunisians should take the following steps to protect themselves: if HTTPS is available, use HTTPS to login to Facebook, Google, and Yahoo. If you are using Firefox, EFF’s HTTPS Everywhere plug-in will do this for you automatically.

EFF has received reports that the Tunisian government is periodically blocking HTTPS access to Facebook, Google, and Yahoo. If that is the case and you must login over HTTP, install the following Greasemonkey script to strip out the JavaScript which the Tunisian government has inserted to steal your login credentials. If you have logged in to Facebook, Google, or Yahoo recently over HTTP, login using HTTPS and change your password.
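The attack described above worked by injecting extra JavaScript into login pages served over plain HTTP. One rough way to notice that kind of tampering is to list the external script origins a page actually loads and compare them against the origins you expect. The sketch below uses only the Python standard library; the allowlist and page content are hypothetical, and a real check would need to run on the raw bytes received over the wire.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ScriptAudit(HTMLParser):
    """Collect the origins (hostnames) of external <script src=...> tags."""
    def __init__(self):
        super().__init__()
        self.origins = set()

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.origins.add(urlparse(src).netloc)

def unexpected_scripts(html, allowed):
    """Return script origins not on the (hypothetical) allowlist."""
    audit = ScriptAudit()
    audit.feed(html)
    return audit.origins - set(allowed)
```

A page that suddenly pulls JavaScript from an origin you have never seen before is a strong hint that something between you and the site is rewriting the response – which is exactly the class of attack that serving login pages over HTTPS prevents.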

Additionally, EFF calls on Google, Yahoo, and Facebook to take action to protect the privacy of their users by alerting them to the potential compromise of their accounts and encouraging them to take the above steps.

Finally, Facebook has reported that it is in the process of taking technical steps to protect the privacy of its users. We hope that they include the following:

• Make Facebook logins default to HTTPS, if only in Tunisia, where accounts are especially vulnerable at this time. Google and Yahoo logins already default to HTTPS.

• Consider allowing pseudonymous accounts for users in authoritarian regimes, where political speech under your real name is dangerous and potentially deadly. Many Tunisian activists are unable to reinstate Facebook accounts that have been erased by the Tunisian government because they were not using their real names.

Websites providing services to Tunisian citizens cannot afford to sit on the sidelines while the Tunisian government launches malicious attacks on the privacy of users and censors free expression. Facebook, Google, and Yahoo should take these concrete steps as quickly as possible to inform and better protect their users.

Source: EFF

Leave a comment

Filed under Collaboration platform, Content Security, Enterprise Social Software (ESS), Industry News, Reporting, Security Content Management (SCM)

FEATURE: Getting To Grips With Data Classification


It has to be done, so where do you start, where do you stop, and what do you do in the meantime? Sean Glynn, VP Product Development, Credant Technologies, investigates.

Data classification is not a new concept. It is a fundamental requirement for information security, and the consequences for failing to fully implement a data classification scheme can be disastrous. Nevertheless, while many organisations start data classification projects they all too often struggle to complete them. This article outlines what should be included in a data classification project, examines why many fail to get off the ground and the steps companies should take to protect sensitive data while it’s in progress.

Data classification essentially means assigning a level of sensitivity to data used by an organisation, and it forms a critical component of Information Lifecycle Management (ILM). While classification systems vary from country to country, and indeed organisation to organisation, most have levels corresponding to the following general definitions (from the highest level to lowest): top secret; secret; confidential; restricted (or sensitive); and unclassified.
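The levels just described form an ordered hierarchy, which is what makes classification useful for access decisions: a reader's clearance can simply be compared against a document's label. A minimal sketch of that ordering (the level names follow the article; the numeric values and the `may_access` rule are illustrative, not any particular standard):

```python
from enum import IntEnum

class Classification(IntEnum):
    """General classification levels, lowest sensitivity first.
    Values are arbitrary; only their relative order matters."""
    UNCLASSIFIED = 0
    RESTRICTED = 1
    CONFIDENTIAL = 2
    SECRET = 3
    TOP_SECRET = 4

def may_access(clearance: Classification, label: Classification) -> bool:
    """A reader may access a document only if cleared at or above its label."""
    return clearance >= label
```

Modelling the levels as an ordered type rather than free-text tags is what lets downstream tools (search, archiving, endpoint encryption) enforce policy mechanically instead of case by case.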

While computer programs exist that can help with data classification, ultimately it is a subjective business and is often best done as a collaborative task that considers business, technical, and other points-of-view. Different departments within an organisation all need to be consulted and will have different views on what is, and isn’t, sensitive and how it is best protected. An additional aspect to consider is whether a document that is confidential today will remain so for the duration of its life.

For example, a public company’s financial results will be extremely sensitive prior to announcement yet, once in the public domain, confidentiality is no longer an issue.

With so many people involved in the decision process, and the constantly changing status of information, it is easy to see what causes delays or even the complete downfall of many data classification projects.

Practical Tips for Implementing a Data Classification Scheme
With these challenges identified, we’ve outlined some practical approaches to implementing a data classification scheme to help you get started:

Understand what is realistically achievable: If you’ve ever tried to do everything at once you’ll recognise that inevitably nothing gets done and the same is true with data classification. That said, it is equally true that something is better than nothing. By breaking the project down into smaller, targeted and manageable pieces with regular reviews and implementation targets, you will start to chisel away at the task.

Set the bar at a realistic height: There are varying degrees of discipline and compliance with a data classification project. Unfortunately, not every organisation is lucky enough to have a completely disciplined workforce so, if pushback is likely, opt for a simpler scheme rather than one that is overly regimented or complex and so likely to cause resistance among users.
 
Keep your friends close and your enemies closer: Regardless of how rigid or simplistic your control strategy is, it is going to need support from others within the organisation if it’s to be accepted and embraced. By consulting with key individuals early on in the process, and ensuring they feel part of its design and introduction, the project is less likely to receive hostility during its implementation.
 
Approve the data classification strategy asap, even if full implementation is delayed. First, it costs nothing at this stage; secondly, any new systems can be designed with data classification in mind, narrowing the implementation burden to existing systems; and finally, if confidential information is inadvertently disclosed, the security program can point to the classification strategy and push accountability to the line of business managers that have not yet implemented it.
 
Use regulation to argue your case: Increased legislation is one of the most effective tools that can be used by a security program. Reference these regulations to bring awareness of the need for data classification and give the security program the necessary muscle and support to get implemented.

Classify networks instead of data: For organisations where classification of data appears to be an unreachable goal, try classifying the networks instead of the data.  Whilst network classification is not a trivial exercise, it is often easier than the implementation of a comprehensive data classification scheme for data that is digitally stored in large organisations.

Something is better than nothing: While you’re going through the process of identifying your sensitive data and how best to protect it, it will quickly become clear if you have sensitive data that needs protecting. A comprehensive endpoint data encryption solution, protecting data where it resides on laptops, desktops, smartphones and the now ubiquitous USB Thumb Drives everyone seems to use, is an important tool that can be rolled out across the organisation, even before a data classification project is completed, and can then be utilised moving forwards.

However, be warned: not all encryption solutions offer the same protection. Ideally, you need something that:

- can be rolled out, managed and maintained centrally

- is user-specific, not device-dependent, so that even if a PC is shared, each user's data isn't

- is enforced, so users cannot circumvent its use

- covers all forms of data, regardless of the program in which it is created, the network where it resides, or the device it is carried on

- does not impede the device's performance
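As an illustration only, the checklist above can be expressed as a simple product evaluation; every name below is hypothetical, not a real product API:

```python
# Hypothetical sketch: scoring an endpoint-encryption product against the
# article's checklist. Field names are illustrative assumptions only.
from dataclasses import dataclass, fields

@dataclass
class EncryptionProduct:
    centrally_managed: bool   # rolled out, managed and maintained centrally
    per_user_keys: bool       # user-specific, not device-dependent
    policy_enforced: bool     # users cannot circumvent it
    covers_all_data: bool     # any program, network or device
    low_overhead: bool        # does not impede device performance

def meets_checklist(product: EncryptionProduct) -> bool:
    """True only if the product satisfies every requirement."""
    return all(getattr(product, f.name) for f in fields(product))

candidate = EncryptionProduct(True, True, True, True, False)
print(meets_checklist(candidate))  # False: performance impact rules it out
```

The point of the all-or-nothing check is the article's own: a solution that fails any one of these requirements leaves a gap users will find.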

There is no shortcut to faster data classification, but there are solid arguments for undertaking it, and undertaking it correctly. While it is true that information can't be adequately protected if there's no way of tracking its location, value and sensitivity to leakage, it is equally true that data waiting to be rated is vulnerable to exploitation. If you knew you had valuables somewhere in the building, you would install an alarm system and make sure entry and exit points were secured – shouldn't you at least do the same for your data?

Classification Definitions

Top Secret: The highest level of classification of material on a national level. Such material would cause “exceptionally grave damage” if made publicly available;

Secret: Such material would cause “grave damage” if it were publicly available;

Confidential: Such material would cause “damage” or be “prejudicial” if publicly available;

Restricted: Such material would cause “undesirable effects” if publicly available. Some countries do not have such a classification;

Unclassified: Technically not a classification level, but is used for documents that do not have a classification listed above.
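For illustration, the hierarchy above maps naturally onto an ordered type; a minimal sketch (the names and the access rule are assumptions for this example, not any particular government's standard):

```python
# Illustrative model of the classification hierarchy described above.
# IntEnum gives a natural ordering: higher value = more sensitive.
from enum import IntEnum

class Classification(IntEnum):
    UNCLASSIFIED = 0   # technically not a classification level
    RESTRICTED = 1     # "undesirable effects" if disclosed
    CONFIDENTIAL = 2   # "damage" or "prejudicial"
    SECRET = 3         # "grave damage"
    TOP_SECRET = 4     # "exceptionally grave damage"

def may_access(clearance: Classification, document: Classification) -> bool:
    """A reader may open a document at or below their clearance level."""
    return clearance >= document

print(may_access(Classification.SECRET, Classification.CONFIDENTIAL))  # True
print(may_access(Classification.RESTRICTED, Classification.SECRET))    # False
```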

www.credant.com

Leave a comment

Filed under Content Categorisation, Content Security, Document Management, Features, ILM (Information Lifecycle Management), Reporting, Security Content Management (SCM)

HTTPS Everywhere anti-FireSheep security gaining traction


EFF tool provides protection from ‘Firesheep’

ECM Plus – The Electronic Frontier Foundation has just launched a new version of ‘HTTPS Everywhere’, a free security tool with enhanced protection for the Mozilla Firefox web browser against so-called “Firesheep” and other exploits of webpage security flaws.

According to the Foundation, HTTPS secures web browsing by encrypting both requests from the Firefox browser to websites and the resulting pages that are displayed. Without HTTPS, online reading habits and activities are vulnerable to eavesdropping, and accounts are vulnerable to hijacking.

The Foundation's report stated that while many sites on the web offer some limited support for HTTPS, it is often difficult to use: websites may default to the unencrypted, and therefore vulnerable, HTTP protocol, or may fill HTTPS pages with insecure HTTP references. The HTTPS Everywhere tool uses carefully crafted rules to switch sites from HTTP to HTTPS.
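HTTPS Everywhere's rules are, in essence, curated regular-expression rewrites from http:// URLs to their https:// equivalents. A minimal illustration of the idea (the real extension ships per-site XML rulesets, not this code, and the rule below is hypothetical):

```python
# Sketch of the rewrite idea behind HTTPS Everywhere rulesets:
# a from-pattern and a to-template per site.
import re

# Hypothetical rule modelled on the ruleset format.
RULES = [
    (re.compile(r"^http://(www\.)?example\.com/"), "https://www.example.com/"),
]

def rewrite(url: str) -> str:
    """Return the HTTPS form of a URL if a rule matches, else the URL unchanged."""
    for pattern, target in RULES:
        if pattern.match(url):
            return pattern.sub(target, url, count=1)
    return url

print(rewrite("http://example.com/login"))  # https://www.example.com/login
print(rewrite("http://other.org/"))         # unchanged: no rule matches
```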

The new free version of HTTPS Everywhere responds to growing concerns about website vulnerability in the wake of Firesheep, an attack tool that enables an eavesdropper on a network to take over another user’s web accounts, on social networking sites or webmail systems, for example, whenever the browser’s connection to the web application either does not use cryptography or does not use it thoroughly enough. Firesheep, released in October as a demonstration of a vulnerability that computer security experts have known about for years, sparked a flurry of media attention.

“These new enhancements make HTTPS Everywhere much more effective in thwarting an attack from Firesheep or a similar tool,” commented The Foundation’s senior staff technologist Peter Eckersley. “It will go a long way towards protecting your Facebook, Twitter, or Hotmail accounts from Firesheep hacks. And, like previous releases, it shields your Google searches from eavesdroppers and safeguards your payments made through PayPal.”

Other sites targeted by Firesheep that now receive protection from HTTPS Everywhere include Bit.ly, Cisco, Dropbox, Evernote, and GitHub.

In addition to the HTTPS Everywhere update, the Foundation also released a guide to help website operators implement HTTPS. “Firesheep works because many websites fail to use HTTPS,” said technology director at the Foundation, Chris Palmer. “Our hope is to make it easier for web applications to do the right thing by their users and keep us all safer from identity theft, security threats, viruses, and other bad things that can happen through insecure HTTP. Taking a little bit of care to protect your users is a reasonable thing for web application providers to do and is a good thing for users to demand.”

The first beta of HTTPS Everywhere was released last June. Since then, the tool has been downloaded more than half a million times.

To download HTTPS Everywhere for Firefox:
https://www.eff.org/https-everywhere

For more on implementing HTTPS in websites:
https://www.eff.org/pages/how-deploy-https-correctly

Leave a comment

Filed under Content Management, Content Security, Industry News, Information security, Security Content Management (SCM), Web compliance, Web Content Management, Web Experience Management (WEM), Web governance

Google data privacy breach leaves brand reputation in tatters


Street View privacy breach damaging to user confidence in the brand

by Jack Adams, SEO Consultant, Greenlight

Google has escaped a fine for collecting personal data, including email addresses and passwords, from public Wi-Fi networks across the UK.

Despite labelling the act a “significant breach” of the Data Protection Act that was “not fair or lawful,” the UK’s Information Commissioner’s Office (ICO) has simply requested that Google delete the offending data and given the search giant nine months to review its privacy practices.

The general consensus is that the ICO has been very lenient over this – its probe has been labelled “lily-livered” by Tory Member of Parliament (MP) Robert Halfon. But has Google really escaped lightly?

In direct monetary terms, it certainly seems Google has got off lightly, in the UK at least. Investigations in other nations are ongoing.

However, there can be no denying that some damage has been done to users’ confidence in the brand and its squeaky-clean image, built around the company’s ‘don’t be evil’ motto.

This damage has been compounded by the widespread national news coverage of the privacy breaches, especially with the matter being discussed in UK parliament.

Even in general conversation, one slightly less-than-tech-savvy individual expressed reservations about Google, asking whether the breach meant they should give up internet banking for fear of online fraud, demonstrating just how far this news has pervaded the general public. The question, though, is whether it has put that individual off searching with Google. Will they now switch to a competitor?


Leave a comment

Filed under Analysis, Analytics & Metrics, Business Intelligence (BI), Content Security, Customer Relations Management (CRM), Data Governance, Data privacy, Data protection, Scanning, Security Content Management (SCM)

Information Commissioner: Street View breaches data protection laws


Commissioner Graham finds Street View broke law

Government watchdog finds search engine giant guilty of unlawful Street View snooping in private data dredge

ECM Plus – The UK Information Commissioner has found that search engine behemoth Google must submit to an audit and sign an undertaking not to breach data protection laws again.

If the search engine company were to commit such an unlawful data breach in the UK again, it would ‘face enforcement action’, the ICO said in a statement.

Commissioner Christopher Graham said: “…there was a significant breach of the Data Protection Act when Google Street View cars collected payload data as part of their wi-fi mapping exercise in the UK.”

Commissioner Graham has instructed Google to sign an undertaking in which the company commits to take action to ensure that breaches of this kind cannot happen again.

Furthermore, in light of the breach of data protection, an audit of Google UK’s Data Protection practices will also be undertaken.

However, the Information Commissioner rejected calls for a financial penalty to be imposed on the search engine giant, but said that it was ‘well placed to take further regulatory action if the undertaking is not fully complied with.’

According to the ICO statement, international data protection authorities that undertook in-depth investigations into Google’s activities found fragments of personal data, including emails, complete URLs – and passwords.

ICO said that following the admission by Google that personal data had indeed been collected, and the fact that Google used the same technology in the UK, the Commissioner decided that formal action was necessary.

Commissioner Graham is also requiring Google to delete the payload data collected in the UK as soon as it is legally cleared to do so.

Information Commissioner, Christopher Graham, added: “It is my view that the collection of this information was not fair or lawful and constitutes a significant breach of the first principle of the Data Protection Act.”

Said Graham: “The most appropriate and proportionate regulatory action in these circumstances is to get written legal assurance from Google that this will not happen again – and to follow this up with an ICO audit.”

Leave a comment

Filed under Business Intelligence (BI), Compliance, Content Security, Data centres, Data Governance, Data privacy, Data protection, Data storage, Enterprise Content Management, Enterprise Search, Industry News, Information Governance, Information security, Intellectual Property (IP), Knowledge Management, Reporting, Rights Management, Scanning, Security Content Management (SCM)

Does application security pay?

Information security

Information security frameworks

Communicate the business value of application security solutions in a language that matters to the board

by Craig LeGrande & Amir Hartman, Mainstay Partners

The last decade has seen a dramatic shift in the way companies manage information security and protect vital data.

In the past, businesses confronted the threat of cyber attacks and data breaches primarily by building firewalls and other “perimeter defences” around their networks. But the threat has continued to evolve, and more criminals are now hacking into applications running on a plethora of new devices and environments, including cloud, mobile and social media. As a result, the focus of threat protection is moving from securing the infrastructure to securing the software applications that businesses write and deploy.

The shift has created a market for a new generation of products and services – known as software security assurance (SSA) solutions – that help companies uncover vulnerabilities in their code, fix these defects efficiently, and produce software that is far more resistant to security threats. In an effort to quantify the business value of SSA, Fortify Software (the leading provider of SSA solutions) commissioned Mainstay Partners to conduct in-depth interviews with 17 global customers that have implemented SSA, representing a cross-section of industries. The study found that companies are realising substantial benefits from SSA right out of the box, saving as much as $2.4M per year from a range of efficiency and productivity improvements, including faster, less costly code scanning and vulnerability remediation, and streamlined compliance and penetration testing.

Exponential increases in benefits, however, are being achieved by companies that deploy SSA in more comprehensive and innovative ways. These advanced deployments include embedding software security controls and best practices throughout the development lifecycle, extending SSA programs into critical customer-facing product areas, and leveraging SSA to seize unique value-generating opportunities. For these strategic companies, the benefits of software security solutions can add up to as much as $37M per year.

In our interconnected world, software is everywhere – not just in data centres or on desktop computers, but in mobile phones and all kinds of wireless devices and consumer products.

Software resides on the Web and in the cloud, where businesses rely on software-as-a-service (SaaS) solutions for mission-critical business functions. Application security protects the software running in all these environments and devices, and the business benefits of SSA extend to wherever applications are deployed.

At a time when IT budgets are coming under closer scrutiny, chief information security officers (CISOs) say they are being called upon to justify SSA investments from a cost-benefit perspective.

This article provides the evidence information security executives need to communicate the business value of software security solutions in a language the board can relate to.

Faster vulnerability remediation: Across the board, companies adopting SSA solutions report significant efficiency improvements in finding and remediating software security flaws:

- By introducing automated SSA technology and best practices, organisations reduced average remediation time from 1–2 weeks to 1–2 hours.

- Organisations saved an estimated $44K annually in remediation costs per application.

- For the average organisation, these cost savings are conservatively estimated at $3M per year.

Streamline compliance and penetration testing: Companies are facing tighter government and industry regulations for application security, particularly new software standards in the financial services and healthcare industries. By configuring the SSA solution to address specific compliance mandates, organisations quickly identified and ranked vulnerabilities according to severity. The solution also generates a report documenting these activities, creating an audit trail for regulators:

- The average organisation adopting SSA saw its fees paid to compliance auditors fall by 89% – about $15K annually.

- The average organisation achieved a 50% reduction in penetration testing efforts, translating into annual savings of more than $250K.

Avoid data breaches: The threat of a major data breach can keep CISOs awake at night, and most are aware of the history of high-profile security failures that have damaged company reputations and resulted in millions of dollars in legal and PR fees, remediation expenses, lost revenue and customer churn:

- The average cost of a data breach is about $3.8M, or $204 per compromised record.

- Companies can save an estimated $380K per year by adopting SSA solutions to avoid major data breaches.
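These headline figures follow directly from the assumptions stated in the article’s footnotes; a quick back-of-the-envelope check (all inputs are the article’s own estimates, not independently measured data):

```python
# Reproducing the study's headline savings from its footnoted assumptions.

REMEDIATION_SAVING_PER_APP = 44_000   # $44K saved per application
CRITICAL_APPS = 67                    # footnote: 67 critical applications

PEN_TESTS_PER_YEAR = 8                # footnote: eight tests per year
COST_PER_PEN_TEST = 67_000            # $67K per test
PEN_TEST_REDUCTION = 0.5              # 50% reduction in effort

BREACH_COST = 3_800_000               # average cost of a data breach
BREACH_INTERVAL_YEARS = 10            # footnote: one major breach per decade

remediation_savings = REMEDIATION_SAVING_PER_APP * CRITICAL_APPS
pen_test_savings = PEN_TESTS_PER_YEAR * COST_PER_PEN_TEST * PEN_TEST_REDUCTION
breach_savings = BREACH_COST / BREACH_INTERVAL_YEARS

print(f"Remediation: ${remediation_savings:,.0f}/yr")   # ~$3M claimed
print(f"Pen testing: ${pen_test_savings:,.0f}/yr")      # "more than $250K"
print(f"Breach avoidance: ${breach_savings:,.0f}/yr")   # $380K claimed
```

The arithmetic checks out: $44K × 67 applications ≈ $2.95M, 8 tests × $67K × 50% = $268K, and $3.8M amortised over ten years is $380K.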

Avoid software compliance penalties: Businesses that fail to comply with industry standards for software security can face substantial penalties. In the payment card industry, for example, penalties can range from $5K to $25K per month. Moreover, when lost sales, customer churn and remediation expenses are also factored in, the full cost of PCI non-compliance can be substantially more:

- By ensuring compliance through systematic application security testing, companies can conservatively avoid approximately $100K in penalties annually.

Pay-for-performance benefits: In an innovative use of software security technology, companies that outsource software development to partners are leveraging SSA solutions to drive cost-effective “pay for performance” programs:

- Companies using SSA to screen and adjust the price of outsourced code can capture fee savings of about $100K annually while improving the overall quality of code delivered by development partners.

Faster product launches boost revenue and margins: For companies that sell e-commerce and other commercial software, discovering security flaws late in the development life cycle can delay new product introductions (NPI) by weeks or months, putting revenue and market share at risk and adding millions of dollars in development costs:

- Companies can capture an estimated $8.3M of additional software revenue through a comprehensive SSA program that minimises product delays.

- Companies can realise development cost savings of about $15M per year from SSA-driven reductions in product delays.

Maximise the value of M&A deals: Companies can extend the value of their software security solution by deploying it in strategic ways, for example by performing software security audits of acquisition targets whose core products depend critically on software:

- In the case of a company completing two $100M deals a year, using SSA to assess the software assets of prospective acquisitions can yield valuation benefits of approximately $10M.

Realising the full potential of SSA

For companies able to exploit all of the opportunities for value creation, that potential can reach $37M annually. There are three stages that organisations typically go through on the path to SSA maturity:

Explore: These organisations deploy an SSA solution across a small number of applications (10–20) and developer teams as a proof-of-concept initiative.

Accelerate: These organisations are moving beyond “toe-in-the-water” pilot programs and are actively incorporating threat detection and remediation techniques across key development teams and applications.

Optimise: These organisations have embedded software security tools, processes, and training within a formal SDLC program. Many are also leveraging SSA solutions in innovative ways to generate additional business value and create competitive differentiation.

As this article has demonstrated, SSA solutions not only help companies minimise the risk of a successful cyber attack, but also offer substantial efficiency and productivity benefits that help control costs, speed software development cycles and, in some cases, even boost revenue and asset values.

BOX OUT A: Key findings

- The full benefit potential of SSA solutions can reach $37M annually.

- Initial SSA deployments can create $2.4M in annual benefits.

- Average vulnerability remediation time fell from 1–2 weeks to 1–2 hours.

- Repeat vulnerabilities were reduced from 80% to virtually zero.

- Organisations saved an estimated $44K in remediation costs per application.

- Companies reducing time-to-market delays saved an estimated $8.3M annually.

BOX OUT B: What should organisations look for in an SSA solution?

Not all vendors offer the same functionality and services. When evaluating the options, organisations should look for an SSA value-maximising solution that:

Offers both deep remediation functionality and a breadth of supporting services

Provides support for cross-team collaboration – bringing information security teams, developers, risk officers, and auditors together in a coordinated effort

Seamlessly integrates with existing application life-cycle management (ALM) and development environments, shortening time to remediation

Provides in-depth guidance on how to correct each security vulnerability, thus accelerating remediation further

Offers robust governance capabilities, including the ability to define and communicate security policies and rules across the organisation

Provides research on the latest threat trends and techniques, ensuring that teams are aware of all emerging threats

The reduction in remediation time is due to several factors, including SSA capabilities and practices that (1) pinpoint the exact location of a flaw in the code, (2) prioritise vulnerabilities to focus resources on the most critical flaws, and (3) provide guidance on how to correct each vulnerability. The estimate is based on a conservative 10 vulnerabilities per application and 67 critical applications.

Mandates and standards commonly impacting application development projects include: the Payment Card Industry Data Security Standards (PCI DSS), the Federal Information Security Management Act (FISMA), the Sarbanes-Oxley Act (SOX), the Health Insurance Portability and Accountability Act (HIPAA), and North American Electric Reliability Corporation (NERC) standards.

Assumes 50% reduction in penetration testing effort; legacy environment costs are based on an average of eight penetration tests per year at $67K per test.
(See “Top 10 Data Breaches and Blunders of 2009,” eSecurity Planet: http://www.esecurityplanet.com/views/article.php/3863556/Top-Ten-Data-Breaches-and-Blunders-of-2009.htm)

Fourth Annual U.S. Cost of Data Breach Study, Ponemon Institute, 2009. Assumes that the average company would experience a major data breach once every 10 years.

Assumes that an average penalty period would last six months. Research indicates that penalties make up only 30% of the full impact of non-compliance (“Industry View: Calculating the True Cost of PCI Non-Compliance,” Ellen Lebenson, CSO Online). Assumes a non-compliance period lasting six months; average penalty periods range from 3 to 24 months.

Assumes average fee discounts of 1% applied to annual outsourced development expenditures of $10M.

Estimate assumes a $20B company earning 1.25% of its profit per quarter from new product sales; 50% of product introductions are assumed to benefit from SSA efficiencies, which help avoid an average of four critical vulnerabilities per product and 30 days of delays.

Estimate assumes a $20B company incurring new product development costs equal to 3% of revenue; 50% of new products, or $300M in expenses, are assumed to be impacted by SSA efficiencies, which help avoid an average of four critical vulnerabilities per product and 30 days of delays; the resulting 5% productivity increase saves $15M in development expenses.

Sample customer assumptions include: $20B customer, 10% new product revenue contribution; 50% first year margins; two-month product delay due to vulnerabilities; 500 critical/severe vulnerabilities; $3.8M cost per breach @ 10% probability; $200M in M&A @ 5% valuation benefits.


Leave a comment

Filed under Content Security, Features, Information security, Security Content Management (SCM)

Mobile cloud gets Wyse

Thin Client

Image by ryan2point0 via Flickr

Wyse launches new mobile thin client for secure ubiquitous access to virtual desktops

ECM Plus – Wyse has launched the X50c mobile thin client with PCoIP and a new software system for roaming access to virtual desktops.

According to Wyse, the X50c offers a secure lightweight device that can be supported remotely.

Wyse said that as organisations seek to support more flexible working with virtualised environments across campuses, they want to deploy clients without the overhead of securing, managing and updating each device individually.

Wyse’s X50c mobile client is based on Wyse-enhanced SUSE Linux and is designed for secure mobile computing. According to the firm, the device is also optimised for use with leading virtualization solutions, including VMware View 4.5 with the PCoIP display protocol.

The Wyse X50c requires no hands-on management and automatically updates and configures itself based on network settings. Weighing in at 3lbs, the X50c is built for high-quality mobile use with a powerful 1.33GHz Intel Atom processor, 11.6 inch LED backlit screen, and up to eight hours of battery power.

Leave a comment

Filed under CCM (Cloud Content Management), Cloud Computing, Desktop Virtualization, Security Content Management (SCM), Vendor News, Virtualization