Features

Top tips for entrepreneurs

Earlier this year, BCS, The Chartered Institute for IT launched BCS Entrepreneurs – a new specialist group to support entrepreneurs and start-up businesses.

Anil Hansjee, MBCS CITP, Chair of BCS Entrepreneurs, says: “Our aim is to build a community of entrepreneurs and help them turn their ideas into commercial reality by providing support and guidance from those who have already succeeded in this area.”

Anil previously worked at Google, where he ran mergers and acquisitions. He has also worked as a venture capitalist in the IT entrepreneurial scene, in investment banking and, more recently, as an angel investor in the start-up scene.

He has the following top tips for entrepreneurs:

Have the right team

– Know what you are (and are not) good at.

– Know who you want to plug the gaps – whether it’s new team members or advisors.

– Have a short list, follow them, court them, know what it will take to get them.

– Demonstrate why you are the right team – for example, focus on facts such as how long you have known each other or worked together and how you complement each other.

Know your product

– Design and usability are crucial in consumer products. Experiment and fine-tune – understand what product changes make what difference to metrics.

– Use ROI based online marketing techniques and demonstrate what works – or not – and why. Show what you think you can do with money to improve product/expand traction.

– Know your KPIs and why they are important: demonstrate metrics, analytics and improvements along a path and know where more money will take you.

Remember, it’s other people’s money you are asking for

– Understand how you have used your money so far: where it’s been well spent and where it’s been wasted.

– Think about your future needs in a detailed cash flow usage over the next two years.

– Where possible, map out the path to steady monthly sources of incoming cash flow as a baseline.

– Use benchmarks to show your figures are in line with the market.

Understand your potential

– Don’t waste too much time predicting revenues out of very little context or evidence, but do have a sense of market opportunity/size.

– Think about business models that investors know and understand from elsewhere and can apply to you logically. Talk passionately about the big-picture vision and the roadmap to that vision. Know which key products you will need to launch and which key metrics you will need to aspire to over that period.

Get the business model right

– Explain how you build win-win relationships with customers and/or consumers to create protective hurdles in your business models – and dependencies of your customers on you.

– Understand the competitive dynamics – both the stories similar to yours and those different from it. Know what you are changing and what you are leveraging from others (if it already works well).

– Is this disruption or incremental innovation? Plan your success criteria accordingly, i.e. do not underestimate the time it takes to change certain habits or established ecosystems.

– Know why you are giving something up for free (if you are) – what orthogonal business model or purpose you have.

Find the best funding for you

– Be selective about who you approach for funding – make sure it’s the right fit in terms of culture, experience, success and personal dynamic – after all, this is a long term relationship. Focus on the individual, not the firm initially.

– Be bold in approaching VCs – network, go to events, ask for personal intros. Demonstrate you are a good entrepreneur in getting that meeting.

– If you are raising from institutional investors especially, ensure you are aligned on the long term. They will want to deploy a certain amount of capital with you over time and will have certain expectations of multiples of returns. It won’t work if you want to build a lifestyle business or a non-scalable business, or don’t share their ambitions and intentions on timescale and business size.

– Practice the elevator pitch – time is of the essence to capture the interest for a real meeting. Listen to other startups pitching at events so you can pick up constructive learning on what to improve yourselves.


Cloud – how a code of practice is good for your business

Andy Burton, Chairman, CIF

The cloud is at once touted by the industry as the greatest revolution in the delivery of IT services for a generation and regarded as an obscure concept by many prospective adopters.

Whilst the former is not surprising, the latter is due to the lack of understanding about how to scope cloud services and how to integrate them within the wider IT strategy.

This uncertainty can be compounded by a lack of knowledge of who to trust in balancing an on-premise capability with an online one, and has therefore made some IT managers and business leaders reluctant to invest in cloud services.

A credible and certifiable Code of Practice that provides transparency about cloud service providers and their capabilities, with clear guidelines on what is important and why, is one sure step towards advancing adoption.

Cloud service providers need to provide information about their business and operations in a standardised format, cutting through pure marketing messages to the core of what services they offer and how. Providing answers to essential questions in a common form enables end-users to make rational and informed decisions on how to progress with specific vendors. As such, a Code can give consumers clarity and confidence in their choice of provider.

Because businesses are uncertain how to embark upon a strategy that includes cloud computing, it is important to understand specifically what it is by definition and how it can benefit both businesses and end-users at a practical level. Cloud computing at its most basic level enables someone to access computing power and applications ‘online’ via the internet on demand.

To help cut costs for businesses, it is typically offered on a pay-as-you-use or subscription model with no capital costs to participate. Operating independently of specific hardware, it also provides resources and services to store data and run applications on any device, anytime, anywhere, as a service.

A Code can help end-users select the best practices and the service providers that are most suitable to their business. It takes into account three key points – transparency, capability and accountability – to accurately define the services offered, standards of operation and security.

As it stands, cloud computing is so new and driven by specific vendor messaging that it lacks transparency, and for some that leads to a lack of credibility. A Code can highlight information that’s vital to making an informed business decision, such as stating the vendor’s real legal entity (behind the web presence), where their data centre operations are based, if they are owned by another company, what their operational practices are etc.
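
To make the idea concrete, here is a hedged illustration (in Python) of what such a standardised disclosure could look like in machine-readable form; the field names and values are invented for the example and are not taken from any actual Code of Practice.

    # Hypothetical, illustrative structure for a provider disclosure under a Code of Practice.
    provider_disclosure = {
        "trading_name": "ExampleCloud Ltd",          # the legal entity behind the web presence
        "parent_company": "Example Holdings plc",    # ownership, if any
        "data_centre_locations": ["UK", "Ireland"],  # where customer data is actually held
        "operational_practices": {
            "certifications": ["ISO 27001"],         # illustrative only
            "support_hours": "24/7",
            "sla_published": True,
        },
    }

Because every provider answers the same questions in the same structure, prospective customers can compare declarations side by side rather than wading through marketing copy.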

In terms of capability, organisations complying with a Code of Practice should have documented management systems, processes and resources in order to deliver services consistently for their customers 24/7 and enable service level information to be accessed by them.

Accountability involves educating the customer on the legitimacy of organisations. Service providers should be accountable for their operational practices and public website declarations and, in particular, should live up to any public claims that they make about their service on their websites or in promotional materials.

A Code of Practice is necessary to engender the trust required for businesses and cloud service providers to collaborate on the delivery of an IT strategy. If cloud service providers follow the requirements within a Code of Practice and make available the information needed to make an informed decision, they are able to place a certification mark on their websites that end-users will recognise as a public statement of their operational and ethical intent. What is not in doubt is that what we call cloud services will continue to grow in capability and adoption; what is not so clear is the pace at which that transformation will arrive.

Andy Burton is Chairman of the Cloud Industry Forum

 


FEATURE: Moving to the cloud – a decision maker’s guide

Keith Bates

So, you’ve made the decision to take your IT infrastructure to the Cloud, but what kind of Cloud services do you require? ‘Public’ and ‘private’ clouds are terms used frequently by those considering Cloud services, but what makes a cloud public or private, what are the advantages of each, and when is each most appropriate? Keith Bates, Chairman of Cloud Computing Centre, offers suggestions on where to start.

Understanding the Cloud

The first issue to address and clarify is the difference between the public and the private cloud. Under a public cloud model, a company or individual will subscribe to a service – a piece of software such as Salesforce.com, email such as Google mail or online document backup – and typically has no knowledge of the underlying technology, from operating system to database. Companies have no idea where key data is being stored – making the public model impractical, indeed impossible, for any organisation holding UK government data, which must be stored within UK borders.

Many of these public cloud services are increasingly provided by large organisations, such as Amazon and Google, to exploit their own spare capacity. However, these companies are not offering any Service Level Agreements (SLAs) to guarantee performance. And whilst performance is typically satisfactory, during peak times, such as the weeks before Christmas, users can experience a significant drop in response – with no recourse available to them.

A private cloud, in contrast, offers companies the chance to specify every last detail of the infrastructure supporting and providing the application or service, from the make and model of the hardware, to network management tools and firewalls. The infrastructure is not shared with any third parties and the cloud provider will offer an SLA with clearly defined financial penalties for any breach in performance.

Typically hosted in a highly secure, Tier 3 or Tier 4 data centre environment within the UK, a private cloud means organisations know where the data is, who is managing it and who has access. This model is obviously more expensive than the public cloud but offers the level of performance and support required for high-volume transactional processing systems, such as finance and ERP, which require guaranteed processing power.

A less expensive version of the private cloud can be achieved by sharing the resources with other organisations; indeed, the vast majority of private cloud solutions are delivered in this way. Where the minutiae of the underlying infrastructure aren’t a concern, but the provision of an SLA is, this offers the chance to leverage economies of scale and share a robust, secure infrastructure with a defined set of customers.

Flexible Model

In addition, a growing number of software vendors – such as finance, CRM and payroll providers – are opting for private cloud solutions and creating a public cloud model for their customers. This provides organisations with access to business-critical applications, and an SLA, but without the need to define the infrastructure. Alternatively, the same software can be purchased and hosted in a dedicated private cloud if that meets the specific business need.

Furthermore, growing numbers of organisations are embracing the hybrid cloud model – opting for the public cloud for applications such as email and using a private cloud provider for business critical applications. The key to making this hybrid approach work is to ensure the private cloud supplier is able to coordinate the process, indeed offering a single contract for the entire cloud solution, minimising the management overhead for the organisation.

Building the Right Business Case

The shift to the cloud offers companies an unprecedented opportunity to rationalise skills: to decide which are still required internally and which would be better, and more cost-effectively, delivered by a third-party cloud provider.

For most SMEs the chance to reduce the internal IT headcount is a compelling reason for moving to the cloud. But what are the pitfalls? The key issue is to ensure the SLA matches the needs of the business, from the criticality of the application, to the number of users and projected volumes. A number of websites now offer the chance to buy private cloud services from a mix and match menu. But this is only really suitable for an IT Director that is both highly experienced and highly confident in cloud computing. Get it wrong, and the business will end up with a badly performing set of key applications or jeopardised corporate security.

Most cloud providers will offer a set choice of firewalls. But can the organisation be totally confident that the choice will meet key business requirements, such as ensuring that customer credit card details are kept secure or that the business is not in breach of the Data Protection Act or the Payment Card Industry Data Security Standard (PCI DSS)?

Or is it really viable to run very resource-intensive applications, such as video rendering, in the cloud? For an organisation located in a city centre with gigabit-class connectivity, there will be no problem. For a rural-based company with poor connectivity, the results would be less than impressive.

If any organisation is new to the cloud – as most will be – it is therefore important to seek out the right advice and sit down with a provider to discuss the issues from performance to security.

Forewarned is forearmed

Having made the decision to go to the cloud, making the right private-versus-public decision is critical. The ‘one size fits all’ public model is constrained by lower service parameters and there is little or no option for tailoring the service – though this will be fine for some companies and applications. Indeed, an organisation that needs nothing more complex or resource-intensive than a web server, email and Microsoft Office can achieve nearly zero-cost IT operations via the public cloud.

For the vast majority of companies, however, there are a number of business-critical applications that require consistent, guaranteed performance. Whether the company needs to opt for a totally dedicated, personally-designed infrastructure or is able to share resources, is then the key consideration, as is whether to embrace a hybrid cloud model.

And while the cost difference is significant – a private cloud web server costs around five times as much per month as a public cloud one – the savings are still considerable. Simply replacing internal IT experts will typically pay for the shift to the cloud within the first year. Add in the flexibility of adding new applications and services within days rather than months, and the ability to upsize and downsize to reflect business needs, and the cloud model is compelling.
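
The payback claim is easy to sanity-check. The sketch below (in Python) uses invented figures for the public cloud cost and the internal IT salary purely to show the shape of the calculation, with the roughly five-times premium taken from the comparison above.

    # Illustrative payback calculation for a private cloud web server.
    # All figures are assumptions chosen only to demonstrate the arithmetic.
    public_cloud_monthly = 200.0                      # assumed public cloud web server cost per month
    private_cloud_monthly = public_cloud_monthly * 5  # the roughly five-times premium cited above
    internal_it_salary = 35_000.0                     # assumed annual cost of one internal IT role

    annual_private_cloud_cost = private_cloud_monthly * 12
    annual_saving = internal_it_salary - annual_private_cloud_cost

    print(f"Private cloud cost per year: £{annual_private_cloud_cost:,.0f}")
    print(f"Net saving after replacing one internal role: £{annual_saving:,.0f}")

On these assumed numbers the shift pays for itself comfortably within the first year, which is the point the comparison is making.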

The move to the cloud can and should deliver significant cost savings and provide access to more up-to-date, better-performing infrastructure. But there is significant risk attached to this fundamental architecture shift, and organisations need to mitigate that risk by improving understanding and obtaining the right advice up front.


FEATURE: The changing landscape of eDiscovery and ESI

Whether it’s search engines unlawfully collecting private passwords from home wi-fi networks or government agencies leaving laptops on trains festooned with the private information of taxpayers, the risks associated with compromised data are rising fast. Bob Tennant, CEO at Recommind, discusses the changing landscape of eDisclosure and considers the issues surrounding regulatory compliance.

It’s no secret that eDisclosure is still not fully understood by many UK firms and is largely thought to be an American problem. However, the requirement to produce electronically stored information (ESI), whether in response to eDisclosure requests, internal investigations or government scrutiny, is undoubtedly a global issue.

Furthermore, the risks and costs involved are rising exponentially as ESI reserves continue to explode in volume. When a Europe-based company like Alcatel-Lucent is required to pay a $137 million penalty to US regulators to settle bribery charges brought under US law, all companies need to assess their information risk profile.

In the UK, there has been a significant backlash against the previously relaxed regulatory environment over the past few years, most recently marked by an increase in fines handed out by the Information Commissioner’s Office (ICO). The increase in the number of regulatory inquiries has meant that some businesses are beginning to take better care of their data. This includes monitoring and testing their internal controls so that they can deal with any disclosure demands efficiently and cost-effectively.

As eDisclosure has clearly now become a global issue, more companies need to ensure they are better prepared. It is no longer acceptable for IT directors to think of eDisclosure as optional – it should be fundamental to a company’s entire information management, compliance and risk mitigation programmes. The risk of damage to a business from compliance lapses and failure to meet disclosure demands is on a similar scale to, if not greater than, that of IT security challenges such as data privacy breaches. An oversight could have severe repercussions and leave businesses highly vulnerable to the consequences associated with information risk – including large fines, reputational damage, and loss of stakeholder and customer/client confidence – all of which have the potential to quickly cripple a company.

To combat this, companies need to take a proactive approach. If they don’t, responding to an investigation will be an expensive and extremely time-consuming endeavour due to the sheer volume of ESI that needs to be identified, collected, reviewed and analysed. To put this in context, regulatory inquiries often result in the production of more than one terabyte of data (the equivalent of 75 million pages) and in huge costs involved in reviewing that data – made significantly more expensive if the responding company does not have an eDisclosure infrastructure already in place.

One major challenge, and possibly a reason that companies continue to downplay the importance of eDisclosure, is that actually taking control of all data, and making it searchable and discoverable, can be a huge undertaking. There is also a natural human tendency to underestimate risk, and as a result not to adequately assess the costs of not preparing for disclosure – as a major automaker recently found out. In reality, however, the initial outlay will ultimately save significant money by removing the need to rely on external third parties when a disclosure demand is made. The costs associated with setting up a proactive response model instead of a reactive one will typically pay for themselves within the first three to six months.

In order to successfully manage ESI, companies should invest in solutions that can automatically categorise, index, access, preserve, delete and collect relevant information in any form. Since these challenges are similar to those presented by email and knowledge management, the use of sophisticated search technologies – especially when incorporated into an application that directly addresses chain of custody, repeatability and other defensibility concerns – can achieve similar results in risk management. For example, when an investigation commences, concept search and classification greatly increase review efficiency, locating all information related to an issue rather than relying on inefficient keyword searches that would either miss relevant data or bring up a sea of clearly irrelevant documents containing a particular search term. And since 70 per cent of eDiscovery costs are in the review, any improvement in efficiency translates directly into cost savings.
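
Because review dominates the cost, even modest gains in review efficiency translate into large absolute savings. The sketch below (in Python) shows the arithmetic; the total matter cost and the efficiency gain are assumptions for illustration, while the 70 per cent review share comes from the figure above.

    # If roughly 70% of eDiscovery cost sits in review, efficiency gains there dominate the savings.
    total_matter_cost = 1_000_000.0   # assumed total cost of responding to one request
    review_share = 0.70               # share of cost attributable to document review (figure above)
    review_efficiency_gain = 0.30     # assumed reduction in review effort from concept search

    saving = total_matter_cost * review_share * review_efficiency_gain
    print(f"Saving on a £{total_matter_cost:,.0f} matter: £{saving:,.0f}")  # £210,000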

The effective management of this wealth of ESI will not only automate the eDisclosure process, but will also allow companies to improve productivity and efficiency by providing staff with access to all the information they need for their daily jobs. This is particularly key in the current financial climate, where fewer staff are being stretched to fulfil more roles. Despite this, most UK organisations are using out-dated, legacy search and data management tools which do not meet the sophisticated information needs of their staff and are not capable of searching data in different formats and from diverse locations.

The right technology is key, but it’s not the whole story – the legal and IT departments must also work together to optimise their systems. While the IT team understands the technology issues, there’s a danger that they won’t fully comprehend which information should be preserved and disclosed, and which can be discarded, and they will require direction from legal and the business units. Equally, the legal department are experts in their domain, but need IT to help ensure all processes and systems are up to the job, without causing major upheaval to users.

Too often, the corporate legal and IT teams work at cross-purposes, leaving businesses at risk when hit with a disclosure demand. According to research conducted in May 2010, while 91% of CIOs believe that data breaches and fraud are the biggest risks associated with corporate information, almost a quarter of businesses do not have an information risk strategy in place.

In order to overcome this, we are likely to see a new breed of ‘information security’ risk professional emerge. A hybrid manager of litigation support and information management, who can also act as a technical expert, is ultimately required to oversee eDisclosure within large firms. More collaboration between IT and legal will help to reduce the risk of regulatory fines, and ideally organisations should create cross-functional teams with representatives from both departments to deal with eDisclosure, information risk and compliance as a whole.

In 2011 companies need to provision for the new regulatory environment, which is proving to be intolerant of those organisations that cannot effectively identify data that is requested and produce it promptly. The explosion of ESI volumes, types and sources has made it more difficult to effectively control and protect information, making it harder for businesses to deal with eDisclosure requests. By incorporating sophisticated search and eDisclosure technologies into a company’s IT security strategy, businesses can effectively avoid the costly and severe repercussions of unmanageable eDisclosure and regulatory requests when they hit.


Information governance gone ga-ga?

Information governance through document management: a guide to safeguarding information, reputation and corporate productivity with UK Government guidance in mind

by Adrian Butcher, Strategic ECM Consultant, Open Text

Most business managers will be aware of the recent data losses that have affected the public sector.

However, as far as is known, the main damage done was reputational. But this is just scratching the surface of the potential damage, as the risk of identity theft and fraud still hangs over millions of affected citizens.

Therefore, it is opportune to examine how information governance can be effectively established within any organisation – public or private – with help from Document Management (DM) systems and cognisant of the UK Government’s suggested procedures, following its well-publicised problems.

Those familiar with information management will also regard this as relevant to commercial organisations and not just those contracting with the public sector. Quite apart from statutory concerns (Data Protection), the loss of sensitive information has proved troubling for private industry in many sectors. The latest statistics show that 227 losses were referred to the Information Commissioner in the past year, of which 176 relate to the public sector – so between a fifth and a quarter of losses reported to the Commissioner are in the private sector.

Now, however, a document has emerged from UK central government that details the required responses and sets out proactive, mandatory moves to ensure that data losses don’t occur. This is the Cabinet Office document entitled ‘Data Handling Procedures in Government’, commissioned by the [former] UK Prime Minister from Sir Gus O’Donnell, [former] Cabinet Secretary and Head of the Home Civil Service.

However, a key question for IT managers and their clients is: can DM provide a valuable response to this report without compromising efforts relating to the well-known ‘Varney’¹ and ‘Gershon’² agendas? Fundamentally, managing information more effectively is good business as well as good governance. In providing cost-effective support for information governance, DM can also provide a platform for improved management – and manageability – of both information and the processes it supports.

Many data leakages tend to occur at a junior level, but fingers point to the top. Crucially, the greater near-term impact is most likely not the criminal misuse of information but the doubts cast over how such information is managed. The loss of trust may have greater managerial and political impact than the loss of the information itself. What the O’Donnell paper refers to as ‘Data Handling Procedures in Government’ might perhaps also be described as information governance.

Key messages from the paper: personal information holding – major benefits, significant risks

It is evident that, across the UK’s mixed economy, there are benefits to the individual citizen or consumer in significant amounts of information about him or her being held by organisations of all kinds. It is no less evident that disadvantages can arise if the information is not handled securely and ethically. Carelessness, recklessness and sheer malice are all forces to be contended with, and if data and information are to be shared, confidence must be maintained.

With this in mind the O’Donnell report clearly sets out guidelines that aim to protect against carelessness, recklessness and malice. These include the need for a culture that values information and recognises custodianship responsibilities, as well as clear identification and communication of responsibilities, which are backed up by ‘accountability mechanisms’. Common standards and procedures also need to be put in place with real transparency as to how information is handled.

Furthermore, scrutiny and visibility of performance are crucial, and the report goes so far as to set out specific obligations, including annual reporting on performance as part of the Statement on Internal Control and, for the larger systems, penetration testing by external bodies.

As for transparency, the report calls for ‘information charters’ and greater publication of information on particular information assets and their use. This recognises on paper what information management product and service providers have long asserted, that information in and of itself is an asset to be managed, exploited and protected.

Personal responsibilities: ‘no hiding place’

The report provides ‘no hiding place’ in terms of organisational versus personal accountability, reminding us that organisations must nominate individuals to pre-determined roles with responsibilities that are centrally defined at a high level.

In this way, responsibility for information handling is pointed at named individuals, matrixed from business process and systems perspectives and empowered with board-level ownership. DM can enable and assure corporate best practice.

Setting out required behaviours

This can be seen if we first look at the triumvirate of culture, policy and procedural guidance. Whilst achieving culture change can be a notoriously ‘slippery’ objective, setting clear policies, underpinning them with procedures and then assuring compliance are typically the ways it is brought about.

We need failsafe mechanisms that prevent non-compliance, whether accidental, reckless or even malign. We require assured governance over the governance itself: rules that clearly set out who owns and can change guidance, backed up by audited compliance and process.

Infrastructures to support and enforce required behaviours

The vast majority of employees are both competent and committed to doing a good job with professional integrity. DM can help them in two fundamental ways: firstly, by ensuring everyone knows the rights and wrongs of information management; and secondly, by providing structures and mechanisms that make compliance easier or even, to a large degree, automated. This can be achieved by making compliance an integral part of business process.

The DM system can ensure that there is a comprehensive body of policy and systematically related procedural guidance to cover all relevant situations in the working environment.

There needs to be a very clear management regime to ensure that all who should receive the guidance do – and are recorded as having done so – and there may well be a need for individuals to ‘self test’ their understanding of such guidance.

Of course, on its own, the above will not prevent incompetence or malice, but it does at least allow management to communicate and does not leave any employee bereft of the guidance he or she is entitled to receive.

Structures and mechanisms for easier – or partially automated – compliance

DM has much to contribute to structures and mechanisms for easier or partially automated compliance. The first key benefit is that, with information under electronic stewardship, the DM system can ensure that only authorised people can find, read, change, delete or distribute information.

Secondly, any and every event regarding particular information – from viewing to downloading, changing or deleting – is audited and can be reported upon if required. Thirdly, where that information forms part of a clear end-to-end process, a DM system can ensure that it remains within the process. Finally, process officers need only see the information required by their individual role within the process, if need be.
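
As a minimal sketch of the second point, the record below (in Python) shows the kind of per-event audit entry a DM system might keep; the field names are illustrative rather than those of any particular product.

    # Illustrative audit-trail record for a document event in a DM system.
    from datetime import datetime, timezone

    def audit_event(user_id: str, document_id: str, action: str) -> dict:
        """Build one audit record; a real DM system would persist this immutably."""
        return {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user_id,
            "document": document_id,
            "action": action,   # e.g. "view", "download", "change", "delete"
        }

    log = [audit_event("jbloggs", "DOC-0042", "view"),
           audit_event("jbloggs", "DOC-0042", "download")]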

Protecting against error, frustrating malice

Some of the most public and embarrassing information-loss events derive from mislaid, lost or stolen laptops. With modern DM, even this can be prevented: a document or other piece of information stored on a laptop – even one whose password has been circumvented – can be rendered unusable until the user has authenticated him or herself with the central DM system.
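
One common way to achieve this is to keep documents encrypted on the laptop and release the key only after the user authenticates against the central system. The sketch below (Python, using the third-party cryptography package) illustrates the idea; the authentication step is a hypothetical stub, not the behaviour of any specific DM product.

    # Sketch: a locally stored document stays unusable until the central DM system
    # releases the decryption key after successful authentication.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()          # in practice held by the central DM system, not the laptop
    cipher = Fernet(key)
    encrypted_doc = cipher.encrypt(b"Sensitive report contents")  # what actually sits on the laptop

    def authenticate_with_dm_system(username: str, password: str):
        """Hypothetical stand-in for the DM system's key-release step."""
        return key if (username, password) == ("alice", "correct-horse") else None

    released_key = authenticate_with_dm_system("alice", "correct-horse")
    if released_key:
        print(Fernet(released_key).decrypt(encrypted_doc).decode())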

Conclusion

DM systems can help management prove, to an ever more demanding outside world, that it has not only formulated the right guidance, but also put in place powerful and realistic mechanisms to ensure compliance and guard against complacency, simple error and outright malice.

Perhaps not a panacea then, but DM systems offer a genuinely ‘do-able’ response to a real, inescapable and ever more demanding obligation.


FEATURE: Is open source putting businesses at risk?

Businesses today are built and operated by software that houses intellectual property, business processes and trade secrets that are vital to the health of an enterprise. Organisations must address potential weaknesses in their everyday operations before they become exploitable, according to Richard Kirk, European Director of Fortify Software.

It’s the ultimate irony: The versatile software you depend on to run your business also puts it at risk. Your business applications hold the business processes and the data that form the lifeblood of your company. Yet, even as they open your business up to more customers and partners, the security holes your software contains leave you vulnerable to attack. Relentless and destructive data predators are ready to pounce.

Today’s hackers, organised crime cartels and enemy nations are highly adept at quickly turning security flaws into stolen data and cash. I’m not in the habit of finger-pointing over flaws in packages – let’s face it, we all know that application bugs exist; the only real question is why.

Open source development introduces risk to your business in unique ways. The inexpensive and readily available nature of open source makes it easy to adopt. But at what cost to enterprise security?

A Fortify-sponsored Open Source Security Study, published in July and completed by leading application security consultant Larry Suto, examined 11 of the most common Java open source packages. It confirmed that the most widely used open source software packages for the enterprise are exposing users to significant and unnecessary business risk.

The study validates that Open Source Software (OSS) development communities have yet to adopt a secure development process and often leave dangerous vulnerabilities unaddressed. Additionally, it found that nearly all OSS communities fail to provide users access to security expertise to help remediate these vulnerabilities and security risks. The study sparked debate on a number of topics related to OSS that anyone in IT or enterprise security should understand. The response to the report set off some familiar refrains, which miss the point and don’t get us any closer towards the goal of a secure enterprise.

What’s More Secure: Open Source or Proprietary?

Improving the engineering process of building secure code applies to every software project, whether it is open source or ‘closed’, and it’s not important who writes the code, but how. Any competent engineering team will be able to generate secure code if security is made a part of the design, just as they are able to bring a low cost solution to market, or a high performance solution.

From Fortify’s vantage point, we see literally thousands of development teams – mostly, it’s fair to say, within IT organisations in financial and other highly regulated industries – that have a very sophisticated process in place for application security. In the current open source model, yes, those of us who care about security can feed vulnerabilities back into the machine, or fix them ourselves, but that will never be as sound as building software securely in the first place.

Security and Quality are the Same

Recently, the icon of OSS development, Linus Torvalds emailed the Linux development team, weighing in on the quality vs. security debate. In his email, Torvalds argued that “In fact, all the boring normal bugs are WAY more important, just because there’s a lot more of them.” Although it is true to say that quality and security are both important, I strongly disagree with Torvalds for several reasons:

– Quality is cumulative, whilst Security is absolute.

– Quality is about making the main path of operation work and accepting issues in the corner cases, whereas Security must cover everything.

– Quality is a closed problem, whereas Security is open: there is no list of known bugs you can consult and accept, because criminals compile those lists and do not share them. Quality, by contrast, is reinforced by customers in the open market.

What is The Path Out Of This Forest? A Managed Risk Approach

Traditionally, companies have largely depended on “perimeter-based” approaches like network security to prevent data predators and criminals from gaining access to corporate information. However, the demands of today’s open business environment weaken the protection provided by firewalls and other perimeter security efforts, leaving a corporation’s applications easily accessible and vulnerable to hackers.

In order to mitigate the business risk created by insecure applications, it is imperative that companies adopt a process that allows them to assess, remediate and prevent security vulnerabilities in all of their business software, whatever the source. Business Software Assurance (BSA) is a growing industry trend that refers to technologies and techniques that let you maximise the flexibility, enhanced capabilities and easy availability of enterprise software without exposing your operations to attacks that can threaten your business. In short, BSA answers the question: ‘How do you know your business is secure?’ By identifying and resolving your most critical application vulnerabilities, you can enhance software assurance.

What Next?

Ultimately, the solution is developers and security experts working together to build secure software right from the start. Fortify is already in discussions with Open Source providers with whom it is working to improve processes and I invite any open source group who would like to get involved in these conversations, and make security a part of the development process, to get in touch.

Call to Action for Organisations that Rely on Open Source Software

Government and commercial organisations that leverage open source should do so with great caution. Risk analysis and code review should be performed on any open source code running in business-critical applications, and these processes should be repeated before new versions of open source components are approved for use. Organisations considering open source software must thoroughly evaluate open source security practices. The practices we suggest below for open source communities can serve as a checklist. In addition, enterprises should:

– Raise security awareness within open source development communities and emphasise the importance of preventing vulnerabilities upstream. Enterprise security teams should articulate their security requirements to open source maintainers to accelerate the adoption of secure development lifecycles.

– Perform assessments to understand where your open source deployments and components stand from a security perspective.

– Remediate vulnerabilities internally or leverage Fortify’s JOR, which provides audited versions of several open source packages.

Open source projects should adopt robust security practices from their commercial counterparts. Open source development can benefit from private industry practices – notably those created by financial services organisations and larger ISVs. Open source communities can then advertise and substantiate effective security practices that blend process and technology. Best practices to consider include:

– People: Appoint a security expert with the power to veto releases from going into production – known as a gate model. Develop the expertise to conduct security activities and get security right.

– Process: Build security in by mandating processes that integrate security proactively throughout the software development lifecycle. Include relevant non-coding activities, such as threat modeling and the development of abuse cases.

– Technologies: Leverage technologies to get security right, which include static analysis in development and dynamic analysis during security testing in quality assurance.
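
As a concrete illustration of the kind of defect static analysis catches in development, the sketch below (in Python) shows the classic SQL injection pattern and its remediation; the table and function names are invented for the example.

    # The kind of defect a static analyser flags: untrusted input concatenated into SQL.
    import sqlite3

    def find_user_unsafe(conn: sqlite3.Connection, name: str):
        # Flagged: user-controlled 'name' is interpolated directly into the query string.
        return conn.execute(f"SELECT id FROM users WHERE name = '{name}'").fetchall()

    def find_user_safe(conn: sqlite3.Connection, name: str):
        # Remediated: a parameterised query keeps data out of the SQL grammar.
        return conn.execute("SELECT id FROM users WHERE name = ?", (name,)).fetchall()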


FEATURE: All things must PAS

Paul Quigley talked to Simon Lande, CEO at Magus about web content management and PAS 124.

As the race to get online hots up, and businesses and marketers expand from main sites to microsites for targeted campaigns, as well as bread-and-butter product information and rich media content sites, the whole panoply needs a structured approach to quality control and best practice.

Whilst it’s not quite the Wild West out there in the web world anymore, much still remains to be done to make sure sites are robust and fit for purpose. “It’s all about website quality monitoring and website compliance,” explains Magus’ Simon Lande. “We have an application which helps, typically, large organisations – those with multi-editorial, multi-site environments, who have a set of standards and guidelines that they want to make sure are enforced online. We can take those guidelines and convert them into a set of electronic checkpoints and, for every page on every site, validate where there hasn’t been compliance at the various levels.”

According to Lande, this approach has led Magus to work with industry standards setters, helping the wider industry tap in to their knowledge. “As a result of having a lot of expertise in that, we work with BSI, the British Standards Institute steering group,” Lande says, “that brought together the combined expertise in the industry to produce a best practice process document.”

Magus’ Lande believes that there are a lot of benefits that can accrue if standards are implemented correctly. “It protects the brand,” he stresses, “and you get a good return on investment on your web-spend, but, if you’re really struggling with the logistics of the issue, we produced this specification (PAS 124) which is all about defining, implementing and managing web site standards.”

With PAS 124, Lande says that Magus’ work to date has been about putting something back into the industry in terms of best practice and process, to help people get the maximum out of their website operations and the maximum return on investment, and to deliver a good and effective user experience.

“From our commercial point of view, the fact that we’re the lead consultants on this PAS 124 process means that for organisations that need the operational aspects of how to manage these standards – and we know a fair amount about standards because we’ve been at the centre of this BSI project in addition to our own direct work with clients – it’s a really powerful application that can really help you become, if you like, a web site quality leader” says Lande, “making sure you are delivering the best possible user experience on the one hand, and streamlining and maximising your operational site management on the other.”

For web content management rollout, the benefits are two-fold, Lande asserts. “Where it touches on content management, is that essentially, there are two elements: what the standard does is make the most out of your CMS system, such as, what should be locked-down in the CMS, what should be left flexible, how to get the balance between globalisation and localisation issues; but the reason we developed this application is that there is this perception that people have, if I lock everything down in my CMS and I define the templates correctly, then I’m not going to have a compliance issue, because everything’s resolved in the CMS” he says.

According to Lande, a lot of CMS companies give that impression. “The reality is, that doesn’t happen,” he says. “And it’s not even really supposed to happen, and the reason is the CMS will lock down the top level components such as navigation and architecture, but the content areas, where the editors actually work, it’s not practical or workable to lock down every element of that because the system will become unusable and people will find it hard to manage.”

The solution, Lande believes, is not as complex as you might think. “So what you have to do is leave it open, and give them the guidelines which then mandate exactly how you should use the open areas,” he says. “Our position is that you then use the website quality monitoring to ensure compliance with the guidelines. So it’s sort of picking up where the CMS leaves off.”

On a slightly more controversial note, Lande thinks it is better to explain up-front what a new WCM system can and cannot do than to leave users with unrealised expectations. “‘What your CMS will never do’ – that’s not a criticism of the CMSs,” says Lande. “It’s just that this is not what they are designed to do; we work on the boundaries between the two. In fact, the closer we can align with CMS companies, the better,” he says.

Magus is already working with CMS vendors such as SDL Tridion. “We are integrating our compliance monitoring solution with their CMS so that people have both the production capability and the monitoring capability.”

PAS 124’s relevance to CMS organisations is a win-win for web content management, Lande says. “What BSI has not produced is a new web standard – there is a plethora of web standards out there, W3C, accessibility etc., and people are really struggling with how to go about making the most of these. What the PAS 124 document does is provide a best practice framework for processes: defining standards – how do I go about that, which ones do I need, which ones are applicable to me; and implementing standards – how do I roll them out, how do I communicate them on an ongoing basis.”

“There’s another PAS – PAS 78, which relates to accessibility, and that is now going to the next step to become an official standard. If PAS 124 gets the traction and the momentum, it could evolve towards a sort of quality standard.” Lande believes this could be the first step. “Either a content management company has to go upstream to provide the compliance monitoring we’re doing, or build it in.”


FEATURE: The iPhone craze: cashing in for CMS vendors?

Robert Bredlau of e-Spirit discusses the rise of mobile content and how web operators could be losing out.

You would have had to have lived on Mars not to have heard about the Apple iPhone or BlackBerry. Millions of consumers and business people are testament to the popularity of the handheld devices that have made accessing the web on the move cheaper and easier.

The iPhone 3G S could well hasten this trend as its applications can now open twice as fast. There is no doubt people love the idea of experiencing the latest world news, sports results, music and movie releases whenever and wherever they are – as well as staying in touch with friends and family. Mobile devices like the iPhone have also enabled staff to work on the move, boosting the efficiency, agility and flexibility of businesses.

However, despite this revolution in communications, businesses have been alarmingly slow to capitalise. According to an e-Spirit survey of 100 CIOs, two thirds of UK businesses are failing to deliver high-end customer experience by not optimising their websites for the latest devices. This means many are failing to accommodate this exploding demand for mobile communication, information and entertainment.

This scenario makes no commercial sense, especially as consumers are increasingly using the internet to make purchasing decisions because of the cost savings and convenience it can offer. This is doubly so in a time of economic difficulty when businesses must exploit the 24/7 global marketing and sales opportunities which the web brings if they are to survive and thrive.

The trouble is that too much web content is designed for viewing on standard-size screens, so viewing it on anything smaller can result in a poor user experience. Too many web pages are laid out for presentation and navigation on desktop-size displays and to exploit the capabilities of desktop-browsing software. This means information fails to appear in real time on mobile devices such as iPhones, and formatting tends to lose all integrity. The stark truth is that users no longer want to have to wait until they’re sitting in front of a computer screen for an optimum browsing experience.

The good news is that developments in content management systems (CMS) have enabled companies to deliver mobile-specific content more quickly and easily – and help to enhance their customers’ interactions. A CMS allows businesses to take content developed for other channels, such as standard web pages, and deliver it to mobile devices without any loss of integrity. The beauty of this is not only that content can be produced once, saving time and money, but also that customers enjoy the same quality of online experience no matter how they access the information.

Many CMS technologies also use content aggregators and device handlers that understand what content is available and the device it is being delivered to, allowing information to be rendered in a format that is device-specific and compatible.
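
A minimal sketch of the device-handler idea follows (in Python); the user-agent matching is deliberately simplistic and the template names are invented, but it shows how one piece of content can be routed to a device-appropriate rendering.

    # Illustrative device handler: pick a rendering template based on the requesting device.
    MOBILE_MARKERS = ("iphone", "blackberry", "android")

    def select_template(user_agent: str) -> str:
        """Return a (hypothetical) template name appropriate to the device."""
        ua = user_agent.lower()
        if any(marker in ua for marker in MOBILE_MARKERS):
            return "article_mobile.html"   # lightweight layout for small screens
        return "article_desktop.html"      # full layout for desktop browsers

    print(select_template("Mozilla/5.0 (iPhone; CPU iPhone OS 3_0 like Mac OS X)"))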

These systems also have a range of innovative features, such as ‘Tell-a-Friend’, which allows visitors to send friends or colleagues a direct link via text message, enabling the recipient to open the mobile-specific site with just one click.

On top of the mobile delivery capabilities, many CMS tools include automated database schemas and third party portals, providing companies with the ability to integrate content from subsidiaries across multiple legacy systems and databases, smoothly and seamlessly.

As with any free-flowing information, the issue of governance always remains high on the agenda. The freedom of mobile content has important implications for the business world, as critical information is leaving the confines of the firewall on a daily basis. This means that retaining control of who is accessing mobile information is becoming a pressing challenge for today’s businesses.

CMS technology helps achieve regulatory compliance and improves corporate governance of mobile content by using powerful audit tools that enable companies to roll back a website to a previous point in its evolution, providing a complete history of content including the exact references, external documents and data sources.

There is no doubt that mobile content channels will continue to evolve and deliver new types of content. Mobile content will move away from merely delivering sports results and breaking news; applications will get smarter, patiently monitoring your personalised preferences and delivering only the information you desire. The challenge for businesses and mobile vendors is how to move with the times, modifying responses as technology evolves. CMS technology is a must for companies who face these challenges head on.


Preparing for the arrival of knowledge work

A day in the life of the knowledge worker

by John McCormick, VP & GM, knowledge worker product group, EMC

A product manager, an architect, an industrial designer and a marketing communications specialist are all knowledge workers. Regardless of the activity, knowledge workers spend a lot of their time searching for and evaluating information. This should come as no surprise. But just how much is ‘a lot’, and what does it cost the organisations that ultimately foot the bill? Industry analysts found that in an average 40-hour work week, a knowledge worker spends approximately 25 percent of his or her time searching for information and another 25 percent evaluating it. For a full-time employee earning $60,000 annually ($28.85 per hour) that’s nearly $30,000 per year. Those are big investments of time and money, especially when you consider that finding and evaluating information are only the first steps to using it productively – in other words, realizing any return on that investment.
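
Those figures are straightforward to reproduce. The sketch below (in Python) walks through the arithmetic, assuming the 40-hour week and $60,000 salary quoted above and a 52-week working year.

    # Rough cost of the time a knowledge worker spends finding and evaluating information.
    HOURS_PER_WEEK = 40
    WEEKS_PER_YEAR = 52          # assumption used to derive the hourly rate
    ANNUAL_SALARY = 60_000

    hourly_rate = ANNUAL_SALARY / (HOURS_PER_WEEK * WEEKS_PER_YEAR)   # about $28.85
    searching_share = 0.25       # share of the week spent searching for information
    evaluating_share = 0.25      # share of the week spent evaluating it

    hours_per_year = (searching_share + evaluating_share) * HOURS_PER_WEEK * WEEKS_PER_YEAR
    cost_per_year = hours_per_year * hourly_rate

    print(f"Hourly rate: ${hourly_rate:.2f}")
    print(f"Search + evaluation cost: ${cost_per_year:,.0f} per year")   # close to $30,000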

The Evolving Search Market

Today’s enterprise generates an enormous quantity of information, often stored in siloed repositories that cannot be accessed by a single application. So knowledge workers have a lot to search through, and it’s rarely a simple task. Plus, they frequently need access to information outside the corporate firewall.

Unfortunately, up to now, the tools provided to enterprise knowledge workers have frequently fallen short of the mark in terms of search. This inadequacy gave birth to standalone search applications – an important market space with dozens of vendors, none of which grabbed more than a 15 percent market share.

Perhaps morphing rather than consolidating better explains what has occurred and is occurring in enterprise search. The platform players have seen the future of knowledge work, and it much more resembles Web 2.0 and rich Internet applications than it does even the most sophisticated enterprise search tool. It requires an application environment that provides smart workspaces for ad hoc information sharing integrated with the system resources and Web services necessary to find, access and manage collaborative content within the framework of an enterprise information infrastructure.

Beyond Search

Undoubtedly, the public Web – always connected and media rich with its potential for enhanced communications – has driven the expectations of enterprise knowledge workers. Blogging, wikis, photo and video sharing, community building and social networking are phenomena that simply beg to be used somehow in building a more effective, interactive business environment. By helping knowledge workers focus on the information, tasks and events that matter, these tools – properly positioned for the enterprise – promise to increase productivity, improve transparency, expedite business processes and eliminate knowledge gaps.

The question is literally and metaphorically, “How do we get there from here?” How can we assemble a rich set of knowledge resources that supports the dynamic relationships between information and people while they collaborate across the extended enterprise? We need to help enterprise knowledge workers:

– Find relevant information in context,
– Publish user-generated content and
– Integrate collaborative content with disparate enterprise resources

A Starting Point: Next-Generation Content Management

The next generation of enterprise content management platforms is a viable starting point for helping knowledge workers more productively use the time they’re already spending. ECM is one of the most commonly deployed enterprise applications. Today’s content management platforms can:

– Eliminate information silos,
– Enable easier repurposing of content,
– Enforce retention policies and brand standards,
– Streamline business processes,
– Offer greater security for sensitive content and
– Increase efficiency.

Up until now, the knock on content management has been ease of use. As a recent Forrester report points out, for many business people, the hassle of using an ECM system exceeds the system’s value. But that is changing and changing rapidly.

More Production – Less Frustration

ECM can make life easier and more productive for knowledge workers without sacrificing the controlled management and attention to compliance the enterprise requires. These platforms will incorporate:

A Web 2.0 client. From within the ECM client of the very near future, knowledge workers will easily do things like create blogs and team wikis that used to require separate, external applications. This client will support personalized information views as well as team and individual workspaces. Knowledge workers will have one-click publishing capability and the ability to manage tasks and projects via a powerful yet friendly interface.

This interface will support the coordination of content, people and processes. It will leverage an asynchronous, dynamic user experience, running within a browser and connecting through a desktop, laptop or mobile device. It will deliver a flexible and responsive interactive environment – RIA, anyone?

More intuitive and productive search. Knowledge workers need to find information wherever it exists – inside a managed repository, in a secure corporate file system or on a desktop. With next-generation ECM, a single query will access various repositories (each of which organizes content in its own way) and return a consistent, integrated set of results. Once users have authenticated access to repositories, they will not need to know how those repositories work.
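
A minimal sketch of that single-query idea follows (in Python): fan the query out to each repository's own search interface and merge the hits into one ranked list. The repository connectors here are invented stand-ins; real ones would call each repository's search API after the user has authenticated.

    # Illustrative federated search: one query, several repositories, one merged result list.
    def search_all(query: str, connectors: dict) -> list:
        """Run the same query against every repository connector and merge the results."""
        merged = []
        for repo_name, search in connectors.items():
            for hit in search(query):
                hit["source"] = repo_name          # record where each hit came from
                merged.append(hit)
        return sorted(merged, key=lambda h: h.get("score", 0.0), reverse=True)

    # Hypothetical connectors; each returns a list of result dictionaries.
    connectors = {
        "managed_repository": lambda q: [{"title": "Q3 report", "score": 0.9}],
        "file_share":         lambda q: [{"title": "Q3 notes",  "score": 0.4}],
    }
    print(search_all("Q3", connectors))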

But as important as finding information is the ability to understand it. One way to do this is through visualization of the relationships between different types of information. Next-generation content management systems will enable intelligent filtering of and guided navigation through multiple information sources.

User-centric ECM will also permit knowledge workers to easily tag and classify items for the benefit of themselves and others. It will enable them to use social tagging – tagging content on the fly in meaningful ways, with personal terms and those used by others. Over time, the most relevant terms will become more popular and more frequently used. Suddenly, it becomes easier to discover and track what is important to us and our colleagues. These advanced content platforms will also help users syndicate and track relevant information using Really Simple Syndication (RSS).
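
The ‘most relevant terms become more popular’ behaviour falls naturally out of simple usage counting; a minimal sketch (in Python, with invented tags) is:

    # Illustrative social tagging: popular tags surface automatically from usage counts.
    from collections import Counter

    tag_events = ["budget", "q3", "budget", "forecast", "budget", "q3"]  # tags applied by users
    tag_counts = Counter(tag_events)

    print(tag_counts.most_common(2))   # [('budget', 3), ('q3', 2)] -- the emerging shared vocabulary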

Moreover, knowledge workers will be able to organize search results graphically, a better means to probe the connections between various items. Interrelationships will be mapped, revealing patterns not recognized when scrolling through lists of results.

Access to content and content services through virtually any application. Many knowledge workers spend their days working from one or two core applications. Whether it’s Microsoft Excel, Word or Adobe InDesign, they are much happier (and more productive) if they can stay in that application and still search for, find and access the information they need and collaborate with others. Content management platforms of the future will leverage a plug-in infrastructure that enables seamless integration with desktop applications when working with files.

Secure and compliant information management. Although knowledge workers create most of the information in an enterprise, the security, retention and governance of that information are responsibilities of the IT manager who needs to control the use of knowledge worker tools without reducing productivity.

IT managers maintain multiple enterprise applications in a production environment. The fewer management challenges the better. Their priorities include reducing operational cost and risk while preserving service quality. From an IT manager’s perspective, any new service must deliver compelling benefits with little additional cost or risk.

Future ECM platforms will provide RIA capabilities from an enterprise architecture that has been developed to apply and enforce security, retention and governance policies behind the scenes – pervasively but not intrusively. That means collaborative workspaces and content won’t require the addition of another enterprise application to an already complex IT ecosystem. Nor will they increase an IT manager’s administrative overhead. These platforms will enable any time, anywhere access to enterprise content while securing content that travels outside the enterprise via information rights management.

Content Management and the Evolution of Knowledge Work

There are two critical elements in the evolution of knowledge work: the user experience of knowledge workers and the breadth of the resources they use. Rich Internet or Web 2.0-enabled content management connects a browser or mobile experience to networked resources. It provides smart ways to communicate, coordinate and collaborate with colleagues and partners.

Knowledge work has changed and will continue to change. The flexibility of next-generation content management will enable the platform to constantly adapt to a fluid work environment. Today that fluidity is driven by Web 2.0 and social computing, which have demonstrated that context is every bit as important as content. No one can predict what the next “prime mover” in the knowledge work and worker environment may be. But the deployment of a robust, flexible platform that can support the solutions that knowledge workers require is clearly the best preparation for uncertainty.


Related links:

ECM Plus podcasts