Letters to the Editor

Got a view or an issue that you want to share? Got an opinion on a hot topic? Got something insightful to say and need to get your point across in a quick and easy way? Then just email the Editor at newsdesk (at) ecmplus (dot) eu.

Too big to fail?

Government’s procurement not fit for purpose


National news has yet again drawn our attention to the government spending huge amounts of public money on IT contracts where the costs spiral out of control. This time the contract in the spotlight is that of ATOS to carry out medical assessments for benefits claims.

There are numerous previous examples of taxpayers’ money being squandered on undeliverable IT projects, whilst SME innovators like ourselves are often overlooked.

Whatever the ins and outs of the ATOS deal, the technology to track claims and give quality, timely information to the client (in this case the DWP) is already available and can be reliably procured at a tiny fraction of the cost of the £112 million reported. The fact that an additional £60 million has reportedly been spent on appeals due to flaws in the existing system simply beggars belief.

Successive governments have fallen into the same trap of purchasing systems from massive organisations, because in their eyes, ‘big means safe’. Purchasing in this way might give you some comfort that the organisation might not go bust (an argument which has been severely weakened since successive financial and economic crises), yet the real question for government should be ‘could we get better, for a lot less money?’ The answer is undoubtedly ‘yes’.

Mark Colonnese
Aquarium Software

Gmail outage


The recent loss of Gmail data by Google is an unnerving sign that outages can strike at any time to any business, further underscoring the need for a sound backup and disaster recovery strategy.

The ramifications of not being able to restore data quickly can cost a business dear in terms of revenue and reputation, and will force it to rethink its backup and recovery strategy.

Such periods of downtime can lead to the interruption of critical business systems and a drop in productivity, which will have a negative effect on income generation.

Google’s outage is also a warning sign to cloud operators that a resilient backup plan is vital to maintaining customer trust in cloud services.

A sound backup and disaster recovery strategy must include a balance between fast accessibility, security and cost. The best solutions are simple, streamlined and as optimised as possible. A tiered storage architecture using disk at the primary level and tape at the backend will help organisations to quickly retrieve information to avoid substantial losses from downtime.
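The tiering idea above can be sketched in a few lines. This is a minimal illustration of the policy described in the letter, not any vendor’s product: recently accessed data stays on fast disk for quick restores, while older data moves to cheaper tape. The tier names and the 30-day age threshold are assumptions chosen for the sketch.

```python
import time

# Illustrative threshold: data accessed within the last 30 days is assumed
# "hot" and kept on disk; anything older is migrated to tape.
RECENT_DAYS = 30

def choose_tier(last_accessed, now=None):
    """Return the storage tier ("disk" or "tape") for an object based on
    how long ago it was last accessed (both arguments are Unix timestamps)."""
    now = time.time() if now is None else now
    age_days = (now - last_accessed) / 86400  # seconds per day
    return "disk" if age_days < RECENT_DAYS else "tape"
```

In practice the threshold would be tuned to the organisation’s recovery-time objectives, balancing the fast accessibility, security and cost mentioned above.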

Mark Galpin
Disk Product Manager EMEA/APAC

Suite spot

From 11th to 13th April, London will play host to the European SharePoint Best Practice Conference. This event will doubtless be of great interest to users of SharePoint; yet it seems unlikely that it will answer the most fundamental question facing any CIO seeking to improve information management across their organisation: whether SharePoint itself is really the best technology for their needs.

The most common reasons for choosing SharePoint are its affordability and flexibility. Yet the practical reality is that SharePoint requires a certain level of infrastructure to run, has a price tag that depends heavily on the number of users and servers involved, and will probably require extra modules of functionality before it can behave as a complete enterprise content management system.

Furthermore, there are several areas which are widely acknowledged to be outside SharePoint’s sweet spot, or lacking altogether, in terms of content management: security, email management, BPM and social media, to name just a few. And that’s to say nothing of the industry-specific functionality that many organisations require from ECM.

SharePoint has the potential to be a useful and effective tool for collaborative working; however, IT directors considering it as an alternative to ECM systems would do well to assess their business case, functional requirements and integration points before considering their vendor options – including whether and where to use SharePoint. This will enable them to choose the technology set and vendor best suited to their operational needs; maintain control over the vendor selection process; and travel the shortest, straightest line possible to ECM success.

Yours faithfully,

Alisha Lyndon
Senior spokesperson Northern Europe,

Don’t forget the service in SaaS


By definition, when it comes to Software-as-a-Service, or SaaS, the quality of service is of the utmost importance.

This is particularly true when it comes to the security of sensitive data and corporate networks. Of course, the technology has to be secure and reliable to deliver the level of protection required; but all too often, SaaS providers forget the importance of good customer service.

Without prompt and efficient personal support to back it up, SaaS will inevitably fail and your customers will be let down.

Resellers that provide solutions in the cloud have to keep customer churn low, because their customers will find it quicker and easier to go elsewhere if they are unhappy.

After all, they have no new hardware to purchase or legacy software to worry about. It is likely the main reason they went with the cloud in the first place was because it was easy and more cost effective to deploy without in-house skills.

There are compelling reasons to sell cloud-based services – not least the regular ongoing revenue stream and the chance to stay close to the customer. But when selecting a vendor to work with it is essential to look at the human element of SaaS and the vital role that good customer service and support plays in dealing with day-to-day needs.

When it comes to two-factor authentication, for example, this means essential issues such as resetting PINs, user login issues, and token provisioning and replacement. If the CEO or sales manager can’t access their emails and files, you will soon know about it.
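For readers unfamiliar with the tokens mentioned above, a minimal sketch of how a time-based token code is typically derived (the TOTP scheme of RFC 6238, built on RFC 4226 HOTP) is shown below, using only the Python standard library. The secret and parameters here are illustrative, not any particular vendor’s implementation:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Derive a time-based one-time password (RFC 6238) from a shared
    Base32-encoded secret. `at` is a Unix timestamp (defaults to now)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals elapsed.
    counter = int((time.time() if at is None else at) // step)
    # HOTP (RFC 4226): HMAC-SHA1 the counter, then dynamically truncate.
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)
```

When a token drifts out of sync or is lost, no amount of clever code helps – which is precisely where the provider’s human support comes in.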

Yours faithfully,

Dave Abraham

From XServe to Ex


(On news that Apple Xserve is only available for sale through January 31, 2011)

Today is personally an interesting and bittersweet day, and I’m sure a bit of a shock to many of the clients that Active Storage serves.

I think back to May 14, 2002, the day the Apple Xserve was introduced, and I smile. What I remember most from that day was the excitement of our small, energetic team that brought that amazing product to market.

Eight years later, I read the announcement of its demise and thought, ‘Wow, that was a good run.’ But change is never easy, even if it makes way for the next great thing.

After spending many years in this business, I have come to realize that, like everything else, even market-changing products such as the Apple Xserve RAID have a lifespan. And fortunately, the same spirit that brought Xserve to life lives on in the market, and continues to help evolve it.

Such was the case with Active Storage and the ActiveRAID. When we decided to build the next-generation RAID product for Apple and media creators, we were fortunate to be able to leverage the accomplishments of the past. Two years after our October 2008 launch, I look back and know it was all worth it.

Looking forward, I can tell you that Active Storage is not a company that dwells on the past. We were aware that this day might come, and while we are not quite ready to make an official announcement, I can say in all candor, we have not been standing still. Active Storage has some exciting things in the works that will shape the broadcast, post-production and creative professional market for years to come. None of us can predict the future, but Active Storage has been planning and developing for it. Innerpool™ was a glimpse into the future, and its announcement showed that the status quo is just not good enough.

Our investment backing, led by Intel Capital, is committed to our growth, and the Active Storage team from engineers to support, sales, finance, manufacturing and marketing, are up to the challenges of tomorrow. We formed Active Storage with the commitment to deliver the best storage experience for our users and we continually strive to improve it. We are respectful of the needs of our clients, and in these changing times, know that we wonʼt let you down.

No one can match our commitment, innovative spirit and experienced team to deliver the best experience to our users. The things we all accept as givens separate us from the pack:

• Optimized performance and reliability – it just works

• Performance in adverse conditions – Active Storage delivers even when things are not perfect, whether it’s your network, technical issues or configurations, so you still meet your deadlines

• We are there 24/7/365 with support people who deeply understand what you are doing, have walked in your shoes, and understand your workflow, not just your simple storage needs

• If there are issues, we resolve them – quickly. The entire company gets involved: not just SEs, but engineers and designers too

We are not a company that builds general market Windows enterprise products with an afterthought to Mac and Media customers – we build innovative solutions for the business and creative industries. That’s what we do – that’s ALL we do.

We build real management tools and rich applications with complete functionality – not Web pages. Why? Because users manage multiple systems in a network, whether Xsan or StorNext, and a real native application does it better. There’s a reason there are over 200,000 apps for the iPhone!

In these uncertain and slightly bittersweet times, rest assured that Active Storage is here to provide the bold thinking, powerful storage solutions, and world-class support your business deserves. Products may change, but our belief in building and supporting the best products and treating customers with care and respect is something that will never go out of style. Thatʼs a promise no one in our industry can match.

Warmest Regards,

Alex Grossman
Active Storage,

Corporate Gear Acquisition Syndrome?


On paper, mergers and acquisitions mean different things, but more often than not the two are used synonymously. There are three key areas that apply to both procedures which should be taken into consideration during the initial stages of a merger or buyout to ensure a profitable end result.

Firstly, a company needs to familiarise itself with the new customer base it is merging with or taking over, to ensure that it matches its existing customer base. In what is more commonly known as ‘The Drag Effect’, a company with an SMB model and infrastructure cannot buy into the Fortune 500 marketplace and expect a smooth transition. It is imperative that an organisation also maintains the trust of existing customers during a merger or acquisition. If the equilibrium is disrupted during the process, it can be a difficult task to regain their trust.

Corporate culture is the second factor to consider, as the two cultures must be as similar as possible. For example, Scandinavian working cultures are very different to European and US models. It is not a case of ‘opposites attract’: if the cultures are too distinct, the merger will fail.

The end vision of the product is the third area a company should take time to focus on. There has to be a certainty that the whole of the resultant portfolio is suitable for the newer larger market that is now accessible. This may entail a sudden internationalisation of the product set, the infrastructure for which has to be present in advance.

For a company to complete a successful merger or acquisition, it is vital that it has a comprehensive understanding of the organisation it intends to merge with or take over. Failure to do so could result in a futile manoeuvre.

Yours truly,

Walter Scott
GFI Software

Neutrality: What’s all the fuss about?


Search engine Google and Verizon, a telco and internet service provider, are close to finalising a deal that would allow Verizon to deliver online content to internet users more quickly, if the content’s creators are willing to pay for it.

The idea that some sites, through paying a fee (or other) to Verizon, can gain priority and run faster than others has backers. It also has its share of critics and protestors, all of whom perceive such an arrangement as going against the ethic of Net Neutrality.
Google has two challenges, one of which is legislative, the other reputational and the search industry is watching….
Net neutrality, or Internet/network neutrality is the principle that people simply shouldn’t mess with the magic of the internet and should take a hands-off approach with its administration; more specifically that governments and internet service providers (ISPs) should not place any restrictions on the internet’s content or means of accessing that content – two users should essentially have access to the same content in the same way.
According to Google, “network neutrality is the principle that internet users should be in control of what content they view and what applications they use on the internet. The internet has operated according to this neutrality principle since its earliest days.
Fundamentally, net neutrality is about equal access to the Internet. The broadband carriers should not be permitted to use their market power to discriminate against competing applications or content. Just as telephone companies are not permitted to tell consumers who they can call or what they can say, broadband carriers should not be allowed to use their market power to control activity online.” (Guide to Net Neutrality for Google Users)
However, Google and Verizon have put forward a proposal to the Federal Communications Commission (FCC), essentially to retain this net neutrality on the public internet but to allow broadband operators and network operators to offer new services that might discriminate in terms of price and speed. They are proposing that broadband providers can allocate bandwidth for such discriminatory projects, working with other application or service providers as they see fit.
They mention a few specific examples to help illustrate their thinking, examples like health care monitoring, advanced educational services, or new entertainment and gaming options. Essentially, they are proposing they be permitted to create a two-tier system whereby network capacity could be sold to companies willing to pay for that service to in turn provide a higher quality service to their opt-in users.
Whilst Verizon has said it has no intention of selling bandwidth from the ‘public’ network, it wants to make certain it could provide dedicated bandwidth-based services to third parties if it wanted to.
Verizon CEO, Ivan Seidenberg said: “Verizon is standing tall. We said we agree that there should be no paid prioritization of traffic over the public Internet. Google (and others) will continue to innovate, and we have to feed that cookie monster. All we have asked is that we are allowed to offer services like Fios.”
Fios is a bundled home communications service Verizon offers that makes use of an end-to-end fibre optics network, offering internet, telephone and television. Verizon cannot offer it over the Internet, given neutrality requirements, so it is offered as a network separate from the internet.
The proponents of net neutrality clearly don’t like this one little bit, as creating a two-tier system, even if it means legislating neutrality in one of the tiers, results in the fragmentation that they fear and still discriminates in their eyes.
Given that Google’s unofficial motto is ‘Don’t be evil’, the backlash in some quarters has been brutal. On the ominous Friday 13 August, internet users from across the Bay Area converged outside Google’s offices in protest. The rally was organized by ColorofChange.org, Credo Action, MoveOn.org, Free Press and the Progressive Change Campaign Committee.
SavetheInternet.com summarised the sentiment as follows: “Google previously had been a champion of policies such as Net Neutrality — the fundamental principle that keeps the Internet open and free from discrimination. Its decision to team up with Verizon, long an opponent of such policies, has drawn the ire of public interest advocates.”
So who is for and who is against net neutrality legislation? Many Internet giants are proponents of net neutrality, and also supporters of the US government’s involvement in regulating it to ensure the internet stays ‘open’. The likes of Amazon, Craigslist, Google (kind of), Facebook, Sony, IAC, and Twitter fall into this camp. President Obama himself does too:
“I am a strong supporter of net neutrality… What you’ve been seeing is some lobbying that says that the servers and the various portals through which you’re getting information over the internet should be able to be gatekeepers and to charge different rates to different Web sites… And that I think destroys one of the best things about the Internet—which is that there is this incredible equality there… Facebook, MySpace, Google might not have been started if you had not had a level playing field for whoever’s got the best idea and I want to maintain that basic principal in how the internet functions. As president I am going to make sure that that is the principle that my FCC commissioners are applying as we move forward.”
In the against net neutrality camp are a number of large hardware and telecommunications firms, who would invariably benefit from being allowed to redefine the way the Internet works as they control the means of accessing it. In addition, opponents also include heavyweights such as Bob Kahn (co-inventor of TCP – “net neutrality is a slogan that would freeze innovation in the core of the Internet”*) and David Farber (Professor – “The Internet needs a makeover”**). Robert Pepper, senior managing director of global advanced technology policy, believes all the pro-net neutrality hype is just that: hype.
What does the law say? The law that affects net neutrality differs globally. In the US there is considerable debate around the topic, with the FCC being involved in trying to legislate around this area, and sometimes not by choice. For instance, a class action suit against Comcast was the first to seriously touch on this aspect, in which Comcast was accused of unlawfully throttling BitTorrent traffic. Comcast settled for $16m, with the FCC stating Comcast needed to comply with transparent network management practices.
In Europe, there has been a fairly complex process underway to decide whether to legislate in this area and to what degree. In 2006 there were mixed conclusions from a number of debates. One, sponsored by AT&T, concluded net neutrality legislation would be unattractive. Other debates at the Royal Society and Institute of Public Policy Research, in the same year, reached conclusions that were in favour of it.
In 2009, as part of the Telecoms Package (to be implemented in May 2011 by all EU member states), service providers were to be held to a higher standard of transparency, making it compulsory for service providers to inform customers whether the service they are subscribing to includes any traffic management techniques and what the impact of those would be on service quality or any other restrictions. AT&T put forward five revisions to this, of which it successfully achieved two. Those two revisions essentially leave the door open for discrimination against websites and users.
Are we truly net neutral today and if so, how long can it be sustained? There are a number of central arguments used in opposition to any kind of net neutrality legislation.
Firstly, that the ability to charge users/sites different rates for differing levels of access will provide the revenues to ISPs and other network operators necessary for them to recoup their investments in broadband networks. Verizon has said there is no current incentive for it to develop and deploy advanced, super-fast fibre optic networks if it can’t charge more for access to such networks. Verizon and a number of ISPs have often referred to firms like Google and Skype as ‘freeloaders’ for making money using networks that they have provided at a cost of billions.
Secondly, many suggest what we have right now isn’t in fact net neutrality at all. The biggest firms can invest in higher bandwidth deals and server replication to provide faster access for their users than smaller sites that wouldn’t be able to afford such infrastructure – so net neutrality isn’t even something that currently exists to be upheld.
Thirdly, the increase in rich media means infrastructure providers have far more pressures on their resources than was once the case. Bret Swanson of the Wall Street Journal suggests YouTube streams as much data in three months as the world’s radio, cable and broadband television channels stream in one year, i.e. 75 petabytes. By extension he believes telecommunications firms are simply not ready for the era of ‘exabyte’ delivery and something needs to give.
The ball is in the FCC’s court. Given that the FCC is facing powerful and influential pressure on both sides, it won’t be able to please everyone. The next few weeks will be very interesting as the debate unfolds in the commercial and political arenas, and invariably in the press. Of further interest to us search people is what impact it may have on Google generally. Wired called Google a “net neutrality surrender monkey” this month, which means it has two challenges: one legislative, the other reputational.
Andreas Pouros,
18 August 2010

