Monthly Archives: January 2011

Sitecore supports JFK legacy with CMS content migration

Inaugural Address of John F. Kennedy, 35th President of the U.S.

Kennedy’s legacy informs and inspires world with help from Sitecore and Velir

ECM Plus – The John F. Kennedy Library has just completed a major digitisation project for the museum’s archives of hundreds of thousands of documents, photographs and sound recordings.

The content is now more accessible and useful for education and public service, the museum's curators said.

CMS vendor Sitecore and its technology partner Velir said that the JFK Library's Digital Archives went live last week on a new website built with Sitecore's software, opening up the contents of the John F. Kennedy Presidential Library.

In preparation for the 50th anniversary of John F. Kennedy's inauguration as the 35th President of the United States, the website has been designed to improve access to some 200,000 pages of textual content, 1,500 images, 1,250 audio recordings and moving-image files, and 340 telephone conversations.

According to the curators, the new website now offers the public access to the JFK archival assets, with an enhanced search function for easier and faster retrieval.

John Hawley, director of web development for the John F. Kennedy Presidential Library’s Museum said: “We wanted to ensure that we created a virtual experience that made it easy for people from around the world to discover and access educational and inspirational content that has the power to inspire all generations. From a technology perspective, improved site navigation was a large focus of the website redesign. The website needed to be easy to navigate so that it properly showcased the newly available information. For example, the new search engine allows visitors to enter the word ‘moon’ and pull up virtually every document, tape, and speech related to JFK’s mission to land men on the moon.”

Sitecore and technology partner Velir collaborated to customise the new website, integrating Sitecore's Content Management System to enable online video, faceted search, and digital asset management.
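The faceted search described here can be illustrated with a toy sketch. The records and field names below are hypothetical examples, not Sitecore's actual API: a keyword query returns matching records together with counts per metadata facet, which is what drives the filter links a visitor sees alongside results.

```python
# Minimal faceted-search sketch over archive-style records (illustrative
# data and field names only; not Sitecore's actual API).
from collections import Counter

records = [
    {"title": "Address at Rice University", "type": "speech", "topic": "moon"},
    {"title": "Memo to NASA", "type": "document", "topic": "moon"},
    {"title": "ExComm meeting tape", "type": "audio", "topic": "cuba"},
]

def faceted_search(records, keyword, facet="type"):
    """Return records whose topic matches the keyword, plus facet counts."""
    hits = [r for r in records if keyword in r["topic"]]
    counts = Counter(r[facet] for r in hits)
    return hits, counts

hits, counts = faceted_search(records, "moon")
# hits: the two moon-related records; counts: one speech, one document
```

In a real archive the facet counts would be computed by the search index, but the shape of the result is the same: hits plus per-facet tallies.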

The project, dubbed “Access to a Legacy”, is the largest, most advanced digital archive created by a Presidential Library from material not “born digital”.

The documents include a handwritten version of Kennedy's inaugural address, along with notes and tapes made during the Cuban missile crisis.

Filed under Content management system (CMS), Digital Media Management (DMM), Document archiving & retrieval, Document Management, Document scanning & imaging, Information Management, Media asset management, Portal, Records & Information Management (RIM), Records Management, Scanning, Search, Vendor News, Web Content Management

Intel touts 10GbE data centre strategy for Cloud 2015

10 gigabit router topology

Intel simplifies data centre tech for 10GbE accelerated cloud roll-out


ECM Plus – Chip behemoth Intel's data centre aspirations took another step this week as it introduced a new technology that it says will enable data centre traffic to run over a single cable, using an Intel 10 Gigabit Ethernet server adapter from the X520 family.

Intel said that its unified networking would enable IT departments to create what is dubbed ‘superhighways in virtualized data centers’ by consolidating multiple data and storage networks onto just one 10GbE network.

Intel claimed that this cable consolidation could further help reduce IT spend by $3 billion a year, and that the 400 million feet of cabling strewn around the data centres of the world would be enough to wrap around the Earth three times. Not quite sure why anyone would want to do that, but Intel seems to think global bondage by cable is a worthy cause. Takes all sorts.

Furthermore, said Intel, its high-speed unified data centre network topology is also a 'cornerstone' of its Cloud 2015 'vision' and the Open Data Center initiative announced last October.

“What’s frustrating for IT managers is that most of the data center dollars are spent on infrastructure costs, not on innovation,” commented Kirk Skaugen, vice president and general manager of Intel’s Data Center Group. “Expanding Intel Ethernet to include Open FCoE will help simplify the network and drive more of the IT budget toward innovation. We think IT departments can lower infrastructure costs by 29 percent, reduce power by almost 50 percent and cut cable costs by 80 percent by moving to a unified network.”

The company said that its Open FCoE integrates capabilities into the operating system to deliver full unified networking without the need for additional expensive, proprietary hardware. IT departments, it said, could then use common management tools for server network and storage connectivity while integrating seamlessly with existing Fibre Channel environments.
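As a rough illustration of what that operating-system-level integration looks like in practice, here is a hedged sketch of bringing up an Open FCoE initiator on a Linux host with the open-source fcoe-utils tooling. The interface name eth2 is an assumption for this example, and these are generic administrative steps, not Intel-specific instructions.

```shell
# Illustrative Open FCoE bring-up on a Linux host (fcoe-utils package).
# Interface name "eth2" is an assumption for this sketch.
cp /etc/fcoe/cfg-ethx /etc/fcoe/cfg-eth2   # per-interface FCoE configuration
service lldpad start                        # DCB/DCBX negotiation daemon
service fcoe start                          # start the software FCoE initiator
fcoeadm -i                                  # verify the FCoE interface is up
```

The point of the software approach is visible here: storage connectivity is configured through the same OS tooling as the Ethernet interface, with no separate Fibre Channel host bus adapter.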

“Server virtualization and converged storage networking, based on a shared 10 Gigabit Ethernet fabric, are key ingredients for a cost-effective infrastructure,” added Paul Brown, vice president and general manager of EMC’s Storage Networking Business. “Achieving EMC E-Lab qualification is a gold standard in our industry for server, networking and storage interoperability. The extensive work done to qualify and validate the Intel Ethernet Server Adapter X520 will give our customers confidence that the product will integrate seamlessly with the EMC Symmetrix VMAX and EMC VNX family of storage products.”

Filed under Cloud Computing, Data centres, Data storage, SAN (Storage Area Networks), Vendor News, Virtualization

NEWS: Parliamentary ICT head’s ‘clouded future’ keynote at Cloud Expo Europe

Statue of Oliver Cromwell in Parliament Square

Miller to share views on cloud for democratic information sharing systems


ECM Plus – Joan Miller, director of parliamentary ICT at the British Houses of Parliament, is set to give a keynote speech at Cloud Expo Europe in London this week.

On Tuesday 2nd February at 9.30am, Joan Miller's paper, entitled 'A Clouded Future', will address how the British Parliament is investigating its options and the criteria it will use to assess which 'version' of the cloud it will 'back'.

Miller was appointed to the post in September 2005 with a brief to establish a new Parliamentary ICT organisation; the taxpayer-funded service is intended to support Members both at Westminster and in their constituency offices across the nation.

Five years on, Miller is at the forefront of Parliament's IT strategy. Said Miller: “We are cognisant that the Cloud represents a huge opportunity for Parliament to drive down IT costs while maintaining efficiency and it is top of our agenda to investigate the right way forwards for Westminster.”

She went on: “As a result I am delighted to be attending and speaking at an event which examines the issues and opportunities that Cloud Computing represents.”

Cloud Expo Europe 2011 takes place between the 2nd and 3rd of February at the Barbican Centre in central London.

The exhibition and conference are free to attend. To register, visit

Filed under CCM (Cloud Content Management), Cloud Computing, Content Management, Enterprise Content Management, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), Industry News, Information Management, PaaS (Platform-as-a-Service), Private Cloud, Public Cloud

Cloud goes into interstellar overdrive with C12G’s partner programme

Diagram showing scale of cloud computing

Partners can build, promote and sell cloud offerings in cahoots with C12G Labs


ECM Plus – OpenNebula open-source toolkit specialist C12G Labs has just unveiled a new partner programme to help develop strong business alliances with partners that understand cloud technology and its applications.

According to C12G Labs, it will provide its partners with customisable and flexible cloud management technology, plus the professional services to create novel cloud offerings and speed up their delivery to users.

The company said that the main objective of its partner programme would be to align developments and technologies to meet customers' needs, as well as to bolster support for the design and development of joint cloud products and services.

Three types of partnership are offered: 'channel partners', authorised to sell solutions and products that use OpenNebulaPro; 'service partners', authorised to offer integration and consulting services around OpenNebulaPro as well as utility services based on it; and 'technology partners', who provide a technology, product or service that complements OpenNebula and are authorised to advertise their compatibility.

C12G Labs added that the partner programme would combine deep software and service discounts with training, technical, integration and certification support to help partners create value.

Copies of the C12G Labs partner programme guides can be requested from

Filed under Cloud Computing, Consultancy/Consulting/Systems Integration, IaaS (Infrastructure-as-a-Service), Open Source, PaaS (Platform-as-a-Service), Vendor News

FEATURE: Overcoming cloud computing barriers

Andy Burton, Chairman, CIF

Cloud computing has the potential to thrive and is on the verge of becoming mainstream within the IT industry.


The cloud encompasses a broad range of potential suppliers, from ISVs to hosting companies and global brands. However, the availability of new technology is not in itself a reason for adoption, nor are the tremendous commercial benefits of cost savings, agility and scalability on demand. Technical and commercial barriers need to be overcome in order for the industry to prosper.

There is a lot of market hype surrounding the cloud and its alleged benefits, which has created a lack of trust, resulting in businesses being reluctant to invest in cloud services.

The biggest reason behind the lack of trust in cloud computing is that businesses do not know confidently what cloud computing is, what it can offer and how to integrate it with the rest of their IT infrastructure. The most important factor when addressing the concept of cloud computing is therefore to understand specifically what it is by definition, and how it can benefit both businesses and end users at a practical level. Cloud computing enables someone to access computing power (processors, RAM, storage and bandwidth) and applications 'online' via the internet, on demand.

This infrastructure is highly scalable and agile, operating independently of the underlying hardware. It is typically offered on a pay-as-you-use or subscription model, with no capital costs to participate. With cloud service providers offering improved service levels and portable devices enabling anytime, anywhere use, the technology is set to change dramatically the way businesses operate.
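The pay-as-you-use point can be made concrete with a toy comparison. All rates and figures below are invented for illustration, not drawn from any provider's price list:

```python
# Toy comparison of pay-as-you-use cloud billing vs an up-front purchase.
# All figures are illustrative assumptions.

def cloud_cost(hours_used, rate_per_hour):
    # No capital outlay: you pay only for metered usage.
    return hours_used * rate_per_hour

def on_premise_cost(capex, monthly_opex, months):
    # Hardware is paid for up front, regardless of utilisation.
    return capex + monthly_opex * months

# A lightly used workload: 200 compute-hours/month for a year at $0.50/hour.
cloud = cloud_cost(200 * 12, 0.50)          # 1200.0
onprem = on_premise_cost(10_000, 100, 12)   # 11200
```

The gap narrows as utilisation rises, which is exactly why the model suits workloads that scale up and down on demand.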

Despite the positive commentary surrounding ‘the cloud’ and its alleged benefits, businesses have voiced doubts about investing in cloud services, and cite security, interoperability and trust as their major concerns. Contrasting information from those who promise remarkable financial savings, to those who highlight security and privacy ‘risks’, emphasises the need for cloud service providers to be clear and unambiguous on what services they deliver, including how, and from where.

It also supports the case for a greater consistency of understanding of cloud computing with clear guidelines and measurements. It is therefore necessary to implement a Code of Practice, to standardise information provision, and certify organisations that offer a high standard of cloud computing services. This will reassure businesses that ‘the cloud’ can be a significant aid and a worthwhile investment.

It is clear that barriers, such as the lack of trust towards cloud computing, must be overcome by service providers in order for the industry to thrive. As it stands, the cloud industry lacks transparency and structure, which is why there is a lack of trust. A Code of Practice is necessary to engender the trust required by businesses and to provide clarity to potential end users. If cloud service providers follow regulations within a Code of Practice, businesses will trust and have confidence in the services they are investing in.

Service providers should be accountable for their operational practices, and in particular they should live up to any public claims that they make about their service on their websites or in promotional materials. Only by striving to gain the trust and respect of end users can we expect more widespread adoption of the cloud.

Until then, industry bodies must continue to find new ways of encouraging trust in the cloud computing sector so that end users’ endorsement of the industry will pave the way for similar innovative technologies to prosper in the future.

Filed under CCM (Cloud Content Management), Cloud Computing, Features, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), Private Cloud, Public Cloud, SIP, Virtualization

EFF demands quashing ‘copyright troll’ litigation

Electronic Frontier Foundation

Film companies are abusing the law to pressure defendants to settle

ECM Plus – The Electronic Frontier Foundation, the EFF, has asked an Illinois judge to quash subpoenas issued in predatory lawsuits involving alleged illegal downloading of pornography.

In an amicus brief filed last week, the EFF argued that the adult film companies were abusing the law in order to coerce settlement payments, despite serious problems with the underlying claims. Charles Lee Mudd Jr. of Mudd Law Offices assisted the EFF with the filing of the brief.

The Foundation said that its brief submitted last week was the latest in its efforts to stop so-called ‘copyright trolls’ – content owners and lawyers who team up to extract settlements from thousands of defendants at a time.

Allegedly, tactics employed by these ‘copyright trolls’ include improperly lumping defendants together into one case and filing it in a court far away from most of the accused people’s homes and Internet connections.

EFF said that when these adult film companies file these predatory lawsuits, there is the added pressure of embarrassment associated with pornography. All of these factors can, EFF stated, convince those ensnared in the suits to quickly pay what is being demanded instead of arguing the merits of their case in court.

“Copyright owners have a right to protect their works, but they can’t use shoddy and unfair tactics to do so,” said EFF Intellectual Property Director Corynne McSherry. “We’re asking the court to protect the rights of each and every defendant, instead of allowing these copyright trolls to game the system.”

Last month, a judge in West Virginia blocked an attempt to unmask accused pornography file-sharers in seven predatory lawsuits.

EFF said that, closely following the reasoning of an EFF amicus brief, the judge ordered the plaintiffs to file against each defendant individually. In December 2010, a judge in the District of Columbia dismissed hundreds of individuals named in the U.S. Copyright Group troll campaign for lack of personal jurisdiction. EFF said it had filed an amicus brief in that case as well.

“As judges start to force copyright trolls to play by the rules, this kind of mass litigation will no longer be a good business model. That helps protect the rights of Internet users everywhere,” McSherry added.

The full amicus brief can be downloaded from

For more on copyright trolls, visit

Filed under Digital rights management, Industry News, Rights Management

Red-Blue paradigm secrets finally opened up for business

Hypermyopia Blues: a scene as it might be viewed by a person with hereditary eye disease RP

Favourite hues are linked to visual acuity

ECM Plus – The eagle-eyed among us rally to red, and the Mr. Magoos are wooed by blue. So says Diana Derval of the market research firm DervalResearch, whose newest findings are based on neuroendocrinological science. Professor Derval, who says her research shows that visual acuity determines our favorite and least favorite colors, will present these findings at the Association for Research in Vision and Ophthalmology (ARVO) conference in January.

According to DervalResearch, nearsighted people – those with myopia – tend to prefer short-wave colors like blue, whereas farsighted folks (hyperopia) gravitate to long-wave colors such as red.

It’s all a matter of simple physics, Professor Derval explains. Each color refracts differently; in other words, colors hit different places on the retina according to their wavelength. Short-wave colors such as blue and violet target the front, whereas long-wave colors such as red and yellow hit the back. The focal point of the eye is the place where all color waves meet after passing the lens, but the exact location of the focal point varies among individuals. “Because nearsighted people focus light closer to the front of the retina,” explains Professor Derval, “watching blue colors is effortless for them. To perceive red colors, on the other hand, they have to tense the ocular muscles.” Conversely, farsighted people have a shorter eyeball and the focal point is beyond the retina. Looking at red is easy on their eyes, whereas gazing at blue requires that they tense the ocular muscles. People tend to gravitate towards the colors that relax them, Professor Derval says.

DervalResearch conducted the research on men and women of various ethnicities – Chinese, Caucasian, African, and Middle-Eastern. Subjects reported their lower-order aberrations (nearsightedness or farsightedness), and then declared which color they preferred, which color they found relaxing and which color they found irritating. The reported colors were classified by their wavelength in nanometers (nm). Professor Derval found the correlation between visual acuity and color preference to be slightly stronger in women than in men.
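The classification described above can be sketched in a few lines: colours recorded by wavelength in nanometres, with short-wave colours predicted to suit nearsighted (myopic) viewers and long-wave colours farsighted (hyperopic) ones. The wavelength values and the 550 nm cut-off below are illustrative assumptions, not figures from DervalResearch.

```python
# Hedged sketch of the wavelength-based colour classification described in
# the article. Wavelengths (in nm) and the 550 nm cut-off are assumptions
# for illustration, not DervalResearch's figures.
COLOUR_NM = {"violet": 420, "blue": 470, "green": 530, "yellow": 580, "red": 660}

def predicted_preference(colour):
    """Map a colour to the refractive group the article says favours it."""
    wavelength = COLOUR_NM[colour]
    # Short-wave light focuses nearer the front of the eye, so it is
    # (per the article) effortless viewing for myopic eyes.
    return "nearsighted" if wavelength < 550 else "farsighted"
```

On this toy model, blue maps to the nearsighted group and red to the farsighted group, matching the article's claim.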

Patrick Jansen, owner of the Optics shop New Optics in Belgium, and chairman of the Carl Zeiss Academy Belgium, decided to put this scientific approach to the test when designing a recent advertising campaign. New Optics sent out 1,000 invitation cards advertising a special offer. One side of the card was printed on a blue background with the blurb, “If you choose this color side you must be nearsighted; let’s talk about it the next time you visit.”

The reverse was printed on a red background with, “If you choose this color side you must be farsighted; let’s talk about it the next time you visit.” Jansen says that 100 people visited the shop with their invitation, and most nearsighted people did indeed prefer the blue side, saying that blue had been their favorite color since their childhood. Most farsighted people preferred red.

DervalResearch is riding the wave of a brave new trend in “neuromarketing,” which combines cutting-edge neuroscience with marketing research. And there’s a lot more to the research than simply determining people’s color preferences based on their visual aberrations. DervalResearch’s findings also target the other four senses: taste, touch, smell, and hearing. Most intriguingly, there is a hormonal connection to all of this sensory research; that’s the “endocrinological” part. “Consumers are unique individuals but they are also predictable,” says Professor Derval. “Their preferences and behavior are directly linked to their biological and sensory perceptions. And these perceptions are greatly due to the influence of prenatal hormones.”

Drawing upon thousands of measurements in over 25 countries, DervalResearch developed a powerful and predictive biological segmentation tool, the Hormonal Quotient® (HQ). “We have discovered that people’s perception of products and services – via their taste buds, hair cells in the inner ear, rod and cone cells in the eyes, and skin sensors – is linked to their Hormonal Quotient,” she says. “Knowing consumers’ HQ makes it possible to predict not only their favorite colors, but also their preferred tastes, smells, shapes, textures, and sounds.”

DervalResearch’s HQ tool was developed by studying over 50 target groups, including top executives, housewives, entrepreneurs, purchasing managers, and opinion leaders. From this Derval was able to determine eight different Hormonal Quotient® profiles. Applied properly, Professor Derval says, these profiles will allow firms to design and deliver the right consumer experience across their markets in a very cost-effective way, predicting consumers’ sensory perceptions, purchasing behavior, and product preferences based on their biological profiles. “Companies no longer need to conduct traditional, recurrent, and costly surveys,” she says. “They just have to identify the profile and Hormonal Quotient® (HQ) of their consumers once.”

A wide range of industries – food and beverage, electronics, luxury items, fashion, cosmetics, automotive, pharmaceuticals, advertising, leisure, and tourism, to name but a few – could benefit from this research.

Besides conducting marketing research for global brands, Diana Derval is also the author of a book, “The Right Sensory Mix: Targeting Consumer Product Development Scientifically” (Springer), based on her company’s research. The book explains how to understand and predict customer preferences and offers tools for tailoring the sensory mix of color, shape, taste, smell, texture, and sound. It includes case studies from top brands including Red Bull, Sofitel, Häagen-Dazs, Björn Borg, and Nintendo. Professor Derval intends her book to help advance her firm’s mission to build a bridge between scientific knowledge and business. Science, she notes, has most often been used to explain phenomena, but it has been under-utilized to understand and predict consumer preferences.

Markus Kohler, Director of Packaging at Philip Morris International, endorses Derval’s approach – and her book – saying, “Professor Diana Derval breaks ‘conventional’ consumer insights with a new, scientific approach, producing unexpected strategies for predicting consumer behaviors and new ways of identifying unexplored, profitable market segments.”

But this research is not all about luring consumers to spend more of their hard-earned money.

Besides enabling marketers to fine-tune their approach and deliver the optimal sensory mix to specific market segments, Professor Derval says her work has many other useful applications. “This research will make it easier to adapt medical, public, and private services to individuals who are sensitive to certain colors,” she added.

Filed under Analysis, Digital Media Management (DMM), Marketing asset management, Marketing operations management, Product Information Management

e-Spirit goes cloud with novel SaaS web content platform

Content management sniffs SaaS and cloud apps

FirstSpirit launches AppCenter for integration of web applications into the content management system, with a focus on workflow optimisation for editors

ECM Plus – e-Spirit’s new AppCenter for the FirstSpirit content management system is, the company says, the only solution on the market supporting the straightforward interface integration of virtually all web applications, whether from cloud and SaaS environments or from the desktop, directly into the CMS.

According to the firm, this results in a whole new experience for online editors, since applications that were previously used separately are now available right in FirstSpirit with the accustomed interface and functionality and can be used directly in the CMS.

Editors no longer have to keep switching between many different programs in order to transfer content from other applications to the CMS or to conduct information research. This makes the editing processes faster and more efficient – with thousands of different applications.

e-Spirit said AppCenter provides infrastructure for the integration of all types of web applications, making the daily work of editors easier. This new FirstSpirit functionality allows editors to use key programs they are accustomed to directly in the FirstSpirit JavaClient, rather than outside the CMS as was previously the case.

Online image databases, image editing, video and web analysis services are examples of its use. The programs are opened using a simple one-click request in the CMS. Thanks to seamless integration with the applications, data are displayed in the CMS and can be used, edited and subsequently published on a website, intranet or any other channel. Text, images, audio and video files, road maps, product information etc. can be comfortably processed in the FirstSpirit editor interface regardless of the source and with media continuity. Prior to deployment, the layout in various output channels can be verified in the WYSIWYG preview.

“The FirstSpirit AppCenter opens up new horizons to companies for the optimization of their content management processes” commented Jörn Bodemann, CEO of e-Spirit. “It represents the continuation of our best-of-breed strategy to the Cloud: We allow companies to seamlessly integrate the applications they need into FirstSpirit so that editors can simply continue using their favorite programs in the CMS. Not only does this result in enhanced usability, it also improves the efficiency of editing processes and thereby leads to measurable savings for companies.”

AppCenter is based on the powerful web browser integration already established in the FirstSpirit JavaClient. The AppCenter API is suitable for integration into the editing interface of existing applications, new developments, customer-specific programmes and third-party software based on different technology platforms, including web applications, platform-independent Java applications, operating-system-independent native applications like Mozilla Firefox, and complex RIA applications such as Flash, AIR and Silverlight.

In combination with cloud and SaaS services, e-Spirit said that FirstSpirit AppCenter also supports the integration of complex applications that cannot be operated in-house due to extensive hardware requirements, maintenance and licensing costs such as with video transcoding, the company said.

Filed under CCM (Cloud Content Management), Cloud Computing, Content management system (CMS), SaaS (Software-as-a-Service), Vendor News, Web Content Management

Private cloud push as Platform proffers ISF2.1

Duke, the Java Mascot

Platform Computing announces industry's most complete application-centric software for private cloud management


ECM Plus – Platform Computing has just taken the wraps off its new ISF 2.1 software for private clouds.

According to the firm, the software can run multi-tier applications as well as providing IaaS infrastructure and PaaS platform middleware as services.

ISF 2.1 also offers what the company calls a 'single cloud pane' for cloud administrators and delegates, with support for Hadoop, JBoss, Tomcat and WebSphere. Version 2.1 also has policy-driven automation for high availability across multiple data centres.

“Adopting an open cloud management solution that supports flexibility and choice is critical,” opined Jay Muelhoefer, vice president of enterprise marketing for Platform Computing. “ISF enables companies to benefit from the cloud, not stitch it together across multiple vendors like a systems integrator.”

Platform said ISF 2.1 automates delivery of complex enterprise infrastructure and production applications across heterogeneous virtual, physical and public cloud resources.

Filed under CCM (Cloud Content Management), Cloud Computing, Data centres, High Availability, IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), Private Cloud, Vendor News

Cloud to smash through into outsourcing in 2011

Cloud computing. Image: Ivan Walsh

Report says cloud will dominate market in 2011


ECM Plus – Adoption of cloud-based services is set to accelerate as buyers are developing business cases and piloting projects and devising strategies to migrate select legacy processes to lower-cost models, according to new research by EquaTerra.

It said that the outsourcing market was now becoming increasingly fragmented: EquaTerra's survey found that the demand for outsourcing and third-party services will be divided up differently as the industry continues to evolve.

Apparently, buyers are doing multiple small deals in a range of sub-process areas, outsourcing new types of work and experimenting with a range of cloud-based offerings, the survey revealed.

“Buying patterns are changing,” said Stan Lepeak, managing director of global research for EquaTerra. “Throughout 2010 we characterized slower growth in traditional IT and business process outsourcing as cyclical and attributable to cautiousness associated with a severe recession. It’s increasingly apparent, however, that what’s going on is more systemic.”

Some of the survey's findings reveal that demand for conventional business process outsourcing and ITO is now 'sluggish': 48 percent of respondents said demand in the fourth quarter was down, three percent quarter-on-quarter and five percent year-on-year, while 51 percent said new deal pipeline growth for the quarter was also down, 13 percent quarter-on-quarter and 24 percent year-on-year, further evidence of a systemic change in buying patterns rather than cyclical fluctuation.

According to EquaTerra, cloud was the ‘next logical step’ in the evolution of outsourcing. Cloud computing was redefining the concept by offering “as-a-service” capabilities ranging from infrastructure to platform to applications.

EquaTerra said that such offerings were particularly attractive to cautious buyers who welcome the opportunity to shift operating expenses to capital expenses, peg price to usage, lower overall costs, and to scale up or down as needed. Cloud-based services are also the logical next step for organisations.

The ‘As-a-Service’ categories were expected to experience significant uptake in 2011. Desktop applications were a cost-effective alternative to licensed/installed desktop software, such as documents and spreadsheets, project collaboration, short messaging, email and calendar.

Infrastructure-as-a-Service, it stated, includes integrated technologies that work together seamlessly to enable fast, flexible delivery of both IT infrastructure and managed services: certified data centres, secure storage, disaster recovery, and development and testing.

Added Lepeak: “Many firms in emerging markets are forgoing enterprise systems. They start up using standardized cloud-based systems and their operating costs are one-tenth of the legacy model.”

Filed under Analysis, CCM (Cloud Content Management), Cloud Computing, Data centres, Data storage, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), ILM (Information Lifecycle Management), Information Management, PaaS (Platform-as-a-Service), Private Cloud, Public Cloud, SaaS (Software-as-a-Service), SIP, Virtualization

FEATURE: Clouds do have silver linings

Visualisation of the various routes through a portion of the Internet

How can we expect users to trust the cloud, until it has really been put to the test? Well, it has been, and it works.

Considering the business world's dedication to efficiency and minimizing expenditure, the personal computer revolution of the 1980s could seem pretty perverse to later generations. Why fill the office with PCs loaded with identical software rather than centralise on one mainframe with simpler, cheaper workstations on the desk?

And yet the PC survived well into the age of the Internet, when it first became possible to deliver all software as a service from a central source. The idea of “Software as a Service” was good, but pioneering attempts failed, simply because broadband access was not good enough to support the service. With today’s widespread broadband it has become a practical proposition.

It’s now called “Cloud Computing” because the actual processing takes place at some unknown location, or in a dispersed virtual machine, across the Internet cloud. And it only becomes practical when Internet access is fast enough not to frustrate a user accustomed to the speed and responsiveness of on-board software. Similarly, the success of a virtual data centre depends on network links fast enough to preserve the illusion of a single hardware server.

We do now have networks and access technologies fast enough to meet these challenges, but many organisations are held back because they do not have the confidence to engage with the Cloud.

Knowing what we do about the determination and skill of cyber-criminals, how can we secure a system as amorphous and connected as the Cloud? And, after decades of experience in which enthusiastic technology-advocates have promoted systems too complex to be reliable, why should the public now put its trust in cloud computing?

The answer would be to find some way to test these shapeless and dynamic virtual systems with the same thoroughness and accountability as testing a single static piece of hardware. That is asking a lot, but it has been achieved – according to a recent report from European test specialists Broadband Testing.

The performance challenge
Cloud computing potentially offers all the benefits of a centralised service – pay for what you actually use, professional maintenance of all software, a single contact and contract for any number of applications, processing on state-of-the-art hardware – but it has to match the speed, responsiveness and quality of experience of local software if the service is to be accepted.

So how does the provider ensure that level of service will be maintained under a whole range of real world operating conditions including attempted cyber attacks? The answer must lie in exhaustive testing.

But there is a fundamental problem in testing any virtual system, in that it is not tied to specific hardware. The processing for a virtual switch or virtual server is likely to be allocated dynamically to make optimal use of available resources. Test it now, and it may pass every test, but test it again and the same virtual device may be running in a different server and there could be a different response to unexpected stress conditions.

This is what worries the customer – is it really possible to apply definitive testing to something as formless as a virtual system? Broadband Testing’s report, Secure Virtual Data Center Testing (September 2010), provides the answer.

“Can we trust the cloud? The answer now is ‘yes’” according to Steve Broadhead, founder and director, Broadband Testing. “Virtual Security works in theory but, until there was a way to test it thoroughly under realistic conditions, solution vendors have had a hard time convincing their customers. Without Spirent we could not have done this – the testing proved not only highly rigorous, but also quite simple to operate.”

Maintaining the application library
Whether the central processing runs on a physical, virtual or cloud server, it needs to hold a large amount of application software to satisfy the client base, and that software needs to be maintained with every version upgrade and bug fix as soon as they become available. It’s a complex task, and it is increasingly automated to keep pace with development. There must be a central library keeping the latest versions and patches for each application package, and some mechanism for deploying these across the servers without disrupting service delivery.
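The library-plus-deployment mechanism described above can be sketched in miniature. In this hypothetical Python sketch (the application names, version numbers and server pool are all invented for illustration), a central library records the latest version of each package, and a rolling update brings one server at a time into line so that the rest of the pool keeps delivering service:

```python
# Hypothetical data: a central library of latest versions, and a pool of
# virtual servers that may lag behind it.
library = {"crm": "2.4.1", "docs": "1.9.0"}
servers = {
    "vm-a": {"crm": "2.4.0", "docs": "1.9.0"},
    "vm-b": {"crm": "2.4.1", "docs": "1.8.5"},
}

def rolling_update(library, servers):
    """Bring each server up to the library's versions one at a time,
    so the rest of the pool keeps serving while one is patched."""
    for name, installed in servers.items():
        # Find packages where this server lags the central library.
        stale = {app: v for app, v in library.items() if installed.get(app) != v}
        for app, version in stale.items():
            installed[app] = version  # stand-in for the real deploy/patch step
        if stale:
            print(f"{name}: updated {sorted(stale)}")

rolling_update(library, servers)
```

Real deployment systems must also handle failed upgrades and in-flight sessions; the point here is only the separation between the central version record and the per-server state.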

At this stage the service provider is in the hands of the application developer – the service to the end user can only be as good as the latest version on the server. We hope the application developer has done a good job and produced a reliable, bug-free product, but the service provider’s reputation hangs on that hope until the software has been thoroughly tested on the provider’s own system.

In the case of a physical server, we do not expect any problem because the application is likely to have been developed and pre-tested on a similar server. But virtualisation and cloud computing add many layers of complexity to the process. The speed of the storage network becomes a significant factor if the application makes multiple data requests per second, and that is just one of many traffic issues in a virtual server.

Faced with such complexity, predicting performance becomes increasingly difficult and the only answer is to test it thoroughly under realistic conditions.

One cannot expect clients to play the role of guinea pigs, so usage needs to be simulated on the network. It is critical to gauge the total impact of software additions, moves and changes, as well as network or data center changes. Every change must be tested to prevent mission-critical business applications from grinding to a halt.
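The shape of such a simulation can be illustrated with a toy Python sketch (all figures are hypothetical; a dedicated traffic generator works at protocol level and at vastly larger scale). Simulated users run concurrently against a stand-in service call, and latency statistics are collected rather than inflicted on real clients:

```python
import random
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def service_call() -> None:
    """Stand-in for a request to the system under test."""
    time.sleep(random.uniform(0.001, 0.005))  # mimic variable processing time

def load_test(users: int, requests_per_user: int) -> dict:
    """Run simulated users concurrently and summarise observed latency."""
    latencies = []
    def user_session():
        for _ in range(requests_per_user):
            start = time.perf_counter()
            service_call()
            latencies.append(time.perf_counter() - start)
    with ThreadPoolExecutor(max_workers=users) as pool:
        for _ in range(users):
            pool.submit(user_session)
    # The pool's context exit waits for every session to finish.
    return {
        "requests": len(latencies),
        "mean_ms": statistics.mean(latencies) * 1000,
        "p95_ms": statistics.quantiles(latencies, n=20)[-1] * 1000,
    }

print(load_test(users=20, requests_per_user=10))
```

Running the same test before and after a configuration change gives a comparable baseline, which is exactly the point made above: every change is exercised before it can stall a production application.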

Application testing in a virtual environment
There are two aspects to testing applications in a virtual environment. Firstly functional testing, to make sure the installed application works and delivers the service it was designed to provide, and then volume testing under load.

The first relates closely to the design of the virtual system – although more complex, the virtual server is designed to model a hardware server and any failures in the design should become apparent early on. Later functional testing of new deployments is just a wise precaution in that case.

Load testing is an altogether different matter, because it concerns the impact of unpredictable traffic conditions on a known system.

To give a crude analogy: one could clear the streets of London of all traffic, pedestrians, traffic controls and road works then invite Michael Schumacher to race from the City of London to Heathrow airport in less than 30 minutes. But put back the everyday traffic, speed restrictions, traffic lights and road works and not only will the journey take much longer, it will also become highly unpredictable – one day it might take less than an hour, another day over two hours to make the same journey.

In a virtual system, and even more so in the cloud, there can be unusual surges of traffic leading to unexpected consequences. Applications that perform faultlessly for ten or a hundred users may not work so well for a hundred thousand users – quite apart from other outside factors and attacks that can heavily impact Internet performance.

So the service provider cannot offer any realistic service level agreement to the clients without testing each application under volume loading and simulated realistic traffic conditions.

The Spirent test solution
Network performance and reliability have always mattered, but virtualisation makes these factors critical. Rigorous testing is needed at every stage in deploying a virtual system. During the design and implementation phases it is needed to inform buying decisions and to ensure compliance. Then, during operation, it is equally important to monitor for performance degradation and anticipate bottlenecks, as well as to ensure that applications still work under load, as suggested above.

But large data centers and cloud computing pose particular problems because of their sheer scale. Spirent TestCenter™ is the company’s flagship test platform for testing such complex networks, and it meets the need for scalability in a rack system supporting large numbers of test cards, to scale up to 4.8 terabits in a single rack.

As a modular system, TestCenter can be adapted to any number of test scenarios. In particular, Spirent Virtual is a software module that specifically addresses the challenge of testing in a virtual environment. It was named the 2010 Best of Interop winner in the Performance Optimization category, on the strength of its innovative approach to testing the performance, availability, security and scalability of virtualized network appliances as well as cloud-based applications across public, private and hybrid cloud environments.

Spirent Virtual provides unsurpassed visibility into the entire data center infrastructure. It is designed specifically to meet the needs of a complex environment where as many as 64 virtual servers, including a virtual switch with as many virtual ports, may reside on a single physical server and switch access port. With Spirent Virtual in TestCenter, it is not only possible to test application performance holistically under realistic loads and stress conditions, but also to determine precisely which component – virtual or physical – is impacting performance.

To create realistic test conditions, Spirent Virtual software is used in conjunction with devices designed to generate massive volumes of realistic simulated traffic. Spirent Avalanche is such a device. It is designed to replicate real-world traffic conditions by simulating error conditions and realistic user behavior, and by maintaining over one million open connections from distinct IP addresses. By challenging the infrastructure’s ability to stand up to the load and complexity of the real world, it puts application testing in a truly realistic working environment.

The latency issue
Even minute levels of latency can become an issue across a virtual server. So how does one measure such low levels of latency, where the very presence of monitoring devices produces delays that must be compensated for?

Manual compensation is time-consuming, and even impossible in some circumstances, whereas in TestCenter this compensation is automatic and adjusts according to the interface technology and speed.
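In miniature, such compensation amounts to measuring the harness itself and subtracting that overhead from every reading. The Python sketch below shows the idea only (the workload and trial counts are invented; TestCenter does the equivalent in hardware, adjusted per interface technology and speed):

```python
import time

def measure_ns(fn, trials: int = 1000) -> int:
    """Median wall-clock duration of fn, in nanoseconds."""
    samples = []
    for _ in range(trials):
        start = time.perf_counter_ns()
        fn()
        samples.append(time.perf_counter_ns() - start)
    samples.sort()
    return samples[len(samples) // 2]

def noop():
    """Empty probe: measuring it reveals the harness's own overhead."""
    pass

def path_under_test():
    sum(range(100))  # stand-in for the latency path being measured

overhead = measure_ns(noop)           # delay introduced by measuring at all
raw = measure_ns(path_under_test)
compensated = max(raw - overhead, 0)  # never report a negative latency
print(f"raw={raw}ns overhead={overhead}ns compensated={compensated}ns")
```

The subtraction is the whole trick: without it, the monitoring fixture’s own delay is silently attributed to the device under test.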

The acceptability of cloud computing depends upon delivering a quality of experience as good as local processing, but without all the overheads of licensing and software version management. Quality of experience is a subtle blend of many factors such as latency, jitter and packet loss, and all of these can be precisely monitored on TestCenter under wide-ranging traffic loads, both running pre-programmed tests automatically and allowing operator intervention via a simple user interface.

And the question of security
As well as delivering a good quality of user experience, the cloud computing provider needs to allay clients’ fears about security in the Cloud. A hacker who accesses a soft switch can re-route traffic at will, so virtualisation creates a potentially severe vulnerability across the whole business – and across the social infrastructure in the case of cloud computing. Again, the growth in virtualisation demands a corresponding increase in both prior and routine testing.

Here there is not only a need to test under unusual load conditions – those are the times when attacks are most likely to succeed – but also a need to simulate a whole range of attack scenarios. The application must still work when tested with the network security devices themselves operating under attack.

Spirent’s system delivers comprehensive, accurate emulation of end-user traffic and unexpected attack traffic, even at high load. Simply put, Spirent can model user behavior while scaling to full Internet levels. This “no compromise” approach is important, since measuring the impact on the user and the network while loading the application with real-world traffic patterns helps identify, isolate and resolve problems before the provider commits them to service agreements and puts them on-line.

Putting the test to the test
Broadband Testing set out to determine whether it is possible to secure a virtual environment, knowing that their first problem was to create a rigorous and repeatable test process.

The security system under test would be the TippingPoint IPS-based Secure Virtualization Framework (SVF), and the test bed itself would consist of both the physical and virtual versions of Spirent’s Avalanche traffic generator. These were to be combined with a typical network environment including both physical and virtual elements in order to replicate a truly representative hybrid data center environment.

Using Spirent’s pioneering cloud computing testing solutions with performance, availability, security and scalability (PASS) methodology, Broadband Testing were able to monitor and test internal and external-to-internal traffic under normal operating and extreme conditions plus a wide range of attack scenarios. All the threats in the HP TippingPoint signature base were successfully blocked, and the only ones that passed were those not yet added to the then-current database.

David Hill, Spirent’s vice president for EMEA, commented on the Broadband Testing report: “The key takeaway was that testing with Spirent stressed the capability of the security solution right to its limits. People assume that security is the final objective, when what is really needed is a precise way to quantify and tailor the level of security in a complex system. ‘Tried and tested’ means more than any amount of theoretical argument in this case.”

“The economic benefits of cloud computing are overwhelming, but so are the security concerns of network operators and their customers. This independent report breaks that deadlock, as reliable testing now makes it easy for system vendors to mitigate the risks of migrating to the cloud, while optimizing resource utilization under an exhaustive range of real-world operating and threat scenarios.”

Cloud computing offers many advantages to the user, but the provider must assure the client that the service will consistently deliver on its promises. Fail, and users will vote with their feet.

The only way to ensure success is to offer a tried and tested service. Broadband Testing has now shown that this can be done and proven. Most significantly for practical purposes, they found that “the testing proved not only highly rigorous, but also quite simple to set up and run.”

Leave a comment

Filed under CCM (Cloud Content Management), Cloud Computing, Data centres, Data storage, Features, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), Private Cloud, Public Cloud, SaaS (Software-as-a-Service), SIP, Virtualization

Cloud Expo Europe 2011: Investigating the cloud

Diagram showing three main types of cloud comp...

Image via Wikipedia

By Maggie Meer, Event Director of Cloud Expo Europe 2011

Software-as-a-Service (SaaS) and Cloud Computing are hot topics right now as they enable companies to better manage their IT investments and reduce costs.

Analysts predict that 20 percent of businesses will own no IT assets by 2012. But how do businesses know whether Cloud Computing is right for them? Should they adopt an on-demand type of environment hosted by software vendors or their partners; or go for a public cloud environment such as those offered by Azure or Amazon EC2 or implement a private cloud? Is Cloud Computing a revolution or actually nothing new? How will it change the IT landscape and what technological developments can we expect in the future?

Most importantly, how will cloud computing save money and ensure efficiency all round?

All these questions and many more will be addressed at Cloud Expo Europe which is to take place on 2nd and 3rd February in London at The Barbican.

The event will play host to top providers of cloud computing, virtualisation, SaaS, PaaS, IaaS, storage management and next-generation datacentre technology, such as Amazon, Carrenza, Red Hat, Terremark, Rackspace, OnApp, Ping Identity, Xiotech, Zeus and others.

Respected luminaries and key industry leaders will speak at the conference which features three tracks: Business, Solutions and Security & Storage. The conference will provide a balanced mix of real-life case studies, presentations from third party experts and seminars to investigate the issues and opportunities of Cloud Computing.

The public sector will be particularly well represented. A keynote by Joan Miller, Head of Parliamentary ICT, will address how Parliament is investigating the options and criteria to be used to assess which version of the Cloud they will back. Hillingdon Council’s Roger Bearpark will talk about the Council’s Cloud Initiative and Transport for London’s Emer Coleman will outline TfL’s recent move to the Cloud for the live feeds of information on the Tube.

Security continues to be a key concern for IT and business decision-makers considering cloud alternatives. What are the key differences between the traditional computing models and virtualized computing models, and what does this mean for security strategy? Large scale cloud deployments can result in large amounts of data in the Cloud.

Putting that data on low-cost storage results in unacceptable performance, but putting all the data on high-performance storage results in unacceptable costs. How can this issue be resolved and how can you ensure that the data is secure? The Security and Storage track will examine all these issues in depth as well as present the different technologies on offer.

Finally, several respected industry experts will reveal research on the types of cloud computing environments being used, considered or ignored, what the drivers and expectations are, which countries are leading or lagging in adoption, and what the trends are for the future.

2011 is the year of the Cloud and Cloud Expo Europe 2011 aims to provide visitors with the latest information, resources, ideas and examples which they can apply to their own working environment to maximise performance, improve scalability and minimise outlay whatever their business and company size.

1 Comment

Filed under Cloud Computing, Data centres, Data storage, Hybrid Cloud, IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), Private Cloud, Public Cloud, SaaS (Software-as-a-Service), SIP, Virtualization

Viral China video content set to pass 50m hits

CONTENT CLASSICS: SpongeBob SquarePants’ hilarious skit on China highlights humorous work ethic and ‘Google’ influence

1 Comment

Filed under Collaboration platform, Enterprise Social Software (ESS)

Social media mayhem as monetisation hawks set to swoop


From community collaboration to corporate cash cow?

Brands set to shift from community building to monetisation in social media in 2011


ECM Plus – It was only a matter of time. Having focused on building social communities in the past, many brands are shifting their attention to how they can make money out of these communities, social channels and social activities in 2011, says Punch Communications.

According to its research, launching a Facebook store will enable brands to continue the discussion with followers and fans, while customers can buy and share products with friends and write reviews, Punch stated.

Holly Henstock at Punch added: “Brands have integrated, and continue to integrate, social media channels within campaigns, alongside other channels. However, with some brands now beginning to sell directly via Facebook apps such as the Facebook store, it looks likely that those brands who have invested so heavily in community building will begin to look at how they can make a return on investment. Clearly the main issue is how to do so without undermining the community itself – and it’s certain that some consumers will reject the concept of being sold to via brand pages. Nevertheless, the process is inevitable and looks set to create an interesting dynamic over the coming six to twelve months.”

Leave a comment

Filed under Analysis, Brand asset management, Content Monetization, Customer Relations Management (CRM), Enterprise Social Software (ESS), Marketing asset management, Marketing operations management

Boxing clever with cloud content management

Image representing Kimber Lockhart as depicted...

Image via CrunchBase

Thinking outside of it sees firm propel content into enterprise territory with SAS 70 and beyond


ECM Plus – When it comes to cloud content management, few players have established themselves as firmly in the burgeoning space as Box. That said, even such players are not prone to complacency, perceiving rapid reinvention and re-engineering as part and parcel of the business model.

Now Box has unveiled an entirely new version of its cloud content management offering, featuring enhanced collaboration and real-time capabilities along with a simplified user interface, the company said.

According to the firm, the new Box will be rolled out to the company’s 5 million users over the course of the next month, creating a more intuitive, interactive experience for end users, while also enabling future customisation for enterprise IT departments.

“2011 is the year that the cloud will tip for content management in the enterprise, and at Box we’re committed to leading that movement by bringing simplicity to end users and IT departments alike” opined Aaron Levie, co-founder and CEO of Box. “With the new Box, we’ve overhauled the design to make sure that the user experience is more intuitive and elegant than ever before. This will continue to be our primary competitive advantage as organizations look to move off of feature-bloated and under-used legacy systems like Microsoft SharePoint.”

Box said users will have the option of switching between the old and new versions until the end of February.

Significant changes include a content-centric redesign. The new Box provides a richer content experience, with 30% more screen space devoted to viewing files directly within the browser.

It also boasts real-time activity updates: using Tornado technology, the new version of Box brings real-time updates and in-line uploading to users, giving them access to the latest content and collaboration activities from anywhere on Box.

More ways to collaborate: a new discussions tab at the folder level will facilitate conversations around projects rather than individual files, providing enhanced collaboration capabilities.

A new Box Apps Marketplace will also offer easy access to partner applications from a new Apps tab, promoting adoption of Box’s ecosystem of 150+ integrated applications.

Box said it had seen ‘significant enterprise traction’ in 2010, including adoption in 73% of the Fortune 500 and 3.4X enterprise revenue growth over 2009, due to deals with organisations such as Hawaiian Airlines, T-Mobile and Discovery Networks.

The company said it had also secured key integration partnerships with the likes of NetSuite and SugarCRM, and has also launched mobile applications.

The firm added that it had also achieved a major security milestone with SAS 70 Type II certification.

Box added that it had also pushed 50 platform updates in 2010 including transformative integrated content viewing following the company’s acquisition of Increo Solutions. The new version of Box will enable the company’s engineering team to develop and implement new features even more efficiently in 2011.

“We asked ourselves what would make Box better if we could literally start from scratch… and then we did just that,” said Kimber Lockhart, Engineering Manager and former CEO of Increo Solutions. “With the new version of Box, we’ve removed all the clutter in both the implementation and design to give users an incredibly streamlined, intuitive experience, and in doing so we’ve laid the groundwork to bring new innovations to market faster than ever before.”

Leave a comment

Filed under CCM (Cloud Content Management), Cloud Computing, Content Management, Content management system (CMS), Data storage, Enterprise Content Management, Vendor News

Much ado about something in video content monetization


Much ado about something...

AdoTube partners with FreeWheel for monetization of premium video content


ECM Plus – In-video platform specialist AdoTube has just been named a certified partner by video monetization technology company FreeWheel, to stream ad formats and premium ads in conjunction with FreeWheel’s monetization rights management.

AdoTube helps publishers use their own sales teams to sell AdoTube’s multiple innovative in-stream ad formats, while also enabling them to opt-in to AdoTube’s premium ad network.

Certified Partner status further ensures that AdoTube has met a substantial set of criteria that allows FreeWheel to vouch for AdoTube’s technology and solutions. FreeWheel requires companies to fully adhere to industry standards set forth by the Interactive Advertising Bureau, as well as an additional set of FreeWheel-specified criteria.

“This program is a way for us to work with our valued partners like AdoTube to continue to strive for the best levels of support, integration, and operational excellence on behalf of our mutual clients,” said Brent Horowitz, vice president of business development at FreeWheel. “The result should be higher standards and improved operations industry-wide.”

AdoTube has a systems integration in place with FreeWheel giving shared clients access to its robust advertising platform as a way to help monetize inventory and power in-video advertising.

“FreeWheel Certified Partner status highlights our continued commitment to creating the best possible advertising platform for publishers, validating our goal to help clients make the most of their in-video inventory” added Craig Aron, VP Business Development at AdoTube.

Leave a comment

Filed under Content Monetization, Digital rights management, Industry News, Rights Management, Streaming media

Web content market bifurcating into ‘features arms race’

Products and platforms present different risks for content management technology customers


ECM Plus – The web content management market is dividing into two distinct segments – platforms and products – which is presenting different risks and opportunities for customers, according to new research by independent analyst firm Real Story Group.

In their latest findings, RSG said that platform-orientated CMS tools now needed more development time and resources, but yielded more power.

Meanwhile, product-orientated offerings can be launched more quickly, RSG’s research stated, but lack long-term flexibility.

The researchers found that platform-orientated WCM vendors are now engaging in an escalating “features arms race” to adopt more new media functionality and maintain higher price points, against substantial pressure from an increasingly impressive set of product-orientated WCM vendors (see below).

“There are some key exceptions here” said founder Tony Byrne. “The largest ‘enterprise’ vendors continue to be distracted by broader Document & Records Management ambitions and their CMS tools tend to exhibit slow innovation and update cycles.”

Source: The Real Story Group

The ‘Web Content Management 2011 Cross-Check’ charts represent four key dimensions – size, focus, vendor evolution, and product development – that customers can use to supplement functional and cost/value analyses in any major procurement decision.

For product-orientated WCM vendors, the major themes are now profusion and steady progress. Most of these vendors are doing well in the marketplace, which contributes to the highly fragmented nature of this space.

“With respect to risk, these tools tend to have less turbulent product updates” added analyst Apoorv Durga. “Customers are more likely to face institutional/vendor risk here, as these represent mostly smaller companies or open source projects undergoing transition.”

“Each buying organization must balance its appetite for change and risk against a vendor’s orientation,” noted analyst Adriaan Bloem. “What represents a high risk to one customer may represent a chance for innovation and a competitive advantage to another. Alternatively, what represents a staid and slow-moving product set to one customer may represent stability and low risk to another.”

Leave a comment

Filed under Analysis, Web Content Management

Sitecore releases CMS 6.4 to boost marketers’ muscle


Image via CrunchBase

Sitecore CMS 6.4 enables design, development and marketing to create better campaigns and web experiences with fewer resources

ECM Plus – Sitecore has released a new version of its CMS that gives marketers, and their diverse support teams, more capability to create and monetise campaigns and web experiences.

“No matter how powerful a CMS solution, if cross-functional teams can’t intuitively use the system, marketing campaign results and Web experiences will suffer” said Darren Guarnaccia, senior vice president product marketing, Sitecore. “That’s why Sitecore has dedicated itself to developing a world-class user interface that is powerful and simple to learn. Based on iterative feedback from the Sitecore community, we made significant strides in usability with features that anticipate how marketers think and work.”

The Sitecore CMS 6.4 authoring interface has new inline editing and page management features which make it easy to alter website content and layout.

Marketers and business users can move and present features such as polls, marketing forms and other content and interactive elements without relying on specialised design and development teams.

The time and resources for creating and publishing Web content and marketing campaigns is significantly reduced.

Sitecore’s browser-based editor lets users author and manage digital interactions from web pages to sophisticated mobile experiences in a simple unified interface.

Editors can work in the browser of their choice, as the latest release includes support for Firefox and other browsers as well as continued support for Internet Explorer.

Leave a comment

Filed under Content Management, Content management system (CMS), Enterprise Content Management, Marketing asset management, Marketing operations management, Vendor News, Web Content Management, Web Experience Management (WEM)

Taming the SharePoint ‘site creation monster’


SharePoint Foundation 2010

SharePoint site provisioning and governance assistant for SharePoint 2010 tames ‘site-creation monster’

ECM Plus – SharePoint Solutions has just released its new SharePoint Site Provisioning and Governance Assistant (SPGA) for SharePoint 2010, an add-on which enables administrators to control the creation, provisioning and approval of SharePoint 2010 sites whilst enabling users to select the type of site they need.

According to the company, SPGA tames the “site-creation monster” as it is helpful for companies and organisations that regularly provide SharePoint 2010 sites and need a way to empower users to create sites while ensuring that the organisation’s SharePoint 2010 standards are adhered to.

‘Users are happy because their requests are handled quickly and efficiently, and administrators are happy because order is maintained and chaos is eliminated,’ said Jeff Cate, president of SharePoint Solutions.

With SPGA, administrators set all the boundaries on the front end, including workflow for approvals. Then, when a user fills out the site request form, he or she sets into motion an automated set of processes that quickly and automatically approve, create and provision the site. Company standards and site uniformity are maintained because they are “baked” into the profile in advance.
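That request-to-provision flow can be sketched abstractly. The Python below is purely illustrative (SPGA’s actual profile fields and workflow are not documented here, so every name is hypothetical), but it shows the division of labour the paragraph describes: standards live in the pre-set profile, and the requester only picks a profile:

```python
# Hypothetical profiles: the administrator "bakes" standards in up front.
PROFILES = {
    "team-site": {"quota_mb": 500, "needs_approval": False},
    "extranet":  {"quota_mb": 2000, "needs_approval": True},
}

def handle_request(profile_name: str, requester: str, approve=None):
    """Provision a site from a pre-set profile, routing through an
    approval step only when the profile demands one."""
    profile = PROFILES[profile_name]
    if profile["needs_approval"]:
        if approve is None:
            return ("pending", requester)   # waiting on the approval workflow
        if not approve:
            return ("rejected", requester)
    # Uniformity is guaranteed: every site gets the profile's settings,
    # not whatever the requester might have chosen.
    return ("provisioned", {"owner": requester, **profile})

status, site = handle_request("team-site", "alice")
print(status, site)
```

Because the boundaries are fixed before any request arrives, approval becomes the exception rather than the bottleneck, which is the “monster-taming” the product claims.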

As both a site-provisioning tool and a governance tool, SPGA allows IT professionals to: maintain the integrity of an organization’s SharePoint 2010 taxonomy; enforce governance policies; “bake” exactly what is wanted into SharePoint 2010 site request profiles; provide users with an easy way to request SharePoint sites or groups of sites; specify different site request profiles for different sites or groups of sites; automate an organization’s approval process for requesting new SharePoint sites; and monitor and track site requests.
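SPGA’s internals are not public, but the gated-provisioning pattern it describes (request form, then approval gate, then a site stamped out from a pre-approved profile) can be sketched in a few lines. Everything below is hypothetical and illustrative, not SPGA’s actual API:

```javascript
// Hypothetical sketch of a gated provisioning flow: a site request must
// pass an approval check before the site is created, and the site's
// settings come from a pre-approved profile rather than the requester,
// so organisational standards are "baked in". Not SPGA's actual API.
function processSiteRequest(request, profile, approve) {
  if (!approve(request)) {
    return { status: 'rejected', site: null };
  }
  // Copy settings from the profile so every provisioned site
  // matches company standards automatically.
  const site = {
    title: request.title,
    owner: request.requester,
    template: profile.template,
    quotaMB: profile.quotaMB,
  };
  return { status: 'provisioned', site };
}
```

A real implementation would persist the request and route it through SharePoint’s own approval workflow rather than a single callback, but the shape of the process is the same.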

Leave a comment

Filed under Enterprise Content Management, SharePoint, Vendor News, Web Content Management

CellStrat’s Mobile App Conclave expo heralds monetization

Solar Eclipse, New Delhi

Image by *Santosh via Flickr

CellStrat Announces Mobile Apps Conclave on January 21, 2011 in New Delhi

ECM Plus – Mobile Apps Conclave, the conference organised by CellStrat, takes place this Friday, 21st January, at the India Habitat Centre in New Delhi, India.

Mobile Apps, App Stores and the Mobile Internet – the new technology phenomenon. An App for this, an App for that. You name the task – there is an App for it. The Mobile Internet has arrived, and arrived big time, riding the bandwagon of Mobile Apps. The iPhone revolutionized the concept of App Stores, and now all major tech behemoths have picked up on that theme. Whether it is business, marketing, entertainment, customer care or literally anything else, everyone either has an App for it or is working on creating one.

App Economy is the new Economy. People, Institutions, Companies, Media – all are consuming Apps, developing Apps or orchestrating Apps. Social Networks are getting appified, Businesses need App Marketing, Agencies are using Apps for Audience Engagement, People are living their lives on Apps. It is all about Mobile Apps, App Stores and the new face of the Global Internet – The Mobile Web. The most ubiquitous and personal Digital Channel of all times – the Mobile Device has unleashed the next wave of creative evolution – the Business of Apps and App Stores.

The first keynote will be delivered by Sanjay Swamy, Head of Mobile Strategy at the UIDAI (Unique Identification Authority of India), Government of India. UID is one of the most ambitious initiatives of the Government of India. Sanjay will talk about new opportunities in the Mobile Space related to the UID initiative.

There is a panel on the theme of “Apps Everywhere: Apps as Engagement, Media and Commerce Platforms”. This panel will discuss the upcoming Mobile App revolution, what works and what doesn’t in the App world, and its scope in the Indian context versus abroad. It will bring together brands and vertical industry experts who are leveraging the Mobile Web and Mobile Apps for Service, Engagement and Commerce.

After the first panel, there will be a talk by Google’s Country Manager on how Mobile Apps are influencing the Mobile Advertising streams, creating new Business Models for Brands and Media to ride the bandwagon of Apps and Mobile Web, all the way into customers’ hearts, minds and pocketbooks.

In the afternoon session, there will be a Fireside Chat between Telecom Industry visionary Mr N K Goyal and Vishal Singhal of CellStrat. N K Goyal is the President of CMAI and one of the foremost leaders and evangelists in the Telecom and Wireless verticals in India. This session will discuss the state of the Mobile Industry in India and its drivers of growth.

Later in the afternoon, there will be two more panels, delving into App Monetization, App Marketing and what’s in store for future Mobile Innovation. The Mobile App Strategy session will delve into the complex problem of App Marketing and Monetization strategy. The Mobile Innovation panel will discuss the impact of the Mobile Web, the Rise of Tablets, Broadband Mobility and related advancements in the Mobile world.

The afternoon session will also see a Keynote from a senior executive of Nokia, the top mobile company in India. This session will provide insights into the use of Mobile Apps in Enterprise verticals like Supply Chain and Order Management.

Overall, it promises to be an action-packed event. The Presentation Sharing Partner is authorSTREAM, where all the presentations by CellStrat, event speakers, exhibitors and partners can be found. The Mobile App Partner, Hazel Media, is providing an event app for the conference, while the Mobility Partner, MobiVite, is providing a mobile WAP site. Thus, true to its name, info on Mobile Apps Conclave is universally available across all Mobile platforms.

Nokia Tej and AgileCO are the Gold Sponsors for this conference, which is also supported by partners like CMAI and Indian Angel Network (IAN). Media Partners include the prominent magazine Voice&Data and an online media firm, with Business Wire India as the News Distribution Partner.

Lastly, CellStrat is also organizing an interesting Mobile Solutions Expo where some innovative app companies will be exhibiting their apps and solutions. This Expo promises to dazzle the audience with some interesting apps and solutions in mobile space.

Details of the event are available at

Leave a comment

Filed under Content Delivery, Content Monetization, Industry News, Mobile Analytics, Mobile Apps, Mobile Content

Open Text’s ECM and SharePoint are a DoD cert

Image representing Open Text as depicted in Cr...

Open Text's records management is a DoD cert

OpenText obtains DoD 5015.02-STD certification for ECM and SharePoint


ECM Plus – Components of OpenText’s ECM Suite 2010, together with SharePoint Server 2010, have met U.S. Department of Defense 5015.02-STD certification requirements for records management.

According to Open Text, this encompasses unstructured content including physical, electronic and email records.

The company said it had certified against all versions of the DoD standard since its inception more than a decade ago and had completed the test process more than twenty times on various platforms and versions.

Managed by the Joint Interoperability Test Command (JITC), DoD 5015.02-STD certification involves a rigorous testing process. OpenText had Records Management and its Application Governance and Archiving for SharePoint offerings tested in an environment that included SharePoint 2010 and a full set of MS operating systems, office applications and database servers.

Through the test, OpenText said it met requirements for applying records management policies and securely managing the entire lifecycle of the information.

Leave a comment

Filed under Content Security, Enterprise Content Management, Records & Information Management (RIM), Records Management, Security Content Management (SCM), Vendor News

Tarsin and Media Exchange ink mobile licensing deal

Mobile web standards evolution

Mobile web standards evolution

Firms partner for push into mobility publishing markets


ECMPlus – Media Exchange Group and Tarsin have executed a definitive Capsa Platform Licensing Agreement, giving Media Exchange Group access to the mobile publishing world.

Tarsin, a mobile publisher, provides a widget-like framework to design mobile offerings.

According to the companies, Tarsin’s Capsa platform will provide a versatile framework for brands to design and deliver mobile experiences to the mobile marketplace. Capsa is a mobile content delivery solution based on web standards that is carrier, operating system, and device agnostic.

“The services and social networking solutions that Media Exchange Group are developing for release globally are needed in the market place, and needed by the online and mobile consumer,” said John Osborne, CEO of Tarsin.

“As mobile becomes the priority for 2011, we are excited about the opportunities and potential the Capsa platform now provides to Media Exchange Group, a one-stop shop for turnkey multi-platform mobile solutions,” said Baer.

Leave a comment

Filed under Content Delivery, Content Monetization, e-Publishing, Enterprise Social Software (ESS), Industry News, Mobile Apps, Mobile Content, Rights Management, Telecommunications

W3i ups ante in app discovery monetization

Image representing W3i  as depicted in CrunchBase

W3i's ad payments ups ante for monetisation

Monetisation methods multiply as firm takes on ad payment challenge 


ECMPlus – W3i has just launched a new so-called ‘Ad-Funded Payment Platform’ which it claims provides a customisable solution for developers looking for app discovery or monetization.

According to W3i, with over 300,000 apps in the App Store and virtual goods revenue projected to pass $2 billion in the US this year, the Ad-Funded Payment Platform purports to help iOS app developers achieve their discovery and monetization goals.

It said the Ad-Funded Payment Platform could provide customisable offer interfaces for publishers and intelligent reporting for developers. The product is driven by W3i’s proprietary InstallIQ software, optimised on 500 million downloads, the company said.

“The W3i Ad-Funded Payment Platform provides publishers an opportunity to monetize and drive their virtual goods transactions in a branded offer wall experience, while giving advertisers powerful insights into and control over their marketing campaigns through sophisticated campaign dashboards. Users get free apps,” added Andy Johnson, CEO at W3i.

Leave a comment

Filed under Content Delivery, Content Monetization, Enterprise Content Management, Mobile Analytics, Mobile Content, Vendor News

Mobtron touts mobile business website creator

Mobile phone evolution

Size isn't everything

Free site product for business users for mobile advertising, marketing


ECMPlus – Mobile web development platform Mobtron has just launched a new module which enables any enterprise to create its mobile site for free, together with free hosting and backups.

According to the firm, anyone who wishes will also receive professional advice from experienced mobile site developers, free of charge.

Mobtron provides mobile website developers with detailed statistics and access to a range of monetization functions, from mobile advertising network integration to SMS payments.

“Our goal is to make mobile website development simple and accessible to all,” commented Mobtron’s technical development manager Sarunas Davalga. “Users select the desired option, drop it into the site and can immediately see how the site has changed. Thousands of individual users have successfully used the platform already, but with our latest add-on Mobtron has become an excellent tool for companies and businesses.”

Mobtron can be used both by companies wishing to create a representative mobile site for their business and by media agencies looking for ways to offer their customers unique and cost-effective mobile advertising solutions.

According to the company, all sites created using Mobtron are optimized for search engines and adapted for various models of mobile devices, including the most modern smartphones. Sites can also be created or edited using a mobile phone only.

Leave a comment

Filed under Content Delivery, Content Management, Content Monetization, Content Provision & Creation, Digital rights management, e-Publishing, e-Reader, Mobile Content, Rights Management, Vendor News, Web Content Management, Web Experience Management (WEM)

Javascript code cuts off bloggers from the Web

EFF calls for immediate action to defend Tunisian activists against government cyberattacks

Commentary by Eva Galperin

Demonstrations and protests over unemployment and poor living conditions have been ongoing in Tunisia since the beginning of December, but last week the Tunisian government turned up the heat on bloggers, activists, and dissidents by launching a JavaScript injection attack that siphoned off the usernames and passwords of Tunisians logging in to Google, Yahoo, and Facebook.

The Tunisian government has used these stolen credentials to log in to Tunisians’ email and Facebook accounts, presumably downloading their messages, emails, and social graphs for further analysis, and then deleting the accounts entirely. Among the compromised accounts are Facebook pages administered by a reporter with Al-Tariq ad-Jadid, Sofiene Chourabi, video journalist Haythem El Mekki, and activist Lina Ben Khenni.

Unsatisfied with merely quelling online freedom of expression, the Tunisian government has used the information it obtained to locate bloggers and their networks of contacts.

By late last week, the Tunisian government had started arresting and detaining bloggers, including blogger Hamadi Kaloutcha, and cyberactivist Slim Ammamou, who alerted the world to his whereabouts at the Tunisian Ministry of the Interior using Google Latitude.

This weekend, Tunisian citizens began to report on Twitter and in blogs that troops were using live ammunition on unarmed citizens and started communicating with one another to establish the numbers of dead and injured.

Most notably, Tunisians have been posting videos of the protests, including of the dead and wounded, on Facebook, the only video-sharing site not currently blocked by the Tunisian government, which makes access to Facebook especially important for the protest movement.

Because of the Tunisian government’s attacks on citizens’ login credentials, Tunisians should take the following steps to protect themselves: if HTTPS is available, use HTTPS to log in to Facebook, Google, and Yahoo. If you are using Firefox, EFF’s HTTPS Everywhere plug-in will do this for you automatically.

EFF has received reports that the Tunisian government is periodically blocking HTTPS access to Facebook, Google, and Yahoo. If that is the case and you must log in over HTTP, install the following Greasemonkey script to strip out the JavaScript which the Tunisian government has inserted to steal your login credentials. If you have logged in to Facebook, Google, or Yahoo recently over HTTP, log in using HTTPS and change your password.
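A userscript of this kind can work by refusing to trust any injected script whose source is not served from the site’s own domain. The function below is a rough, hypothetical sketch of that core check only (the function name and allowed-hosts list are illustrative; this is not EFF’s actual script):

```javascript
// Hypothetical sketch: treat a <script src=...> as trusted only if its
// host matches, or is a subdomain of, one of the site's own domains,
// so an injected credential-stealing script would be filtered out.
function isTrustedScript(src, allowedHosts) {
  try {
    const host = new URL(src).hostname;
    return allowedHosts.some(h => host === h || host.endsWith('.' + h));
  } catch (e) {
    // Relative or malformed URLs: a real userscript would resolve them
    // against the page; here we conservatively treat them as untrusted.
    return false;
  }
}
```

In an actual Greasemonkey script this check would run over `document.scripts`, removing any node that fails it before its code can execute.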

Additionally, EFF calls on Google, Yahoo, and Facebook to take action to protect the privacy of their users by alerting them to the potential compromise of their accounts and encouraging them to take the above steps.

Finally, Facebook has reported that it is in the process of taking technical steps to protect the privacy of its users. We hope that these include the following:

• Make Facebook logins default to HTTPS, if only in Tunisia, where accounts are especially vulnerable at this time. Google and Yahoo logins already default to HTTPS.

• Consider allowing pseudonymous accounts for users in authoritarian regimes, where political speech under your real name is dangerous and potentially deadly. Many Tunisian activists are unable to reinstate Facebook accounts that have been erased by the Tunisian government because they were not using their real names.

Websites providing services to Tunisian citizens cannot afford to sit on the sidelines while the Tunisian government launches malicious attacks on the privacy of users and censors free expression. Facebook, Google, and Yahoo should take these concrete steps as quickly as possible to inform and better protect their users.

Source: EFF

Leave a comment

Filed under Collaboration platform, Content Security, Enterprise Social Software (ESS), Industry News, Reporting, Security Content Management (SCM)