Following the merger of SAP AG and Sybase Inc. earlier this year, the two companies shared their first joint milestones in the form of combined product announcements focusing on mobile devices, in-memory analytics and enterprise information management.
Stories by Kathleen Lau
SAP and CA have teamed up to pitch integrated offerings that aim to unify the governance, risk and compliance strategies typically segregated between the IT and business sides of an organisation. One analyst calls this partnership "disruptive".
Customers lack a framework with which to map out the risks in their IT processes and the impact of those risks on business processes, says James Dunham, group vice-president of GRC (governance, risk and compliance) solutions with SAP. "IT understands risks related to the IT infrastructure, but doesn't know which processes those risks affect on the application side," says Dunham.
And execs on the business side may be aware of risks in their business applications, but not how those relate to the IT infrastructure. GRC strategies are siloed even within the IT infrastructure, as well as within the business side, says Dunham.
Integration will be offered initially for the areas of security, IT project and portfolio management, and service performance. The chosen areas resulted from consultation with key customers about their top risk concerns and the drivers behind them, says Tom McHale, vice-president of product management at CA.
That customer feedback led to the companies offering integration for an initial three groups of products: CA Enterprise Log Manager, CA Clarity PPM and CA Wily Application Performance Management, which will integrate with SAP BusinessObjects Risk Management and SAP BusinessObjects Process Control.
McHale says that following this base integration, CA and SAP will continue to offer other integrations, "knocking them off one by one". The result will be a catalogue of use cases, or risk scenarios based on SAP methodology, that address other situations, such as how to govern extending the order-to-cash process to a new region.
Vivian Tero, programme manager for governance, risk and compliance infrastructure with IDC, thinks that the integration between SAP and CA is definitely "very disruptive" given the flow of information between business and IT is often minimal. "Those processes are very co-dependent, so if there is minimal interaction between the functional units, there's always going to be room for inefficiencies and errors in compliance or risk mitigation," says Tero.
There are other vendors attempting this sort of integration, such as Oracle and Novell, which focus on linking GRC to identity access. But Tero says Oracle and Novell lack the breadth and depth that the SAP and CA partnership will offer. "They're going to be pushing a lot of information — security and compliance — up to SAP GRC Manager, which goes beyond just an identity attribute," she said.
The partnership should offer further value given the many joint customers SAP and CA share, says Tero.
Echoing this, Dunham said the integration will be a differentiator for SAP and CA in the hybrid environments in which they often exist. "We believe that mapping together gives us a real competitive differentiator ... for some of the other larger competitors inside this space, the guys that have a lot of red (Oracle applications) on their box," said Dunham.
It's been about a year since Conficker/Downadup first hit, and although the threat didn't turn out to be as grave as it had the potential to be, the 6.5 million PCs that remain infected today represent what Symantec calls a "loaded gun, waiting to be fired".
As IT infrastructures become increasingly converged and components increasingly interdependent, IT administrators are still not factoring in the collateral impact of individual changes to the IT environment, says one executive.
Joe Wolke, director of IT strategy for Illinois-based IT consulting firm Forsythe Solutions Group, says that while technology trends like virtualisation, storage consolidation, cloud computing and hosted applications serve to streamline IT functions, they also change the traditional IT equation.
"In many ways they are making infrastructure less complex, but are making the model for identifying direct costs much more complex," says Wolke.
Changes to the IT environment like adding a new application, business unit or geographic location will have a greater impact than is immediately observable. The foundational issue of running IT as a business, says Wolke, is ensuring costs are transparent and comprehensible to users.
He says it is critical to align the lifecycle of a new application with that of the associated physical assets. This means recognising that storage and servers required for the application may not have the same lifespan, so all costs must be accounted for down the road.
Wolke suggests organisations develop "collateral impact metrics" to measure and account for the impact that radiates throughout the IT environment. "If I add 10 more users to the network, what is the impact on the network?" he says.
Factoring in the costs of collateral impact should also happen at the project management phase, as applications are being developed. While an organisation's impact metric might state that a new application for 1000 users will require five help desk people, that number will surely rise during the initial learning curve, says Wolke.
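Wolke's example reduces to a simple scaling rule. The sketch below illustrates the idea of a collateral impact metric; the staffing ratio and learning-curve multiplier are hypothetical figures for illustration, not numbers from Forsythe:

```python
def helpdesk_staff_needed(users, staff_per_1000_users=5, rollout_multiplier=1.5):
    """Estimate help desk staffing for a new application.

    staff_per_1000_users: baseline ratio (hypothetical).
    rollout_multiplier: extra coverage during the initial learning curve (hypothetical).
    Returns (steady-state staff, staff needed during rollout).
    """
    baseline = users / 1000 * staff_per_1000_users
    return baseline, baseline * rollout_multiplier

steady_state, rollout = helpdesk_staff_needed(1000)
print(steady_state, rollout)  # 5.0 7.5
```

The same pattern can be repeated for any resource that scales with users, such as network capacity or storage.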
On a more granular level, building new applications deserves the same degree of collateral impact planning. An application depends on various components, including bits of code, small applications and licensing from third parties, all of which must be made part of a disaster recovery plan, says Wolke. "There is much more dependency. Applications don't stand alone anymore," he says.
A new study from Forrester Research shows that application developers and their project managers are not keeping up with the times. Mike Gualtieri, senior analyst with Forrester, says IT pros aren't necessarily adjusting to what is the new reality of a tough economy and the popularity of certain technology trends.
The Forrester research recommends five changes for application development professionals:
* Embrace the cloud: Developers must understand how to design and architect applications differently to take advantage of the cloud, especially when it comes to cloud-specific strategies for scaling data. "Data is the Achilles heel of cloud computing when it comes to application development," says Gualtieri.
* Find your inner startup: The unrelenting focus of startups is to make money, so developers should take that lean approach in tough times, even if they work for a large enterprise. "It is really about focus, about trying to get beyond all the processes, politics and management that normally occur as an organisation grows," says Gualtieri.
* Favour flexibility and cost over platform loyalty: Enterprise IT is typically driven by the procurement department's decree to stick to a particular vendor stack. Entertain other options like smaller vendors or open source, says Gualtieri. "They are potentially cheaper and give you more flexibility rather than waiting a year for a large enterprise software vendor to get what you need," he says.
* Become passionate about user experience: Users want an app experience that is valuable, easy and aesthetically pleasing, and application development teams must catch on to that ever-growing demand.
* Coach your talent: Project managers often view their developers as "automatons on an assembly line," but software is a creative art, like making a movie, says Gualtieri. IT skills selection must align with the needs of the specific project. "If this is a customer-facing web site, it is a different skill set than a departmental application," he says.
IBM Corp. doesn't expect that its collaboration software offerings will outright replace popular user-driven social networking tools like instant messaging and Twitter, said one executive at the Lotusphere 2010 conference.
Starting in the spring of 2010, customers of Canada's Scotiabank will be able to use their smart phones to perform banking transactions by way of text messages, and they'll be using a platform provided by Auckland-based mobile developer M-Com.
Available for individual and small business account holders, the service is supported by, and tailored for, all types of smart phones and browsers, said Mike Henry, senior vice-president of sales and service with Scotiabank.
"If you've got a power device like an iPhone, a BlackBerry Bold or Storm then we'll have specific downloadable apps customised for the power devices," said Henry.
The service gives account holders the option to perform certain transactions by sending a brief text command to an SMS code, such as "BAL ALL" if they want to view all their account balances.
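A command like "BAL ALL" implies a small text-command grammar on the bank's side. The following sketch is a hypothetical illustration of how such a dispatcher might parse incoming messages; it is not M-Com's actual implementation, and the command set is invented:

```python
def handle_sms(message):
    """Dispatch a banking text command (hypothetical command set)."""
    parts = message.strip().upper().split()
    if not parts:
        return "Unrecognised command"
    command, args = parts[0], parts[1:]
    if command == "BAL":
        # Default to all accounts when no scope is given, as in "BAL ALL".
        scope = args[0] if args else "ALL"
        return f"Balance requested for: {scope}"
    return "Unrecognised command"

print(handle_sms("bal all"))  # Balance requested for: ALL
```

A real service would look up the sender's registered phone number, verify it against the account, and reply over the same SMS channel.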
Scotiabank's mobile banking service runs on M-Com's mobile banking and payments technology platform.
Adam Clark, CEO of M-Com, said that while Gen Y users are typically the target market for mobile banking, there are other groups as well. In the US, for instance, the average age of mobile banking users is 42, Clark noted. More affluent users who are always "out and about" with a smart phone are one target audience, as are blue-collar workers who don't typically work in front of a PC.
The smart phone is also a natural place to send banking alerts, said Clark. "It's much more interrupting. As an email you might not get it until you check your email that evening or in the morning," he said.
The SMS and text approach appeals to all demographic groups because of its ease of use, he said. "You've always got your phone with you. You haven't always got your PC in front of you."
Acknowledging that concern over security is one of the biggest impediments to user adoption, Clark said transactions on the platform are secured by a mix of mechanisms, including two-factor authentication and 128-bit SSL encryption.
Henry said the security guarantee that account holders receive online will be extended to the mobile service. Details such as account names and numbers will be masked to prevent identification and reuse by an impostor. And, depending on the sensitivity and the amount of money involved in a transaction, second- or third-factor authentication will apply.
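Masking of this kind typically leaves only the trailing digits visible. A minimal, hypothetical sketch of the idea, not Scotiabank's implementation:

```python
def mask_account(number, visible=4):
    """Mask an account number, keeping only the last `visible` digits readable."""
    return "*" * (len(number) - visible) + number[-visible:]

print(mask_account("1234567890"))  # ******7890
```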
The mobile interface will be condensed and simplified to ensure it is legible and easy to navigate. From the user's perspective, mobile banking will be a lot like online banking, allowing users to access account balances and transaction histories and pay bills, said Henry.
Scotiabank is hoping to capitalise on the fact that there are 22 million handsets in Canada. More than half of phone connections are wireless and the adoption of smart phones is growing, said Henry. "On that basis there is going to be a lot of latent demand out there that will cut across all consumer groups," he said.
That said, Henry acknowledges that early adopters will likely be the younger crowd and business owners.
Clark said that while mobile banking is in its infancy in Canada, that's not the case in other parts of the world, such as Asia-Pacific and Africa, where more sophisticated mobile payment options are available.
But as momentum builds in Canada through a service like that of Scotiabank, Clark said mobile banking will eventually witness the addition of more services, greater customer engagement, and cross-selling as the mobile channel becomes a way to drive down costs.
According to Rob Burbach, senior analyst with Toronto-based IDC Canada, while there is a large pool of active handsets in Canada, that doesn't mean everyone is looking for a mobile banking service. That said, Burbach thinks the other major banks can be expected to make some noise with mobile initiatives of their own.
And, although Scotiabank's mobile service payment options are relatively basic compared to other parts of the world, Burbach said it will still hold appeal to some. "It comes to how comfortable are you using your mobile phone for your various functions?" he said.
Scotiabank's service will undergo a soft launch prior to going live.
There are many opportunities for vendors in the data quality space to create tools that cover a broader scope, yet many of those opportunities remain untapped, according to an analyst.
Andy Hayler, president and CEO of UK-based analyst firm The Information Difference, says most data quality software vendors focus on traditional niche areas like name and address type tools for customer and supplier records. But that represents only a small part of end user needs, he says.
"Vendors have opportunities to educate the world about data quality because there is an awful lot of ignorance around this area," he says.
A recent study on data quality by The Information Difference revealed that respondents view data quality as something that is not restricted to one area within the organisation. Instead, two-thirds of respondents said it is an issue spanning the entire organisation.
The survey, commissioned by location intelligence vendor Pitney Bowes Insight and data management vendor Silver Creek Systems, polled 200 companies in North America and Europe, almost half of which had revenues of US$1 billion.
Eighty-one per cent of respondents reported a focus broader than customer name and address data; 12 per cent said they did in fact hold that narrow focus.
Moreover, the survey also found that, when asked to rank data types in order of importance, respondents did not rate customer name and address data the highest. Instead, product (inventory) and financial (accounting figures) data were deemed a greater priority.
Yet, the vast majority of tools only focus on customer name and address data, notes Hayler.
"That said to me that there is an opportunity for vendors to address a broader range of data," he says.
The study also delved into opinions on how data quality fit with master data management (MDM), or the quest for a single version of the truth across an organisation's data.
One-fifth of respondents felt data quality is a prerequisite to an MDM initiative and wanted to see more vendor offerings integrating those two areas.
Hayler says one would expect vendor partnerships between the areas of data quality and MDM, and that is precisely what is currently happening in the industry.
The reason MDM vendors have paid so little attention to data quality, says Hayler, is that they have traditionally focused on building systems that ingest data quickly, only to realise later that such systems were useless if the data being fed in was bad.
Fifty-one per cent of respondents believed all was well with data quality, while only a quarter thought it was poor.
Yet, only about a third of respondents had a data quality programme in place. Another third had plans to implement a programme within the next year to three years.
Ted Morawiec, president and CEO of e-Net, a Toronto-based performance management system vendor, said measuring data quality can be tricky because it entails a human element.
"It is not [just] software and hardware that is the expenditure," said Morawiec. "It's the people and the process that it is improving."
Things like improvements in productivity must be measured, said Morawiec.
It's essentially business activity monitoring, he said, and is a relatively new concept for executive management to digest.
It can be a juggling act to curb unnecessary costs on additional software licences while ensuring there are enough licences for all users.
IT departments will often procure more seats than they actually require so users don't run into productivity problems, but that means paying for unused software, says Gareth Doherty, research analyst with London, Ontario-based Info-Tech Research Group.
"A lot of organisations will buy more resiliency than they need in terms of the software licence," says Doherty.
Negotiating the right software licence agreement is complicated by the fact that organisations often have no benchmarks with which to compare pricing, says Doherty.
A "veil of secrecy" maintained by vendors around licence agreements makes it difficult to get the standard cost for, say, deploying 30 seats of the latest customer relationship management offering, he says.
It is only when seated at the negotiating table with the sales agent that the dollar amounts are revealed.
"Without having benchmarks, you really don't know if you are getting hosed by a vendor," says Doherty.
Non-disclosure caveats written into licence arrangements mean other organisations won't talk about their contracts either, says Doherty. However, he does suggest consulting an analyst firm that may have amassed that sort of data on different software.
Another myth that affects IT departments, says Doherty, is the belief that there is no wiggle room regarding standard terms and conditions in a licence agreement. With business intelligence applications, for instance, he says, organisations can have options included in the agreement to alter the licence model partway through the contract should conditions change. A young company that initially wants to deploy 10 seats on a seat-based licensing model may need the flexibility to upgrade to a server or site licence model, he says.
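Doherty's seat-versus-site example comes down to a break-even comparison. The sketch below illustrates the arithmetic; the per-seat and site licence prices are made-up placeholders, not figures from Info-Tech:

```python
def cheaper_model(seats, per_seat_cost=1200, site_licence_cost=50000):
    """Return which licence model is cheaper at a given seat count.

    per_seat_cost and site_licence_cost are hypothetical annual prices.
    """
    seat_total = seats * per_seat_cost
    if seat_total < site_licence_cost:
        return ("seat-based", seat_total)
    return ("site", site_licence_cost)

print(cheaper_model(10))  # ('seat-based', 12000)
print(cheaper_model(60))  # ('site', 50000)
```

The young company in Doherty's example starts on the left of the break-even point; the option to switch models mid-contract matters once projected growth pushes it to the right.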
But the key issue is that it is very difficult to predict what will happen over the lifecycle of a contract, and even harder to know what will happen when the contract comes up for renewal, says Stewart Buchanan, a UK research director in the IT asset management and procurement group with Gartner.
"Customers' eyes are very much bigger than their stomach," says Buchanan. "They sign up for an all-you-can-eat menu and we find them not consuming as much as expected."
He has observed many enterprises agree to, and subsequently regret, an unlimited licence agreement. While he is not against unmetered agreements allowing unlimited use of a software product, he advises a certain degree of due diligence beforehand.
Enterprises agree to unlimited licence agreements either because of particular current conditions in their IT environment, or they are encouraged to do so by the vendor, Buchanan says.
But regardless of the motivation, it won't end well if the company's actual usage turns out to be significantly above or below the forecast, he says. If usage is lower, then money was spent on unused licences. If usage is higher, the company will get value for the agreement but the downside is they then cannot revert to a traditional limited licence moving forward, says Buchanan.
Unlimited licence agreements can often appear an attractive option, especially when a vendor won't grant a discount in any other way, he says.
Many enterprises don't make "informed investment decisions" before entering into agreements, instead allowing the vendor to push them into a decision, says Buchanan.
Making assumptions on things like usage levels before signing a software licence contract is the "enemy of accuracy", says Dean Williams, services development manager with Toronto-based IT products and services vendor Softchoice Corp.
"You're either leaving money on the table, or worse, you're putting yourself at risk of non-compliance," says Williams. "Worst case scenario: penalties, levies, fines and potential legal risk."
Making assumptions about usage levels is often driven by the erroneous logic that it's cheaper to play it safe, says Williams.
But while choosing the appropriate software licence contract is important, another element is choosing the right software to deploy, he says.
He suggests understanding available product options, users' behaviour and the processes that support distribution of the software.
Depending on the size of the company, Williams says there may not be a dedicated person in charge of software procurement nor a standard process to follow.
But while software licence compliance isn't necessarily an everyday activity, he says "there does need to be a regular rhythm of review that does leave enough time for something to be done about the findings."
Williams suggests a 90-day window to allow time to take action on the data collected on usage.
Buchanan says assessing the current IT environment is part of an organisation's overall software asset management strategy.
Organisations should plan the investment lifecycle, ensure a return on investment during that lifecycle and in general "think of what happens throughout that lifecycle which may be much longer than your contract," he says.
A Massachusetts-based provider of tools for accelerating the use of open source in software development estimates conservatively that 10% of development spending is redundant given open source code already available.
Black Duck Software's CEO and president, Timothy Yeaton, says there is an opportunity, especially in tough economic times when IT budgets are slim, to save money and redirect scarce developer resources to other areas of the business.
"We see development budgets cut, yet these companies are in markets where they're serving customers, where they have to continue to innovate through the recession," said Yeaton.
Collectively, US companies can realise savings of more than US$22 billion (NZ$37 billion) a year by reusing open source code in their application development, says Yeaton. There is a definite potential for significant savings on development costs, and companies "may be aware of occasional components of open source that might be useful for certain tasks, but the scope of it might be underestimated dramatically," Yeaton says.
There are, he says, more than 200,000 open source projects representing more than 4.9 billion lines of code, an investment of two million developer-years. The figures are derived from Black Duck's own database of open source code and associated licence information, called KnowledgeBase, and from the US Bureau of Labor Statistics.
In fact, the 10% estimate is an extremely conservative figure, says Yeaton, who has witnessed a customer, after committing to maximise the use of open source code, save about 88% of development costs. While that individual result does fall at the higher end of the spectrum, Yeaton says 50% is "definitely achievable".
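The headline numbers reduce to a savings fraction applied to annual development spend. Taking Yeaton's figures at face value, the implied spend can be backed out; the sketch below works in billions of dollars:

```python
def reuse_savings_bn(dev_spend_bn, savings_fraction):
    """Billions of dollars saved by reusing existing open source code
    instead of writing equivalent functionality from scratch."""
    return dev_spend_bn * savings_fraction

# Yeaton's US$22 billion saving at a conservative 10% implies roughly
# US$220 billion in annual US development spending.
print(reuse_savings_bn(220, 0.10))  # 22.0
print(reuse_savings_bn(220, 0.50))  # 110.0 at the "definitely achievable" 50%
```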
Jay Lyman, enterprise software analyst with New York-based The 451 Group, agrees that the 10% estimate is conservative because the use of open source in most organisations is typically significantly underestimated, especially among those at the management level. Leadership may conjecture they are using just several open source components, says Lyman, but then "find they have 140 different open source packages in use either in their business or in their products".
The use of open source code in application development is more than just a mere cost-cutting strategy, says Yeaton, choosing instead to characterise the approach as a fundamental change in how customers are building software. "It's really shifted customers' emphasis from 'How do I define a solution from end-to-end?' to 'How can I identify components that I can already use, integrate them, and spend my scarce developer resources on adding my specific business value or drive innovation?'"
The hurdles to re-using open source code stem from a lack of awareness of what's even available and possible, as well as automation and management challenges with incorporating open source components into an application development cycle, says Yeaton.
While individual developers are very familiar with open source, businesses may not possess the mechanisms to help them seek out and incorporate the open source components of value to them, he says, to vet those components for security vulnerabilities, export control requirements and licence compliance, and to build them into the development process on a steady-state basis.
The Conficker worm may have already created havoc with the estimated nine million PCs it's infected, but one security expert warns the worm is only dormant, perhaps to be unleashed at a later date with an even greater vengeance.
According to EMC, today's knowledge-based economy necessitates that an organisation's IT strategy integrate social computing, team collaboration and enterprise content management (ECM) if it wants to improve its competitive position.
While Microsoft realises there is greater benefit to collaborating with the open source community from an interoperability perspective, it may prove difficult to change its pro-proprietary image, said an open source analyst.
If calculating the maximum height at which an egg won't break when dropped while sacrificing the least number of eggs sounds like a worthy challenge to a programmer, then this year's Google Code Jam may be of interest.
When the cost of maintaining a fibre optic network soared, Toronto-based George Brown College turned to what it's calling "virtual fibre" to keep buildings on campus connected.
The school deployed GE60 wireless links from BridgeWave, a California-based provider of Gigabit Ethernet outdoor wireless hardware. The deployment, completed in just a couple of months, resulted in identical performance to the previous fibre optic network, but at a much lower cost, says Andrew Riem, manager of infrastructure and operations for IT services at George Brown College.
"It had the high capacity we were looking for because it would be serving an entire building, which could hold 500 to 600 computers," says Riem. Before this, the school leased a fibre optic network that operated just fine, except that the provider changed the billing scheme to include a minimum-distance charge, he explains. As a result, the leasing cost would have risen to C$20,000 (NZ$26,000) a month.
Montréal-based engineering and distribution firm Trispec Communications was brought on for the deployment.
Besides the virtual fibre being more cost-effective, another attraction was the quick deployment, says Riem. Since the deployment of four GE60 wireless links was completed in 2006, the network has functioned as it should and successfully supported the school's increased use of bandwidth-hogging activities such as in-class video, distance learning and information-sharing applications, he says.
The school has since expanded the network with two additional links, resulting in six connected buildings from the previous four. Riem notes that George Brown continues to use the fibre optic network to connect its main campuses.
Had the school continued to use its fibre optic network in the same manner, the cost would have been about a million dollars a year, says BridgeWave's senior vice-president and chief marketing officer, Gregg Levin.
"Radio links that cost C$20-30,000 to install are going to pay back very quickly compared to those high service costs."
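Levin's payback claim is easy to check against the figures in the story: a link at the midpoint of his C$20-30,000 install range, set against the C$20,000-a-month lease cost, pays back in a little over a month:

```python
def payback_months(install_cost, monthly_saving):
    """Months until a one-off install cost is recovered from monthly savings."""
    return install_cost / monthly_saving

# Midpoint of the C$20-30,000 install range vs the C$20,000/month lease.
print(payback_months(25_000, 20_000))  # 1.25
```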
BridgeWave's customers tend to fall into verticals where the business spans multiple buildings, such as education, health care, state and local government, and network operators, says Levin. The increasing use of Web 2.0 applications is a driver behind many customer network upgrades, he says; however, much of the network traffic is in fact private LAN traffic, "in other words, taking this LAN and stretching it between buildings".
"So we're trying to give people a full fibre speed gigabit but in a wireless form, so when it's difficult or expensive to get fibre, they have a lower cost and fast to deploy alternative."
Levin acknowledges that wireless networks can be subject to interference at lower frequencies, but notes the radio links operate at the much higher frequency of 60GHz. And the antennas send out beams only one degree wide, "which makes it almost impossible to be interfered with because anyone else using the frequency is probably not going to be lined up right on top of you in close proximity," he explains.
This also means that many links can be deployed and businesses can reuse their spectrum. "In the same area, there can be multiple links that are just pointed a little differently in direction, and they don't interfere with each other either," says Levin.
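The interference claim follows from simple geometry: a one-degree beam spreads only a few metres over a typical campus link, so a nearby link pointed slightly differently never sees it. The calculation below illustrates that geometry; the link distance is a hypothetical example, not a BridgeWave specification:

```python
import math

def beam_width_m(distance_m, beamwidth_deg=1.0):
    """Approximate beam footprint width at a given distance for a narrow beam."""
    # Half-angle on each side of the boresight, doubled for total width.
    return 2 * distance_m * math.tan(math.radians(beamwidth_deg / 2))

# Over a 500 m campus link, a 1-degree beam is under 9 m wide.
print(round(beam_width_m(500), 1))  # 8.7
```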
When negotiating an outsourcing contract, the tendency is to try to "box in the vendor" in an effort to cover every possible scenario that may transpire in the relationship — but that's not the way to go, said an executive with Canada's TD Bank Financial Group at a recent outsourcing forum.