Seattle Newspaper for the People by the People


Chasing Rainbows: Why IT Project Failure Is Endemic in the UK


Mankind has demonstrated the ability to manage large complex projects for thousands of years. Construction projects such as the pyramids of Egypt or the Great Wall of China, among thousands of others, have left a striking legacy.

The ability to pursue challenging goals involving planning and organization is clearly innate in man. It might seem strange, therefore, that not all projects succeed, particularly IT projects, where disappointment and failure are endemic.

If there is a set of rules for ensuring success, nobody has yet identified them. But there are some patterns of behavior which maximize the likelihood of failure. This article considers some of those patterns of behavior.

Decide what to do, before establishing why you’re doing it

‘We must get our Website operational by July’
‘Why?’
‘Because we’ve heard that’s when XYZ are launching theirs’

This is the classic me-too approach to management that saw, for example, the entire UK life assurance industry rush like lemmings into unit trusts in the 1980s. The early birds benefited from the benign combination of a bull market and retail investors with cash. As always, a market reversal put the small investor to flight and the rest ended up chasing rainbows at considerable expense.

Without any genuine objective, the activity becomes the goal. The only measure of success is whether that activity took place, since nobody established in the first place what was intended to be achieved.

Indulge your own wishful thinking
Of all the causes of failure in human affairs, this is the most consistent long-term performer. It permeates almost all failing projects, whatever other causes there may be.

As a trivial example, I was once assigned to run a project in which my company was supplying third-party software to an end user. The third party was an outfit in the same business as the end user, and the product was its homegrown system, then under development. Within six months it was clear that the multi-phase project was receding at an accelerating rate.

The instigator of the deal sat on the information for several more months. While he was trying to work out how to blame someone else for the debacle, the damage grew rapidly. His company had created a bad deal, which it then turned into a disaster for its client and itself. The folly was in attempting to enter a new market without any investment and with no proper assessment of the risk to all involved. All the parties professed themselves deeply committed to the project.

Assume communication takes care of itself
In the early 1990s a service supplier to the European retail and wholesale banking industry devised a new product based on packaging its services. A new IT system was needed to support it. The key users in sales and marketing were too busy to help in the specification of this system, so it was left to the administration staff to define requirements. The system duly arrived and met all the identified needs.

Two weeks after the system went live the sales and marketing division announced the radical new pricing and packaging structure for the services, which they had spent the last six months devising! The new IT system did not cater for this, as the administration staff knew nothing of this new approach. The system was redundant and discredited two weeks after it went live – and was never used again. The key players had not been involved with specifying a system because they were far too busy changing something fundamental in their business.

Insist on staying with the tried and trusted
It is normal in procurement to insist that the supplier should have done a similar job before. This makes eminent sense, provided the similarity extends to the context. The rapid pace of change in some environments makes that proviso particularly important. Success brings with it the danger of clinging too long to the same tools and methods.

The IT industry is notoriously fast-moving. Yet the working life of a successful business system or software product is many years, even decades. When the time comes for its replacement, the same approach to its development is very unlikely to work as well.

Let technical experts decide
The High Priest syndrome has been a menace in the IT industry since its inception. It represents a cop-out by top management decision makers, some of whom consider IT to be a grubby and undignified pursuit. All too often the technocrats encourage it, only to find themselves used as scapegoats when IT projects that are not business-led eventually fail.

It is essential that major IT decisions are understood by the senior management team, so that projects fit the overall business strategy and receive the support they need. The decision-making process should be supported by functional managers able both to advise and to implement.

Tentative conclusions
The aim of this article has been to identify and classify some typical patterns leading towards failure. The examples are chosen to illustrate the pattern, not to point fingers with 20:20 hindsight at others’ efforts.

The common theme is that, regardless of industry or time, failure stems more frequently from psychological causes than from technical ones, the fundamental cause being lack of realism. We have looked at the nature of some of the barriers to clear and realistic thinking. None of us can be completely honest with ourselves when there are a host of conflicting pressures and desires, but if we are to make the most of our potential, that must be the aim. If the patterns noted above help us to recognize when our realism is under threat, they will have achieved something valuable.

Devolving Documents: Get Ready For Outsourcing


At the close of 2003, a number of well-known analyst organizations made predictions for 2004. One of their common and consistent emphases was an upturn in European IT markets which would be greatly fuelled by the inexorable rise of IT outsourcing. Yet is IT the only, or even the most important, area to apply the outsourcing model?

Across the country, in both the private and public sectors, the IT outsourcing opportunity has diminished somewhat. My company's research shows that, amongst large companies (over 250 employees), IT outsourcing has reached a saturation level of some 28 percent. How much further there is to go is possibly indicated by the aggregated view of the various technology analysts, who see the US market reaching a saturation point approaching 40 percent of the larger company segment.

These predictions are given extra credence with the emphasis that leading management consultants are lending to the idea of ‘network organizations’ that outsource everything but their core activities and skills. What, then, of other areas susceptible to outsourcing, and what potential for real benefits do they offer the corporation or the public sector body? Case study examples indicate that document outsourcing presents corporations and public sector organizations with rapid return on investment, coupled with low business process risk.

The first surprise is the sheer size of document production in this country. Document production spending in 2003 was over 38 percent of the amount spent on IT in the same year. Yet how often does one read about the potential for corporate efficiencies through more efficient document production, compared with discussions of IT outsourcing? Document production evidently strikes many analysts as less engaging than IT matters. Yet it is capable of delivering a comparable scale of competitive advantage, and savings on the bottom line.

Our research reveals that whereas Seattle IT outsourcing currently sits at some 28 percent of IT spending, document outsourcing accounts for a mere 12 percent of the document production market. Again we can find corroboration of the growth potential in the US example (which usually foreshadows the UK by a few years). In the US, document outsourcing now represents 22 percent of all larger-organization document production, almost twice its UK equivalent.

So document outsourcing holds proportionately greater potential, for organizations and outsourcing companies alike, than IT outsourcing. Whilst there is little doubt that it will take several years for the UK to reach comparable market maturity, UK organizations are expected to grasp the advantages of outsourcing rapidly over this period, especially as management consultants increasingly recommend document outsourcing as an area for priority attention and straightforward gain. In short, if document outsourcing were to reach 40 percent saturation of larger companies (the predicted US outsourcing saturation level), it would represent a 4.3bn euro marketplace, against some 1.3bn euros today: scaling 1.3bn euros at today's 12 percent penetration up to 40 percent gives 1.3 x 40/12, or roughly 4.3bn.

Much media space is also being devoted to the issue of outsourcing to companies overseas, known as offshoring. Politicians and unions have been among the most outspoken critics of this phenomenon, raising the specter of US (and Seattle) jobs being lost to India, South Africa and Eastern Europe. In fact, the call center industry is still showing net growth, despite the fact that a fair number of financial institutions have relocated their call centers abroad. So what impact does call center offshoring have on document outsourcing? The answer is: more and more over the next few years. Interestingly, though, document issues are likely, if anything, to slow the trend towards foreign climes.

Integrating documents with the call center – especially one providing a customer service function – is becoming increasingly important both to call center efficiency and to resolving customer queries more satisfactorily. It is estimated that some 50-60 percent of customer service queries in financial services require supporting documents to be sent to the caller, whether for marketing or for regulatory reasons.

Therefore, a hidden cost of offshoring is the need to integrate document production and mailing with the foreign call center's systems. Equally, call center agents can answer statement or billing queries far more efficiently if they can retrieve and view documents in exactly the same visual format in which they were sent to the customer. Again, physical dislocation, whilst not insurmountable, will incur cost and risk.

Document production and mailing, by its very nature, cannot easily be 'offshored': timeliness of delivery is essential, so production needs to be situated in the country of delivery. It is technically conceivable that document outsourcing could be located abroad, but the cost of ensuring reliable and timely distribution would far outweigh the labor savings of offshoring, and would not address the issue of political risk. Document outsourcing therefore remains a national market, not a globalized one.

In conclusion, Seattle organizations in both the private and public sectors would do well to pay just as much attention to document outsourcing as they do to IT outsourcing. The document outsourcing market is not currently as large as its IT counterpart, but because it is far less saturated than the IT outsourcing world, it offers greater potential for rapid return on investment.

Many argue that document outsourcing carries less project risk compared with the IT equivalent. If this argument is accepted, then organizations under pressure to deliver cost savings, improved service delivery, and competitive advantage, would be well advised to look carefully at document outsourcing in 2004 and beyond.

Instant Messaging or Instant Migraine?


Just when you thought you had e-mail all sewn up and your networks were safe, someone mutters those dreaded words: Instant Messaging (IM). Whether you like it or not, IM is here to stay and is most probably already widely used within your organisation. Banning its use won't help unless you have the means to enforce the ban, so tackling the issues it can raise is the most pragmatic solution.

On the plus side, IM has a lot of benefits. One of the biggest is that it is real-time: you know if someone is sitting at their desk and might be able to answer a question instantly, and it is quicker than ringing someone up and going through the pleasantries when all you want is a yes or no answer. In this respect it can be considered a tool that increases productivity and is less of an overhead to the business than e-mail.

Another benefit to IM is that it can help remote workers seem less isolated, one of the biggest complaints from users that spend most of their time away from the office. By spending a few minutes each day chatting to friends, they still know the latest gossip, can be included on spontaneous evenings out or even join in with an office joke.

It may seem like a decrease in productivity, but it is faster than gossiping beside the coffee machine or during a cigarette break, and anything that increases goodwill and helps to retain employees has to be positive.

Although many organisations prohibit the use of IM, employees frequently download the programs without the knowledge or permission of the IT department, and unless the PCs themselves are locked down, there is very little that can be done to stop it. One of the big concerns is that users can download and execute malicious programs that have bypassed the corporate anti-virus scanners. So far IM has been used to deliver trojans and backdoor programs, and even attack platforms for launching distributed denial-of-service attacks. Hackers sometimes use social engineering techniques to encourage IM users to download files, with the promise of music or anti-virus protection.

Although the propagation of viruses and worms is not yet as prevalent over IM as it is via e-mail, if IM becomes as popular it will be only a matter of time before this channel becomes a major medium through which viruses spread. Just like other popular platforms, all of the main Instant Messaging systems – be it ICQ, AIM or MSN Messenger – have known vulnerabilities that highlight their insecurities. These include identity theft, insecure file sharing and transfers and, of course, the downloading of malicious software programs.

In fact, hackers often use a combination of these vulnerabilities to hijack IM identities and send messages to a buddy list with a link to a malicious Web page. In some instances, buddies end up unwittingly downloading Internet dialers that switch their dial-up account to premium porn numbers. With buddies like that – who needs enemies?

Even though security breaches are a major issue with IM, the platform is still relatively new and has not yet become a popular medium with hackers and virus writers. A far greater risk to organisations at the moment is legal exposure, should employees make libelous or offensive remarks, or send offensive attachments, via IM.

IM tends to be used even more casually than e-mail, and the dangers of careless words have been well documented in the past. Complaints against a company may include libel for sexist or racist comments, and breach of confidence or confidentiality. The potential costs of such actions are far greater than those of the havoc caused by viruses, and damages can run into hundreds of thousands of pounds for individual companies. In addition, for some organisations in heavily regulated industries, the uncontrolled use of IM contravenes much of the legislation with which they must comply.

However, it is amazing how employees can clean up their act once they think someone is actually monitoring their output. The way round all of these problems with IM is to introduce the same policies and procedures that protect e-mail systems, along with the necessary technology to enforce them. This includes content filtering, anti-virus controls and regular patching. Many of the security vendors that have provided similar solutions for e-mail are now in the process of extending their technology to cover IM.
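To make the enforcement idea concrete, here is a minimal sketch in Python of the kind of keyword-based content filtering such tools perform. The blocked terms, the scan_message helper and the logging format are all invented for illustration, not taken from any vendor's product.

```python
import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("im-filter")

# Hypothetical policy list; a real deployment would load this from a
# centrally managed, regularly updated policy store.
BLOCKED_TERMS = {"confidential", "password", "offensive-term"}

@dataclass
class Verdict:
    allowed: bool
    matches: list

def scan_message(sender: str, text: str) -> Verdict:
    """Flag messages containing policy-violating terms."""
    matches = [t for t in BLOCKED_TERMS if t in text.lower()]
    if matches:
        # Log the hit for compliance review rather than silently dropping it.
        log.info("blocked message from %s (matched: %s)", sender, matches)
    return Verdict(allowed=not matches, matches=matches)

if __name__ == "__main__":
    print(scan_message("alice", "the admin password is hunter2"))
    print(scan_message("bob", "lunch at 1?"))
```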

Used correctly and carefully, regulated IM can provide real business benefits. Some companies are even using it to support their customers, although be careful of the login names users choose – CyberPunk might not be the image you are looking to portray to customers. The secret to success, as with all messaging solutions, is to lay out the ground rules to staff and implement technology that not only proves you are behaving responsibly, but encourages your users to do the same.

Data Explosion: Is Your Business Ready For Growth?


We find ourselves at the onset of a data explosion. As core business applications continue to iterate and multiply, the data that flows through them will grow exponentially. However, in an economy where the ability to manage the quality, quantity, and accessibility of data creates a competitive advantage, many companies are finding themselves coming up short.

In the past, businesses managed data growth by adding hardware, increasing staff, or cobbling together various quick fixes that kept them one step ahead. Today, tighter budgets, reduced staff and increasing demands for performance, availability and customer service require aggressive and sophisticated methods for managing data growth.

By leveraging data modeling to analyze existing systems, organizations are better equipped to stay on top of their data requirements. Applying a model-driven approach to data management strategies helps businesses detect performance degradation, create strategies for separating operational and archival data, and leverage collaborative workflow processes. These three areas are key to a company's ability to manage its in-house data explosion and maintain its competitive edge.

Modeling tools have traditionally been used to build new data structures. The ability to analyze existing systems has been an under-utilized feature of data-modeling tools. This is no longer the case as the data explosion is now highlighting the need to assess existing systems.

With the spike in data quantity, operational data – the lifeblood of an organization – is growing exponentially. Those incremental bits and bytes of data are just that – bits and bytes – but they add up to terabyte after terabyte. As a consequence, businesses, particularly those interested in capturing and archiving critical user patterns, are facing three key data management challenges: performance degradation, separation of operational and archival data, and the need for collaboration.

Performance degradation is a natural consequence of the explosive growth in data. To minimize this risk, DBAs, database developers and/or performance managers need a means for determining duplication patterns, periodic storage and capacity growth, potential bottlenecks, and so forth. Modeling tools help by providing a visual means for quickly pinpointing areas that cause performance degradation.

For example, many systems have hundreds of tables. If the systems are disparate, many data professionals invariably reinvent the same wheel over and over, producing multiple tables containing the same data. Database 1 has an object called A, and database 2 has an object called B, but the objects are the same. With a model-driven approach, the user can quickly identify patterns of duplication – i.e. pinpoint the same objects across systems – and then take steps to consolidate tables, thereby improving performance and reducing the overall amount of storage consumed.
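As a minimal sketch of duplication detection, assuming two SQLite database files with hypothetical names, the snippet below fingerprints each table by its sorted column names and types, then reports tables that share a fingerprint across the two systems. A real modeling tool works from much richer metadata, but the principle is the same.

```python
import sqlite3

def table_signatures(db_path: str) -> dict:
    """Map each table to a hashable signature of its column names and types."""
    con = sqlite3.connect(db_path)
    try:
        tables = [r[0] for r in con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'")]
        sigs = {}
        for t in tables:
            # PRAGMA rows are (cid, name, type, notnull, dflt_value, pk).
            cols = con.execute(f"PRAGMA table_info({t})").fetchall()
            sigs[t] = tuple(sorted((c[1].lower(), c[2].lower()) for c in cols))
        return sigs
    finally:
        con.close()

# Hypothetical file names for the two disparate systems.
sigs_a = table_signatures("database1.db")
sigs_b = table_signatures("database2.db")

for ta, sig_a in sigs_a.items():
    for tb, sig_b in sigs_b.items():
        if sig_a and sig_a == sig_b:
            print(f"database1.{ta} and database2.{tb} look structurally identical")
```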

Another benefit of eliminating duplication is the assurance that the data remains synchronized. When the same data is maintained in multiple tables, the data in table 1 may be refreshed differently, and its quality may differ from that stored in table 2. By looking at patterns, data professionals can determine how different systems can effectively point to the same unique set of data.

Reviewing system storage and capacity is another aspect of determining performance degradation. Taking an Oracle system as an example, a user can review the data model and quickly determine the current storage and capacity. More importantly, they can see whether the tablespace files are functioning according to specification, whether the files can scale to meet expanding needs, whether the min/max extents are set properly, and so forth.
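By way of illustration, here is a minimal sketch using the python-oracledb driver against Oracle's standard DBA_DATA_FILES dictionary view. The credentials and DSN are placeholders, and it assumes an account with access to the DBA views.

```python
import oracledb  # the python-oracledb driver: pip install oracledb

# Placeholder credentials and DSN; substitute real connection details.
conn = oracledb.connect(user="admin", password="secret", dsn="dbhost/orclpdb")

SQL = """
SELECT tablespace_name,
       ROUND(SUM(bytes)    / 1024 / 1024) AS current_mb,
       ROUND(SUM(maxbytes) / 1024 / 1024) AS max_mb
FROM   dba_data_files
GROUP  BY tablespace_name
ORDER  BY current_mb DESC
"""

with conn.cursor() as cur:
    for name, current_mb, max_mb in cur.execute(SQL):
        # Flag tablespaces already using most of their autoextend headroom.
        if max_mb and current_mb / max_mb > 0.8:
            print(f"{name}: {current_mb} MB of {max_mb} MB used, nearing capacity")
```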

Finally, data professionals can also evaluate traditional problem areas like performance optimization. For example, are current systems indexed correctly? Are indexes optimized? Partitioned to manage the necessary space? The explosion of data has heightened the need for proper indexes. One reason is that without a proper index a query can fall back to a full table scan, and as tables grow, query times degrade accordingly.

Modeling tools help data professionals determine whether standard indexes are applied across tables. They can also help pinpoint hotspots. The user looks at the data model, locates the hotspot, and then checks to see if it is properly indexed. If it is not, they can quickly take action to resolve the problem and improve performance.
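Continuing the Oracle example, a sketch of one such check: querying the USER_TABLES and USER_INDEXES dictionary views for tables that have no index at all. The connection details are again placeholders.

```python
import oracledb

conn = oracledb.connect(user="admin", password="secret", dsn="dbhost/orclpdb")

# Tables in the current schema that have no index of any kind.
SQL = """
SELECT t.table_name
FROM   user_tables t
WHERE  NOT EXISTS (
         SELECT 1
         FROM   user_indexes i
         WHERE  i.table_name = t.table_name
       )
"""

with conn.cursor() as cur:
    unindexed = [row[0] for row in cur.execute(SQL)]

for table in unindexed:
    print(f"{table} has no indexes; review its access paths before it becomes a hotspot")
```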

The data explosion starts at the operational level. To remain competitive, the operational databases must remain optimized to meet service-level requirements. At the same time, businesses need to collect data for analysis and reporting purposes. However, capturing operational data and serving that data up for analysis and reporting are two distinct functions. To support both, businesses are adopting data warehousing initiatives that involve the separation of operational and archival data.

One of the greatest challenges of data warehousing initiatives is ensuring that the data in the operational databases and the archival databases remains synchronized. To ensure this, more and more businesses are using modeling tools to analyze their current systems and design their archival data warehouses. This helps businesses build out real-time operational systems that match the archival data warehouse, ensuring that the real-time tables correspond to the warehouse tables. In addition, by creating a mirrored data warehouse, businesses can mark when to move data – either by setting a point in time or a capacity threshold – and use an extraction, transformation and loading (ETL) tool to move the data to the data warehouse.
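A capacity-threshold trigger of the kind described might look like the following minimal sketch, which uses SQLite in place of a real operational database and ETL tool; the orders table, its mirrored orders_archive, and the threshold are invented for illustration.

```python
import sqlite3

def archive_if_over_threshold(con: sqlite3.Connection, row_threshold: int) -> int:
    """Move the oldest rows from the operational table to its mirrored archive."""
    (count,) = con.execute("SELECT COUNT(*) FROM orders").fetchone()
    excess = count - row_threshold
    if excess <= 0:
        return 0
    # Extract and load into the mirrored warehouse table, oldest rows first...
    con.execute("""INSERT INTO orders_archive
                   SELECT * FROM orders ORDER BY order_date LIMIT ?""", (excess,))
    # ...then trim the operational table so it stays within its threshold.
    con.execute("""DELETE FROM orders WHERE rowid IN (
                       SELECT rowid FROM orders ORDER BY order_date LIMIT ?)""",
                (excess,))
    con.commit()
    return excess

if __name__ == "__main__":
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_date TEXT)")
    con.execute("CREATE TABLE orders_archive (order_date TEXT)")
    con.executemany("INSERT INTO orders VALUES (?)",
                    [(f"2003-01-{d:02d}",) for d in range(1, 31)])
    print("archived", archive_if_over_threshold(con, row_threshold=25), "rows")
```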

Once data has been moved, the question becomes: what happens to the data in the operational databases? To maintain optimal performance, it must be offloaded. Generally, data models are not built with offloading in mind, but the volume of data collected on a regular basis now requires that it be offloaded periodically. Before taking this action, data professionals need to determine relationships and dependencies in order to maintain data integrity when it is offloaded. A model-driven approach is key when determining what data to offload: modeling tools provide the means to identify all the relationships and dependencies so that the data professional can offload the related data together.
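To sketch the dependency side of the problem: given a hypothetical map of which tables reference which through foreign keys, a topological sort yields an order in which child tables can be offloaded before the parent tables they reference, preserving referential integrity.

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical foreign-key map: each table lists the parent tables it references.
FK_PARENTS = {
    "order_items": {"orders", "products"},
    "orders": {"customers"},
    "products": set(),
    "customers": set(),
}

# TopologicalSorter emits nodes after their predecessors, so feeding it the
# parent sets yields parents first; reverse for a child-first offload order.
parents_first = list(TopologicalSorter(FK_PARENTS).static_order())
offload_order = list(reversed(parents_first))

print("offload (children before parents):", offload_order)
# e.g. ['order_items', 'orders', 'products', 'customers']
```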

The increasing complexity of managing data requires that teams of multidisciplinary professionals work concurrently on the data models. Today, collaboration is no longer a 'nice-to-have'; it is an integral part of the business workflow. A collaboration server offers sophisticated features that increase productivity and reduce the complexity of managing large data models within teams of designers.

Businesses are taking advantage of collaboration servers to promote team-based modeling, which allows greater administrative control over the creation, monitoring, and administration of user security within the collaborative model server. In addition, repository administrators can use the change-control interface to review, accept, or reject changes proposed for check-in to the collaboration server. Further, the server gives teams the ability to communicate their designs to a wider audience in the enterprise.

Data modeling is no longer simply a mechanism for creating new data structures; it is an integral part of analyzing and deconstructing information. Businesses have embraced a model-driven approach to analyzing their existing systems in order to face today's challenges. A model-driven approach simplifies the analysis of where data currently stands and where it needs to be, and helps to implement an effective transformation.

Modeling tools provide businesses with a visual means to quickly pinpoint areas that cause performance degradation. They offer an effective way to implement sound data warehousing initiatives. And disparate teams of experts, from data architects to DBAs, can use a collaboration server to work together more easily and deliver projects more quickly and with greater confidence.

Most existing software solutions do not offer businesses an easy method for analyzing the current state of their systems and understanding the impact of the extraordinary growth in data. More and more companies depend on modeling tools and collaboration servers to empower their diverse teams in their battles against data explosion.

Instant Messaging: Instantaneous, Yet Controllable


Perhaps surprisingly, the vast majority of Instant Messaging (IM) communication in the workplace today still occurs via public IM networks, rather than through managed, enterprise-class IM systems. This is mainly because IM's origins lie in being a social medium rather than a business application, and as such it has not been taken seriously by many corporate IT departments. Oracle's Mark Hurd has discussed this shift, alongside future cloud projects, and many start-ups in Seattle and around the country are using basic IM solutions to help them communicate within their businesses.

IT professionals need to recognize the potential of this new breed of collaborative tool and develop a plan to enable their company to benefit. However, public IM has proved itself a double-edged sword: on the one hand it is a productivity enhancer; on the other it poses a security threat. Enterprise-class IM systems let IT managers preserve the benefits of public IM while minimizing the risks the public networks introduce.

So what are the advantages of IM? The primary benefits are really in its ease of use and its efficiency of internal communication. IM is invaluable for indicating the availability of co-workers and acts like a ‘typed telephone call’. This saves time and energy, avoiding voicemail ‘ping-pong’ and telephones ringing endlessly on empty desks, interrupting colleagues. Staff can have brief exchanges of information enabling them to manage their time better, streamline business transactions and ultimately diminish the need for business meetings and travel.

Another advantage is that IM is a ‘stealthy’ productivity enhancer because it bridges the gap between e-mail and telephone. Those who have never used instant messaging may still be blissfully unaware of that gap, but employers will see that efficiency and productivity improve significantly where IM is used.

Despite these obvious advantages, there are serious problems associated with public IM communication that cannot be ignored. When employees use public IM systems, they introduce a number of manageability problems for the corporate IT department, some with potentially serious consequences. Enterprise-class IM systems, however, are designed to combat these issues.

The most important area affected by the use of public IM networks is security. IM software is extremely easy to download and begin using, but installation often happens invisibly, beyond the sight of IT management. Communication via a public IM network cannot easily be monitored or controlled, which makes it extremely difficult to enforce company security policies. Attachments can be sent between enterprise networks without going via a firewall, and therefore could transmit viruses. These attachments could be intercepted on the desktop, but this is far from ideal, as most enterprises choose to control virus risk at a security gateway. Also, public IM networks do not allow administrators to protect employees from spam or to control the messaging route, so sensitive information could easily go astray.

However, enterprise-class IM uses management gateways to monitor traffic and perform critical security functions, such as logging and archiving messages and monitoring messages in real time for viruses or key words that might indicate a breach of business policy or security.

Another area adversely affected by the use of public instant messaging is administration. There is no way of regulating the format of the user identification or the use of the company name, as the user identification is the property of the public network. It is therefore possible for individuals to register IM identities in other people's names and use them for deception. Because of these lax controls, there is no company protection from the actions of the user, as the public network cannot be controlled or monitored.

As well as logging and archiving messages through the management gateway, enterprise-class IM systems use directory and authentication systems to identify employees using public IM identities from their workplace. In addition, a message gateway centralizes all IM traffic through a single routing point, controlling the behavior of IM users and capturing all IM messages on a corporate network.
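As a toy illustration of that single routing point, the sketch below authenticates each sender against a directory, archives the message, and only then forwards it. The directory, message format and archive file are invented placeholders rather than any particular product's API.

```python
import json
import time

# Hypothetical corporate directory mapping IM identities to employees.
DIRECTORY = {"alice_im": "alice@example.com", "bob_im": "bob@example.com"}

class MessageGateway:
    """Single routing point that authenticates, archives, then forwards IM traffic."""

    def __init__(self, archive_path: str):
        self.archive_path = archive_path

    def route(self, sender: str, recipient: str, text: str) -> bool:
        if sender not in DIRECTORY:
            return False  # unknown identity: refuse to route
        record = {"ts": time.time(), "from": DIRECTORY[sender],
                  "to": recipient, "text": text}
        with open(self.archive_path, "a") as f:
            f.write(json.dumps(record) + "\n")  # archive before delivery
        self._deliver(recipient, text)
        return True

    def _deliver(self, recipient: str, text: str) -> None:
        print(f"-> {recipient}: {text}")  # stand-in for the real IM network

gw = MessageGateway("im_archive.jsonl")
gw.route("alice_im", "bob_im", "draft contract attached")
```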

The third area disrupted by the use of public IM networks is integration. Enterprise systems such as content filtering and virus scanning cannot be applied when communicating outside the corporate network. Public IM does not integrate with Web or Enterprise Resource Planning (ERP) applications, or allow individual IM functions to be embedded into other applications or portals. The company is therefore powerless to host or control the presence of its employees, or to link with other private IM networks. Information transmitted through public IM networks cannot be added to the central pool of knowledge to enable effective knowledge sharing.

Again, enterprise-class IM has built-in mechanisms to allow greater integration of the IM function. Message gateways help to translate items from one ID format into another, should a company decide to connect internal and external IM networks. Furthermore, an IM application server will enable enterprises to extend IM beyond employee communications into customer relationship management, supply chain management and enterprise resource planning systems. Returns on investment in a managed IM infrastructure are likely to accelerate with the adoption of an IM application server.

The use of IM will rise. The question is not whether it will be used within the corporate environment, but to what extent it will be controlled. IT professionals must find a way to embrace the functionality and benefits of IM, whilst bringing it within the domain of the company network to eliminate security, administration and integration issues. The answer is to migrate from public IM to a managed, enterprise-class IM system that can deliver the best of both worlds.

What are you using to communicate internally in your business?
