Cognos TechTalk Blog

August 09, 2005

The Hard Part

Posted by: Floyd in Product Management
It has been said that what gets measured gets done. But this is not strictly true – managers today are deluged by a sea of metrics that are completely irrelevant to the task at hand. In fact, as early as the 1960’s, academic researchers were discovering that top executives rarely if ever used or relied upon the stacks of reports that were painstakingly produced and dutifully delivered to their desks each month. Instead they were much more likely to rely upon the news, hallway conversations, rumours and gossip in making business decisions.

The truth is that what gets managed gets done, and choosing which metrics to manage is the highest-leverage strategic decision that an organization of any size can make. This indeed is “the hard part”. Choosing what is important is both science and art, and requires attention to an uncommon mix of disciplines – psychology, business strategy, engineering, systems science, and finance. And you must choose. As a rule of thumb, the human short-term memory limitation of 7 items plus or minus 2 is an excellent guideline. You should choose a total of somewhere between 5 and 9, and preferably 5, metrics for your organization to focus on.

So how do you choose the 5 numbers to manage?

Executives can start top down. Begin by asking questions like:

  1. What business are we in?
  2. What is our growth strategy?
  3. If we could choose only one aspect of our business to optimize, which one would be most consistent with our vision and drive the greatest results?
  4. What single metric could we use as a proxy to determine whether this aspect of our business is doing well?
  5. Can we set a target for this metric that will energize and motivate our people?
  6. What will we lose sight of by focusing on this metric? What aspects of our business will falter? What bad tradeoffs and negative behaviours might people engage in if we focus on this alone?

Managers and analysts can start bottom up. Begin by asking questions like:

  1. Of all of the data I receive each month, which metrics would be most interesting to analyze and learn from?
  2. Which of the metrics I have access to would be most useful in causing people to ask interesting questions and learn about what is going on in our operation?
  3. Are there any surprising metrics in my daily reports that merit further investigation?
  4. How would posting a graph of this metric on the wall affect my team’s morale and focus?

The bad news is that even by answering these questions, you can still get it wrong – wildly wrong – like the call center in which support representatives are rewarded for increasing the number of calls handled, and thus hang up on their customers as quickly as possible. Here are some characteristics it takes to choose great metrics:

  1. Truth telling. There are lies, damn lies and then there are statistics. Do not fall into the trap of choosing numbers that make the situation look better than it is. The more transparent a metric makes your organization, the better it is. Follow Jack Welch’s lead and see things for what they are.
  2. Systems thinking. Follow Peter Senge’s advice and look for the first, second and third order effects of optimizing your business around a particular metric. In the example above, if call volume is increased at your call center, what will the impact on customer satisfaction be? What will the effect of that change be on customer loyalty? What will the impact of customer loyalty be on transaction costs? And so on.
  3. Undying curiosity. Understanding why things “are the way they are” is the best way to figure out how to make them better. Sometimes you may want to choose a metric just so that people in the organization will become curious about it. Perhaps the best thing about choosing a particular metric to focus on is that it will inspire passionate debate amongst those who care about making things better.
  4. Story telling. Selecting metrics is part of the process of creating a story that explains why the people in your organization should get up and come to work in the morning. It explains what they should do and why they should do it. The metrics you select become the most tangible, and repeated element of the story as you watch them and talk about them from day to day and quarter to quarter.
  5. Strategic intuition. Often the route to success is not around optimizing an internal operation, but instead requires a complete change in the direction and focus of the organization to access some new dynamic in the external environment. Unfortunately, few people are good at this, a lot are bad, and most winners are simply lucky. Still, it’s useful to have this perspective as you select metrics. What is important is as much about what is outside your organization as it is about what is inside.
  6. Tolerance for imperfection. Numbers will never tell the whole story; they are merely an approximation of reality. Still, these approximations are incredibly useful in focusing attention, and raising questions that drive organizational learning. As you choose your metrics, just be clear that you are looking for the best and most accessible approximation of reality. No set of five numbers will ever completely describe your business. Use them for what they are and move on.

Choose well. Your future depends on it…

August 9, 2005 | Permalink

July 27, 2005

Can SOX reporting = spreadsheets?

Posted by: Doug in Product Marketing

As many companies complete their first year under the strictures of Sarbanes-Oxley Sec. 404, this much is clear: SOX is spurring a wholesale transformation in the Office of Finance at virtually every publicly traded company in the U.S. Sarbanes-Oxley is changing life for corporate finance.

From a purely financial perspective, the cost of compliance is skyrocketing. Financial Executives International (FEI) found that companies will spend an average of $3 million on first-year compliance. For companies with revenues above $5 billion, the first-year compliance cost jumps to $8 million. What’s more, 88 percent of companies surveyed expected to maintain or increase those spending levels.

In many ways, the situation is not unlike the Y2K challenge of a few years ago.  The investments and reinvestments that companies made in ERP, CRM, and other enterprise systems not only addressed Y2K concerns (avoiding painstaking code rewrites), they also drove significant improvements in inventory management, supply-chain efficiency, and other key transaction business process areas.

Yet many companies today still use simple spreadsheets as the system of record for financial planning and management processes.

And with SOX 404 mandating faster closings and quicker production of financial statements, speeding the closing cycle has become an imperative for organizations. In most instances the inefficiency and corruptibility of spreadsheet-based processes – files e-mailed back and forth, surreptitious or inadvertent formula changes, and a general lack of access, data, and process control – render them inappropriate in the eyes of SOX auditors.

Does this sound familiar? Are there many in your organization still hanging onto spreadsheet reporting in this new, hyper-sensitive compliance era? I'd love to hear from you. E-mail me at [email protected].

July 27, 2005 | Permalink

July 13, 2005

BI Competency Centers

Posted by: Tanya in Marketing

Cognos Forum was great this year. The buzz at this year's conference was unlike any other I had witnessed. There's real excitement about the future. I talked to so many customers about their plans, and found that more and more customers are looking for strategies to drive BI enterprise-wide and establish a BI standard.

I got to sit in on a discussion at a 'Birds of a Feather' breakfast on BI Competency Centers (BICCs) that was particularly eye-opening. From the discussion, it's clear that running a successful competency center requires you to be a true renaissance person. According to the attendees, it means being part traffic cop, part advertising executive, and part political strategist.

Some of the perspectives were too good not to share, so I jotted down the following ideas from the attendees:

Playing Traffic Cop
The first task for BICC managers should be to identify BI initiatives already in place within their organizations. This can be a daunting process. Large companies often find as many BI initiatives as there are groups trying to access information. So it isn’t uncommon to find users and decentralized IT teams building siloed BI projects to meet an immediate need.

Playing Ad Executive
Publicizing early wins with a BI standard or demonstrating how a single solution lowers ownership costs can drive a better understanding of a BICC initiative internally, said one participant. This is where managers need to play the role of advertising executive. Using internal communication tools such as newsletters, emails, and Web portals can keep people informed. “If teams can understand why you are making the change to one BI system or centralizing the BI function with a BICC, then the transition isn’t as difficult for users to make,” he said.

Playing Political Strategist
“It’s important to keep the levels of excitement high,” according to one BICC member. But it isn’t easy and the landscape is constantly changing. “You have to stay attuned to the changes in your organization.” Staying abreast of the needs of your business users and particularly the strategic objectives of your executive team is critical to a successful BICC.

Having a mixed team of business and IT stakeholders ensures you are able to “keep your ear to the ground” and assess the changing needs of your business users. One organization’s BICC regularly conducts user satisfaction surveys to gauge user response and stay ahead of requirements.

July 13, 2005 | Permalink

July 05, 2005

Enterprise Information Integration

Posted by: Caroline in Product Marketing

I was giving a presentation the other day to an IT audience. When we started talking about Enterprise Information Integration (EII), I asked for a show of hands. Who was familiar with the advantages of EII? Very few hands in the audience went up.

There’s been a significant amount of vendor investment in EII technologies. No doubt you are reading about the surge of market interest in EII. That’s because the promise is huge. EII allows organizations to combine, join and view information without moving it, wherever that information exists.

You know better than anyone that as your user community grows, the tolerance for data latency decreases and access to a variety of data sources becomes critical. Cognos took the step of embedding EII into Cognos ReportNet to enable customers to look at both historical and real-time transactional information for Operational BI, and to run federated queries across a multitude of data sources – all without having to move that data. In addition, data integration vendors are embedding EII technology to allow virtual prototyping and virtual data integration that assist in the physical implementation of a data warehouse.
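The federation idea can be sketched minimally. This is an illustration only: the table names, products, and the in-memory SQLite databases standing in for remote sources are all invented, and a real EII layer pushes queries down to each live source rather than scanning them locally – but the essential point survives: the data stays where it lives, and only the small result set moves.

```python
import sqlite3

# Hypothetical sources: a "warehouse" of historical totals and a live
# "operational" store of today's orders. In a real EII deployment each
# would be a separate remote system; here both are in-memory SQLite DBs.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_history (product TEXT, units INTEGER)")
warehouse.executemany("INSERT INTO sales_history VALUES (?, ?)",
                      [("widget", 120), ("gadget", 75)])

ops = sqlite3.connect(":memory:")
ops.execute("CREATE TABLE orders_today (product TEXT, units INTEGER)")
ops.executemany("INSERT INTO orders_today VALUES (?, ?)",
                [("widget", 5), ("gadget", 2)])

def federated_units(product):
    """Combine historical and real-time views at query time: the data
    stays in its source systems; only the small result set moves."""
    hist = warehouse.execute(
        "SELECT COALESCE(SUM(units), 0) FROM sales_history WHERE product = ?",
        (product,)).fetchone()[0]
    live = ops.execute(
        "SELECT COALESCE(SUM(units), 0) FROM orders_today WHERE product = ?",
        (product,)).fetchone()[0]
    return {"historical": hist, "real_time": live, "total": hist + live}

print(federated_units("widget"))
```

The query result blends the historical 120 units with the 5 sold today, without either source ever copying its rows into the other.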

So I’ll pose the question to you. What does EII mean to you? Are you using EII technologies today? Do you see the value? What business problem is EII solving for you?

I’ll be giving a presentation through TechTalk called “Data Warehouse Healthcheck” with Margy Ross, president of the Kimball Group. It will discuss the value of EII not only from a BI perspective but also from an ETL perspective, and will be available via Cognos TechTalk on July 28th. Look out for it, or drop me a line at [email protected] and let me know your thoughts.

July 5, 2005 | Permalink

June 29, 2005

Headcount Analysis

Posted by: Doug in Product Marketing

Are you challenged by headcount analysis? I was speaking to an IT person at a conference the other day who called headcount analysis “the albatross around his neck.”

Human resources reporting and analysis can be a huge pain point for organizations that are looking to give their users an easy, single view of their human capital investments. For many IT organizations, especially those using PeopleSoft, headcount analysis is indeed an 'albatross' that can leave them unable to serve up the HR information their management team is demanding.

Especially in a time of market consolidation, it is more important than ever to provide the necessary HR metrics, reports, and plans to generate efficiencies and make strategic decisions around staffing, retention and talent development. More importantly, in a time when ‘talent is king’, I’m finding that IT is being asked to drive priorities and key metrics across HR to answer the key question: are we making progress on issues like retention, benefits expense and career development programs?

Cognos is putting significant focus on human resource analysis, providing packaged applications in this area. But I’d like to hear more from you. What are you doing to meet these challenges? Is headcount analysis a challenge for you? What are you being asked for?

Drop me a line at [email protected].

June 29, 2005 | Permalink

June 21, 2005

BI to make teams work

Posted by: Floyd in Product Management
Some years ago I had the opportunity to provide BI support to the management team of a 400 person organization that was struggling with performance.  I remember well the first meeting I went to, where the business unit head expressed his dissatisfaction, and then each of the regional managers went around the room in turn, making excuses for their poor performance.  Some then left the room to get on the phone and shout at their staff.  What a complete and utter waste of time!

It's tempting to think that these managers, or the millions of others like them that we meet every day, are simply incompetent.  But in this case, the problem was systemic.  These managers simply did not have the information they needed in order to engage constructively with each other.  In fact, what had happened prior to that fateful meeting was that each regional manager had cobbled together their own summary of performance, and the business unit head had received the only organization-wide view of the data the evening before.  This group did not have a shared perspective on the situation, and did not have the detailed analysis that would help them ask the right questions of each other.

Six months later, this team was firing on all cylinders.  They were accustomed to getting a detailed information package analyzing activity in every office in every region several days before each meeting.  During their monthly management meetings, instead of making excuses, they could collaborate to understand why offices with similar characteristics in different regions performed differently.  In fact, they soon began moving work around between regions to optimize total organizational performance.  It was a pleasure to meet, ask key questions and collaborate on productive solutions.

Of course it is not always this easy.  Some information cannot be shared broadly for security or regulatory reasons.  Some management teams will continue to behave in dysfunctional ways regardless of the information they have before them.  But over time, persistent presentation of the best possible information about the true state of affairs has the potential to completely change an organization for the better.  Why not try it in yours?

June 21, 2005 | Permalink

June 14, 2005

What’s in a name?

Posted by: Colin in Product Management

Shakespeare once wrote “What’s in a name? A rose by any other name would smell as sweet…” I wonder if a software product by any other release number would smell as sweet. Many people won’t purchase a 1.0 release of any software product. They figure it is likely to have quality problems and functional gaps. Too much is unknown. They will often talk about the “bleeding edge of technology”, and they don't want to be that far out in front.  Fortunately for fledgling software builders, there is also a minority of “visionary” customers who actively look for innovative products that can give them a competitive edge.

Similarly, some customers won’t upgrade to a “dot-zero” release, such as 5.0, in the expectation that it will introduce new features, but also new problems. They prefer to wait for the subsequent dot-one or maintenance release to shake out the teething troubles. Some years ago I worked with a vendor who decided that every even release (4.0, 4.2, 4.4 etc.) would introduce new features, while odd releases (4.1, 4.3 etc.) would only contain quality fixes. It’s not a bad strategy, but they made the mistake of admitting this to their customers. As a result hardly anyone installed 4.2, so hardly any problems were reported, and 4.3 was not really any better than 4.2.

At the other end of the scale, many people are wary of a software product with a release number over 12. There’s an immediate suspicion that it is inherently based on outdated technology, and that it will have acquired a lot of idiosyncrasies over the years that can’t be changed because of the mass of supported applications.

Sadly the outcome of these prejudices is that numbering product releases has become a marketing issue, much like naming them, which can make any inferences from the number unreliable.  As I said, "What's in a name?"

June 14, 2005 | Permalink

May 26, 2005

How real is real time anyway?

Posted by: Tanya in Marketing

How real is real time anyway? And when it comes to wireless devices is it really 'bye, bye daily horoscope, so long stock ticker …and hello business-critical information?' Are people really using their wireless devices to drive corporate performance? 

The words ‘just in time information’ have taken on an entirely new ring lately. We’ve heard a lot recently about the importance of getting information fast, at the source, anytime and anywhere. But something was missing in all of those discussions – a fact that becomes more and more apparent every day.

What about business analysis? What about the powerful intersection of business analytics and alerting? What information do you actually need? What are the trends inherent in your business? For well-supported BI users, event detection is the means by which they’re proactively notified of an anomalous event – such as a metric that has exceeded its pre-defined boundaries or thresholds – typically by e-mail.

With guided analysis, the notification is more than simply “something’s wrong.” The notification can also contain context-sensitive links that take the user to specific, relevant information about the specific issue and guide them through additional related information.  The alert notification can also be programmed to automatically trigger a contextual action at the click of the recipient’s mouse.
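The event-detection-plus-guided-analysis pattern described above can be sketched in a few lines. To be clear, the metric names, thresholds, and report URL below are invented for illustration; a real BI platform evaluates such rules server-side and delivers the alert by e-mail, but the logic is the same: check each metric against its pre-defined boundaries, and attach a context link so the recipient lands on the relevant detail rather than a blank dashboard.

```python
# Hypothetical metric boundaries: (lower bound, upper bound) pairs.
THRESHOLDS = {
    "avg_call_time_sec": (60, 300),
    "customer_sat_pct": (85, 100),
}

def detect_events(metrics):
    """Return an alert for each metric outside its pre-defined boundaries,
    with a context-sensitive link guiding the recipient to the detail."""
    alerts = []
    for name, value in metrics.items():
        low, high = THRESHOLDS[name]
        if not (low <= value <= high):
            alerts.append({
                "metric": name,
                "value": value,
                "message": f"{name} = {value} is outside [{low}, {high}]",
                # Guided analysis: link straight to the relevant report
                # (URL scheme is invented for this sketch).
                "link": f"https://bi.example.com/reports/{name}?drill=region",
            })
    return alerts

# A 45-second average call time is suspiciously short for this sketch's
# thresholds - perhaps reps are hanging up on customers - so it alerts.
alerts = detect_events({"avg_call_time_sec": 45, "customer_sat_pct": 91})
```

Note that the short-call-time alert is exactly the call-center pathology a naive "more calls handled" metric would reward; the lower bound exists to catch it.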

Left unguided, analysis becomes paralysis due to the overwhelming amount of data and the nearly limitless permutations for navigating through it.  By contrast, guided analysis accelerates the analysis process, improving productivity and increasing the likelihood of its use.

How many of you are there yet? We want to hear from you. What are your users asking for? What's the reality of real time for you? Contact me at [email protected] with your thoughts.

May 26, 2005 | Permalink

May 20, 2005

Designing for High Use

Posted by: Mark in Product Marketing

I am struck by the amount of technology that surrounds us and have been giving some thought to what makes some things more popular than others. I am thinking of things like cell phones, digital cameras and keyless entry for cars.

All of these devices are used by millions of people each day. They are not “on the bleeding edge” – they have been around for a number of years now. Early on, they had a certain “coolness” just because they were new and different, but now they are popular because they deliver true value to people, value that is separate from novelty.

It seems to me that common factors among these devices are ease of use and portability. If something is going to be highly used, it needs to be available to you when you need it and it’s not a winner if you need to take a manual out every time you want to do something with it.

I think these same factors apply to Business Intelligence software. As BI software has become more pervasive, the value of the information it delivers has come to be appreciated. However, factors such as ease of use and availability are becoming increasingly important.

These days, the most used BI software is web based. Why? Because browsing is how most people now find information, and the web provides a well understood, familiar interface that lets people focus on the information they want, rather than the interface needed to get it. Increasingly, software for use on the Web is designed to be zero-footprint, so that all that is needed to access it is a browser. As this trend develops, we may come to a time when BI information is as easy and accessible as getting into your car.

May 20, 2005 | Permalink

May 10, 2005

The Next Big Thing?

Posted by: Rob Rose, Chief Strategy Officer

Hey, Thrill-seekers!

With all of the consolidation happening in the software industry, there’s been a lot to fuel the imaginations of industry-watchers who already have finely-tuned, high-performance imaginations. The impossible becomes unlikely. The unlikely becomes possible. One of the threads in the buzz among industry pundits kind of goes like this:

“BI and associated technologies focus on extracting and driving business value from structured data and ECM (Enterprise Content Management) enables organizations to manage and exploit their unstructured data and content to benefit the business in a number of ways -- combining the two would provide total data coverage and maximum value.”

Or something like that.

On the surface, it sounds like a good idea; however, providing coverage for all enterprise data assets has little benefit in and of itself.

An Admittedly Structured-Data-Centric Perspective

Rightly or wrongly, we BI folks have historically held certain beliefs to be self-evident:

  • Data that has a close relationship to core drivers of the business is more valuable than data that doesn’t;
  • Data that directly relates to the revenue and cost drivers of the business is most valuable to the business;
  • Aggregated information that reflects the structure and drivers of the business is particularly suited for effective management.

Software vendors of ECM/Text Mining solutions usually make this argument:

  • On average only 20% of the total data in an enterprise is structured, leaving a full 80% unexploited and unmanaged (this is always the first slide after the title slide in their basic PowerPoint presentation). Notionally this is correct, but only a subset of that data is relevant to the business, and not all of it is addressable and accessible by these systems;
  • Many of the key indicators for performance can’t be gleaned from structured data. Items like customer satisfaction and warranty claim trends really live in the comment field of systems that are used by customer service reps and manufacturing floor workers who use a variety of terms and abbreviations to describe similar problems;
  • The knowledge worker’s connection to unstructured data – their email, documents, comments and web pages – is stronger than their connection to structured data, because they spend more time with it and relate to it more easily.

While all of the above is true, a combination for its own sake doesn’t do anybody any real good.

The Case for Careful Combinations

There’s a case for careful combinations of these two worlds as applications merit. Many of the applications may start with either structured or unstructured data and extend the other. For example:

  • Text categorization – use categorization technology to structure text into relational database tables for SQL query tools like Cognos to aggregate, sort, and filter the results into easily consumable groupings and trends;
  • Text mining through CPM artifacts – Enable the growing document, metadata and data assets generated by BI and CPM systems to be “search enabled” for relating to enterprise taxonomies;
  • CPM as part of workflow processes – Place CPM software on top of document workflow processes to instrument the health of the process, and supply BI and CPM assets to support the decisions required in executing the process.
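The first of these combinations, text categorization, can be made concrete with a toy sketch. The categories and keyword rules below are invented, and real text-mining products use trained classifiers and enterprise taxonomies rather than keyword matching; the point is only the shape of the output: free-form comment-field text becomes structured rows that a SQL query tool can aggregate, sort, and trend.

```python
# Invented, illustration-only category rules; a real categorization
# engine would use a trained classifier against a managed taxonomy.
CATEGORY_RULES = {
    "warranty": ["warranty", "defect", "broken"],
    "satisfaction": ["happy", "satisfied", "unhappy", "angry"],
}

def categorize(comments):
    """Map each (id, text) comment to zero or more (id, category) rows
    suitable for loading into a relational table."""
    rows = []
    for comment_id, text in comments:
        lowered = text.lower()
        for category, keywords in CATEGORY_RULES.items():
            if any(k in lowered for k in keywords):
                rows.append((comment_id, category))
    return rows

rows = categorize([
    (1, "Unit arrived broken, filing a warranty claim"),
    (2, "Very satisfied with the service rep"),
])
# rows can now be bulk-loaded into a relational table, where an ordinary
# SQL tool can GROUP BY category and trend warranty claims over time.
```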

And one more – the biggie…

CPM: The Killer App?

One of the reasons that convergence for its own sake isn’t important is that there are very different buying centers for CPM and ECM systems. BI vendors won’t be interested in helping a publisher track down a picture of a kid eating an ice cream cone. ECM vendors don’t care most of the time if the inventory manager can list all out-of-stock items and drive demand trends for them.

But…

When a company has a clear sense of strategy and has crafted their objectives to drive towards it, they assign metrics to ensure that they can track progress toward their planned targets. In the Cognos world, business intelligence is used to understand why things are on or off track, or to reveal opportunity to improve the status of the business. The reach of these CPM systems ideally drives down to everyone with an area of accountability in the organization. 

With that in place, you can imagine a very rich, hierarchical model of business performance that has a high degree of organizational coverage. That set of inter-related plans, metrics and report sets forms the hierarchical enterprise performance taxonomy on which to hang related text mining, content management, community management and high-value workflows.

For example, if you “own” a metric for the company, let’s say “cost per square foot of retail space” for your region, you’ll want to keep all of the BI related to the drivers of that metric nearby. You’ll also want all plans, documents and search access to all related content. You’ll make all of this available to everyone in the organization who contributes to the performance of this metric and to all who make decisions using the status of this metric. This is a very real blending of the structured and unstructured worlds with clear, high-value application benefit.

Top-Down and Bottom-Up: Value at every step

Companies aren’t buying software systems the way they bought ERP systems in 1999. The days of the big-meal commitments and sketchy visibility to ROI are over. Way over. It’s unlikely that a whole-enterprise, top-down implementation of a system like the one described above will be bought and implemented all at once. Companies need to look for a good starting place. Here are some attributes which might help in selection:

  • A clear candidate function, department or process for performance turn-around.
  • A large amount of comment-field data that’s useless to traditional reporting tools.
  • A clear sense of the metrics that drive a function or process.
  • Lots of data and lots of documentation.
  • Disparate data warehouse, content management or collaboration systems.

Cognos and Partners: Completing the Picture

As of now, there’s no one vendor out there that offers the complete story, because of the buying center issue described above. However, each world has a list of converging opportunities as implementations get more productive and look for the next step of value. When you see Cognos picking partners in the ECM/Text space, there’s a clear reason why. Our leading-edge customers get it. They get the value of some of these killer apps and are asking us to work with their chosen ECM/text mining vendor or vice versa.

So, when you hear industry pundits honk about the “inevitable convergence of structured and unstructured data management” ask them about the requirement of senior management to retrieve the ice cream picture, or service companies to report on knowledge assets rather than relate them to projects. It’s not so easy and it’s not so obvious. But there’s room for some killer apps and a ton of customer value in there somewhere.

If you’re interested in knowing more, Cognos, working together with IBM, Factiva and ClearForest, has put together a survey to help you benchmark your peers' use of text analytics technology. It will take you no longer than two minutes to answer eight questions.

Click here: http://www.zoomerang.com/survey.zgi?p=WEB224A54FRGK5 to take the survey and compare your results with others.

May 10, 2005 | Permalink

