

Statistics in Pictures: Illustrations of Why Companies with Statistics Beat those Without

In this article, I want to provide clear insight into why data queries, charts, graphs, and trend lines are often misleading and may support incorrect decisions if taken at face value to gain insight into the workings of an organization, its customers, or its markets.  I then explain why statistics overcome the shortcomings of these reporting methods and why companies that use statistics in their operations enjoy a huge competitive advantage over those that do not.  By the end of this article, we will answer the question, "Can companies that use statistics make better decisions than those that do not?"

FIGURE 1 (above) – Table of Monthly Revenue for Company X

 

Each row of the above table shows the monthly revenue for Company X from October 2008 to April 2009.  As is often the case, Company X asked its analysts to project the likely revenue for the upcoming months of May through August and to report on the general financial health of the company based on this forecast.  The financial manager, however, was quick to comment that revenue was going up and that the company looked strong and was getting stronger.  Based on the above table, do you agree with the manager's comments?

The analysts went off to work.  One of the first tasks they performed was to create a scatter plot of the data.  This plot of revenue by month appears below.

FIGURE 2 (above) – Graph of Monthly Revenue for Company X

In looking at the above plot, do you see a pattern?  The analysts did, and they were ecstatic.  The organization’s revenue appeared to be going up each successive month, and the revenue increase over time seemed to be fairly steep on the plot.

After viewing this plot of the company’s revenue by month, the analysts made a revenue projection by adding a best fit line.  The revenue projection graph appears below.

FIGURE 3 (above) – Best Fit Revenue Projection for Company X

After viewing the above graph, the analysts felt pretty good.  The best fit line seemed appropriate, as the data points were "snaking around" the line, conforming to what the analysts were taught is a sign that a model fits.  May's revenue was projected to be around $7.5M, and by October, monthly revenue would be somewhere around $13M.  The analysts used the graph to complete the table shown in Figure 1, filling in the projected revenue from May to August.  The updated table appears below.

FIGURE 4 (above) – Table of Revenue and Revenue Projections for Company X

After checking the numbers and making sure there were no arithmetic errors, the analysts presented this table of revenue and revenue projections to their manager.  The manager was pleased, since he now had numbers to corroborate the increasing revenue trend he had seen in prior months.  The manager submitted the projection to the CFO, and the company decided to continue doing what it had been doing since October 2008, as the data indicated that revenue was steadily increasing and would continue to increase, reaching a monthly revenue of around $11M by August.
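The analysts' straight-line projection is easy to reproduce outside any BI tool. Here is a minimal sketch using NumPy's least-squares polynomial fit; the revenue figures are invented stand-ins for illustration, since the actual numbers from Figure 1 are not reproduced in this text.

```python
import numpy as np

# Hypothetical monthly revenue (in $M), Oct 2008 through Apr 2009.
months = np.arange(7)          # 0 = Oct 2008, ..., 6 = Apr 2009
revenue = np.array([3.1, 3.8, 4.2, 5.0, 5.6, 6.3, 7.0])

# Fit a straight "best fit" line (degree-1 polynomial) by least squares.
slope, intercept = np.polyfit(months, revenue, 1)

# Project forward: month index 7 corresponds to May 2009.
may_projection = slope * 7 + intercept
```

Because the fit only sees an upward trend, every extrapolated month comes out higher than the last, exactly the optimistic picture the analysts produced.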

The thought process and steps taken in this example are all too familiar.  I have lost count of how many organizations I've worked with that will look at a set of data, visualize a trend, and then use this visualized or plotted trend to determine what lies ahead.  If only life were so simple.

As it turns out, Company X's revenue is shaped by a variety of influences.  Seasonality, competition, suppliers, a changing customer base: all of these could be affecting revenue.  Simply looking at queries, tables, plots, and trends causes us to ignore the dynamic, interconnected world in which our company lives and how various influences wash over us like waves in the ocean.

On my most recent vacation, my wife and I went to the shore.  When I was a young boy, waves often crashed into me and knocked me over.  As I've grown, however, my size has enabled me to withstand the force of small waves hitting me on a calm, sunny day.  But if I were to stand at the shoreline day after day without getting knocked over, could I expect such a trend to continue?  If we applied the same thought process that Company X used to project revenue, the answer would be yes: since I did not get knocked down by a wave day after day while on vacation, the pattern should continue.  However, we can easily take a step back and see that this is likely not the case.  If we take into account hurricane season, moon phase, wind speeds, and weather, it is not difficult to imagine me being knocked over and dragged out to sea by a huge wave if I stood by the shoreline during hurricane season when there was a full moon, high winds, and a tornado nearby.  The same applies in business: we need to look at the world around us and assess how many different variables are interacting with each other, and with us, to generate the performance indicators we depend on.

In the case of Company X, three variables are suspected to be influencing monthly revenue.  These variables appear in the below table as A, B, and C.

FIGURE 5 (above) – Table of Suspected Influential Variables and Revenue for Company X

When A, B, or C change, do you see a pattern in how revenue changes?  To assess whether these variables are truly influencing revenue, and by how much, SAS business analytics software can be used to run a linear regression, as shown below.

FIGURE 6 (above) – Company X data imported into SAS

FIGURE 7 (above) – Company X Linear Regression in SAS

FIGURE 8 (above) – Company X Linear Regression Results in SAS

While the results of the analysis may seem alien at first, they are actually easy to interpret.  The r-squared value at the top tells us how well our statistical model fits the data, or how strongly variables A, B, and C relate to monthly revenue.  An r-squared value of 1 means a perfect fit, whereas an r-squared of 0 means that variables A, B, and C do not relate to monthly revenue at all.
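For readers without SAS, the same style of fit can be sketched with ordinary least squares in Python. Everything below is invented for illustration: the "true" coefficients 0.9, -0.4, and 0.3 are assumptions baked into the synthetic data so the regression has a real pattern to recover, and r-squared is computed exactly as just described.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented data: revenue (in $M) driven by three variables plus a
# little noise.  The coefficients 0.9, -0.4, 0.3 are assumptions.
n = 12
A = rng.uniform(1, 10, n)
B = rng.uniform(1, 10, n)
C = rng.uniform(1, 10, n)
revenue = 0.2 + 0.9 * A - 0.4 * B + 0.3 * C + rng.normal(0, 0.05, n)

# Design matrix with an intercept column; solve by least squares.
X = np.column_stack([np.ones(n), A, B, C])
coef, *_ = np.linalg.lstsq(X, revenue, rcond=None)

# R-squared: 1 minus (unexplained variation / total variation).
fitted = X @ coef
ss_res = np.sum((revenue - fitted) ** 2)
ss_tot = np.sum((revenue - revenue.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

With so little noise, the recovered parameter estimates land very close to the planted coefficients and r-squared comes out near 1, the same signature of a strong fit described above.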

In this case, our r-squared is almost 1, meaning that if we know the values of variables A, B, and C, we can determine monthly revenue almost perfectly!  Below the r-squared measure, we see a table with one column that shows our variables and another that shows their "parameter estimates."  These parameter estimates tell us how much each of the variables A, B, and C impacts revenue.  This output is our regression model.  To estimate monthly revenue, we can rewrite this model as follows:

FIGURE 9 (above) – Company X Linear Regression Model

Using the output from SAS, we determined that monthly revenue for Company X equals 0.159 (the intercept parameter estimate) plus the parameter estimate of A times the value of A, plus the parameter estimate of B times the value of B, plus the parameter estimate of C times the value of C.  We can write this into an MS Excel spreadsheet if we so choose:

FIGURE 10 (above) – Company X Linear Regression Model Pasted into Excel

If we fill the formula down in Excel, we arrive at estimates of revenue not just for the previous months but also for the upcoming months of May through August.  We can then compare the revenue estimates of our SAS model for May through August to the revenue estimates of our best fit line.  These results appear below:

FIGURE 11 (above) – Comparison of Best Fit Line to SAS Prediction
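Mechanically, "filling the formula down" just means evaluating the fitted model on each future month's inputs. Here is a sketch of that step; the 0.159 intercept is the one reported above, but the A/B/C coefficients and the May-August input values are hypothetical stand-ins chosen to mimic a falling forecast.

```python
# Evaluating a fitted regression model on future months, the way the
# formula is "filled down" in Excel.  The 0.159 intercept comes from
# the article; every other number here is a hypothetical stand-in.
intercept = 0.159
coef_a, coef_b, coef_c = 0.9, -0.4, 0.3

future = {                      # hypothetical (A, B, C) per month
    "May": (5.0, 8.0, 2.0),
    "Jun": (4.0, 9.0, 1.5),
    "Jul": (3.0, 9.5, 1.0),
    "Aug": (2.0, 10.0, 0.5),
}

projection = {
    month: intercept + coef_a * a + coef_b * b + coef_c * c
    for month, (a, b, c) in future.items()
}
```

With these made-up inputs the projection falls month over month: the model responds to the drivers A, B, and C rather than to the calendar, which is exactly why it can disagree with a time-only trend line.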

The results of our SAS model tell us something completely different from our best fit line.  In fact, the results oppose one another!  Based on the results of our best fit line, we had determined that revenue would increase in the months of May through August.  However, our SAS prediction says that our revenue is about to plummet in the months of May through August.  Looking at an overlay plot of the two methods better illustrates how different the results are:

FIGURE 12 (above) – Graphical Comparison of Best Fit Line to SAS Prediction

In the above plot, the blue line illustrates the results of the best fit trend and the pink line represents the results of the SAS model.  In one case, it looks like everything is going well for Company X and the company should stay the course; in the other, Company X may be on the verge of crashing.  Which is correct?

As it turns out, variables A, B, and C are fictitious and were created to be directly related to revenue for illustrative purposes.  The relationship is: Monthly Revenue = A divided by B times C.  If we look back at the values, it will become apparent that this is the relationship.

 FIGURE 13 (above) – Relationship between Monthly Revenue and A, B, and C

In the real world, we rarely know the true relationship between variables.  We can, however, use statistical methods to estimate these relationships.  How accurate are statistical methods at uncovering these unknown patterns?  If we go back to our plot from Figure 12 and add in the actual relationship, the power of statistics quickly becomes evident:


FIGURE 14 (above) – Graphical Comparison of Best Fit Line, SAS Prediction, and Actual

Unfortunately, using classic BI reporting methods, the CFO made the wrong decision, and this wrong decision could have cost him his job and the company its solvency.  The queries, tables, and plots all showed that revenue was steadily increasing.  Even if one carefully queried, charted, and plotted variables A, B, and C, their relationship with revenue would likely be missed.

Using regression, SAS, without knowing the true relationship between our variables, was able to decisively detect the relationship occurring behind the scenes and use it to predict that Company X was on the verge of catastrophe!  Had Company X used statistics to augment its reports, the CFO might have decided to change which marketing strategy was being used, which supplier he was purchasing from, or which customer demographic he was targeting, all variables that may be represented through A, B, and C.

While we worked with only three variables in this case, A, B, and C, businesses tend to be influenced by many more variables that are often related in much more complex ways than our simple formula shown here.  Yet even in this simple example, it would have been difficult to detect the underlying pattern visually.  While linear regression worked just fine here, more powerful methods are often necessary in business to accurately detect the patterns driving key performance metrics.  These methods include time series analysis, neural networks, decision trees, cluster analysis, and the like.  Luckily, tools such as SAS provide an environment that makes these methods available in a clean, graphical interface designed with business application in mind.

The use of statistics, whether in marketing, healthcare, insurance, energy, or elsewhere, gives the user a powerful competitive advantage over organizations using BI reporting alone to drive decisions.  Furthermore, adding a statistical capacity to an organization tends not to be all that challenging with the appropriate guidance.  For those organizations that would like to become more profitable using methods such as those described in this article, my colleagues and I are available to help guide the building of a robust analytical capability that will leave competitors in your organizational dust!

Timothy D’Auria serves as SAS Analytics Practice Manager for Creative Computing, awarded as a top data consulting firm in the Northeast US.

For more information, feel free to contact Timothy D’Auria at 401-727-2400.

 


Cognos Active Reports

By Itzik Maoz, Director of Business Analytics Practice for Creative Computing, Inc.

One of the most rewarding aspects of teaching students about Cognos Business Intelligence is to witness their enthusiasm when they grasp the software’s possibilities and power. They don’t need to be IT professionals to learn how to extract vital information, analyze it and create meaningful reports.

I’m especially excited about Cognos’ ability to produce active reports. This unique format lets you create dashboards for a comprehensive view of multiple aspects of your vital information, and you can see it on your tablet, Android smart phone or computer. You can also run active reports in a variety of browsers. You get information right when you need it, even if you are not connected to the Internet or your network, so that you – or your customers – can make informed decisions.

With Cognos’ active report feature, the information is presented in a very dynamic way. Active reports let you use a variety of interactive prompts and charts, including bubble charts and trend charts. You can view different types of information, such as sales by region or by product, on one dashboard. With so many options, it’s easy to customize the reports.

You can also quickly process information in an extremely visual way, especially if you’re using IBM’s Rapidly Adaptive Visualization Engine (RAVE), which makes the visualizations interactive and animated. It’s not just a static report. It visually changes, moves and shifts on command. For example, you can visually observe how your different product lines shift as you slide the time slider with your finger on your iPad.

Create and Interact
In a LearnQuest class, we talk about how to create active reports and how to interact with them. It’s a combination of demonstrations and hands-on work. We discuss the features and the concepts. Students practice using sample data – typically sales data because it’s easy to understand.

Before taking any Cognos class, it helps tremendously if you review the prerequisites. You’ll simply get more out of the class if everyone enters at the same level.

I’m a trainer in these sessions, but I can also share my knowledge as a consultant and a frequent user. I help people understand how the software relates to each student’s industry or job. Cognos is used in all industries and at all organizational levels. My students come from a variety of industries, with the majority from Finance and Sales departments.

Sharing my knowledge is part of my personality. Showing people how to get the most out of this great technology so they can grow their company gives me a lot of satisfaction.

About the Author
Itzik Maoz is Director of Business Analytics Practice for Creative Computing and has more than 15 years of Business Analytics experience. Based in Providence, R.I., he has worked with many of New England’s most recognizable companies in Manufacturing, Retail, Healthcare and Higher Education. He is proficient in the entire IBM Business Analytics stack and is well versed in the entire Business Analytics project lifecycle.


Competitive Advantage: Business Analytics Provide Critical Tool

By Itzik Maoz, Director of Business Analytics Practice for Creative Computing, Inc.

Business analytics help organizations achieve many different strategic objectives — to compete in the marketplace, grow sales and retain customers, enforce integrity and security, improve business capacity, satisfy consumer expectations, learn more about the organization and/or implement necessary changes.

Whether we’re aware of it or not, we’re seeing business analytics at work on a regular basis in our daily lives. One of the most common examples is market basket analysis. Think of Amazon’s feature that tells you exactly what customers who bought a certain product were also likely to buy. This kind of data mining uses product and interest associations to predict groups of items that are more likely to be bought together.

Retailers use this method to stock shelves so that an item such as bread sits in close proximity to an item such as milk, since the two are likely to be purchased together.
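At its core, market basket analysis starts with counting how often items appear together in the same transaction. Here is a toy sketch of that idea; the baskets and item names are invented, and production systems use association-rule algorithms (such as Apriori) on far larger data.

```python
from collections import Counter
from itertools import combinations

# Invented transactions; each basket is the set of items bought together.
baskets = [
    {"bread", "milk", "eggs"},
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "eggs"},
    {"bread", "milk", "butter"},
]

# Count how often each unordered pair of items shows up in one basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(basket), 2):
        pair_counts[pair] += 1

# "Support" of a pair = share of all baskets containing both items.
support = {pair: n / len(baskets) for pair, n in pair_counts.items()}
```

Pairs with high support (here, bread and milk appear together in three of the five baskets) become candidates for shelf placement or "customers also bought" suggestions.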

Another important type of analysis looks at “churn” rate, the number of customers who cut ties with a company in a given time period. Churn analysis helps companies distinguish the characteristics of likely-to-leave customers. They can then use that data to predict which customers are likely to churn in the future and offer incentives to the most profitable among them to keep their business.

In social media analytics, companies scan blogs, forums and comment sections to find out customers’ sentiments about their products. This data mining process identifies key words that might signify “good,” “bad” or “neutral” opinions. This is an emerging area, but the technology for analyzing language is improving all the time.
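A toy version of this keyword approach can be sketched in a few lines. Real social media analytics relies on much richer language models, and every word list and comment below is invented, but the core matching idea is the same.

```python
# A toy keyword-based sentiment tagger.  The word lists are invented;
# production systems use far more sophisticated language analysis.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "terrible", "hate", "broken"}

def sentiment(comment: str) -> str:
    # Normalize: lowercase, drop basic punctuation, split into words.
    words = set(comment.lower().replace(".", " ").replace(",", " ").split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "good"
    if neg > pos:
        return "bad"
    return "neutral"
```

For example, `sentiment("I love this product, it is great.")` tags the comment as positive because two opinion words match the positive list and none match the negative one.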

Anyone with a credit card has probably come face to face with fraud detection analysis. If your bank has flagged an activity, say, a purchase at a store in a state you’ve never visited, it is using patterns from previous fraud to raise that red flag. These processes have been improving and getting more efficient, so that now you may get a call within a very short time after a suspicious activity has occurred.

Personally, I’m most excited about healthcare data analysis applications. This is an area that can impact many lives, prevent illnesses, improve quality of life, shorten the treatment cycle when an effective drug is identified early in the process, etc.

This kind of analytics, of course, has much higher stakes than the previous examples. Think about the consequences of a false positive (a patient is diagnosed with a condition that he does not have) or a false negative (a patient whose condition goes undiagnosed). But with a careful and meticulous process, the risks can be mitigated.

If you are interested in learning more to understand what data science is all about, there are many books that can enlighten. For a technical understanding and implementation, IBM SPSS Modeler is a great way to get familiar with the subject without the need to be fluent in complex statistics. You can see a recommended LearnQuest course here.

Almost every organization can benefit from business analytics. While nobody can completely predict the future, companies that use analytics have been proven over time to be much more successful at what they do. The key is to identify which type will make strategic sense and to make the investment necessary to be a leader in your industry.



IBM’s SPSS Modeler

By Itzik Maoz, Director of Business Analytics Practice for Creative Computing, Inc.

It wasn’t that long ago that companies looking to analyze data would hire a Ph.D. or otherwise experienced statistician to sit in a back room and write long lines of code with complex algorithms. That’s no longer the case, now that IBM’s SPSS Modeler is around to help business analysts gain data-based insights.

SPSS Modeler is an extensive predictive analytics platform that allows business people who don’t necessarily have any statistics background to perform powerful analytic processes and integrate these processes with their decision management.

What IBM has done is simplify the user interface with an intuitive design. Nodes, which are graphic icons, perform tasks used for data preparation, modeling, exporting and importing.

SPSS allows the user to try out a variety of modeling approaches and compare the results. It can be extended beyond your typical numerical and categorical analysis to provide tools for text analytics to analyze social networks, for example.

In my LearnQuest classes, I show students what they need to know to get the most out of SPSS Modeler: a sense of the lifecycle and best practices of analytics, and the ability to collect the right data while cleaning out “noise” to reduce inaccuracies and anomalies. We examine a few models, talk about the uniqueness of each one, and discuss how we can leverage multiple models to enhance our predictive accuracy.

The Intro to Modeler class is focused on the process of data manipulation, but the more advanced courses examine specific techniques of analytics and their predictive applications.

Along the way I work with students to adapt the application to their individual needs, and by the end they come away empowered by sophisticated methods that traditionally required advanced degrees (with none of the student loans).



Smart Shortcuts

By David Pacific, Education Services Practice Lead, Creative Computing, Inc.

Who among us doesn’t want to save time? Who among us wouldn’t want to know that there are shortcuts in the system they’re already using, shortcuts that could dramatically simplify their lives and level of effort?

I’ve participated in countless Cognos Report Studio projects and often find that the client is asking for help to do what I like to call “monkey work” — changing report titles, headers and footers, logos or simple text elements. Most of these clients are unaware of a very powerful concept in the Object Library, which is why I think it’s imperative to discuss this capability here and share this knowledge with the masses. This can save days of editing and maintenance when tedious template or layout changes are required.

With Layout Component References, Report Studio allows you to reuse objects from other reports. You can use this “library” — actually a report itself — to store commonly used objects, such as headers, footers or a complex report template for a specific set of tables containing commonly used report elements. This allows standards to be utilized across an enterprise so there is no longer a question of what logos, color schemes, layout calculations or prompt tables should appear or function.

I have personally used and implemented an Object Library on many of my projects to store all of the prompt tables used in the design of multiple reports. Report development becomes very quick when you create a “self-service” template that contains all of the possible permutations of commonly used report objects. Instead of creating new report content from scratch, developers start from a contained template built on Layout Component References and simply remove the components they do not need.

What’s more, with an Object Library you can make a change to an element that’s referenced multiple times, and the change will be propagated across all locations without even having to open any of the reports that contain the reference. Rather than opening and editing hundreds of reports to repeatedly change text or a header, you can make that change once, and it will appear in every subsequent report that includes the reference. This creates consistency across all reports, which is always crucial for successful report implementation. If you need to eliminate a referenced object from a new report, you can always override it, while still keeping the object in your library for later use.

It sounds so obvious, but once leveraged, this feature can save you hours and hours — and possibly even weeks — of development and maintenance time. After all, there’s no need to constantly reinvent the wheel when there’s content you can simply reuse, maintain and update once. Put this best practice in place and you’ll see the immediate benefit and value added from a concept that all Report Studio developers should be utilizing.

About the Author
As Creative Computing’s Education Services Practice Lead, David Pacific oversees all training engagements and participates in analysis and requirements gathering for all potential training opportunities. In this role, he coordinates to identify training needs and matches them with scheduling and courseware. He also has hands-on development skills, so when he is not overseeing or delivering trainings he is working on implementations for a multitude of Fortune 500 clients. This has helped him develop an excellent mix of skill sets spanning the classroom and real-world development work, which has proven to be a valuable asset as a trainer.


On Board with Dashboarding: How to make the most of this dynamic visualization tool

By David Pacific, Education Services Practice Lead, Creative Computing

In business intelligence everything is trending toward the ability to gain quick insight — analyzing accessible data that allows you to make good decisions immediately. Enter the dashboard, a tool we are seeing more and more organizations utilize.

A dashboard can mean different things to different people, but typically when we define a dashboard in terms of business intelligence it is a group of graphical representations of any information that is important to an enterprise. A dashboard should tell a meaningful story with actionable insight. Its real strength is being able to display similar information in multiple formats — each provoking its own emotional response from the analyst.

There is a lot more to dashboarding than meets the eye. It is not simply a graph or chart, but a customizable visualization tool. This of course comes from a fundamental understanding of the toolset and its capabilities, as a dashboard consists of individual elements existing in pre-developed or managed reports. Having this understanding is a prerequisite for being able to develop effective dashboards, which is why I first guide students into Report Studio courses.

When we participate on dashboard projects with organizations, we usually start the conversation with data governance and standards. In order to put anything on the dashboard, all of the stakeholders need to be in agreement about what data should be presented. Getting that buy-in from the earliest stages of development will ensure that the dashboard will be effective for enterprise use. In a typical organization there are a multitude of data sources and a multitude of revenue sources, but each stream of data is going to carry its own implications, so you want to make sure the dashboard reflects the most relevant and comprehensive information.

What you are looking for is to develop key performance indicators or metrics that provide an immediate health check for the organization: a high-level snapshot that executives, sales or marketing can use. If something looks awry, they can then view it in more detail and either act upon it or pass the insight along to someone who can.

It is up to the developers to create an environment that provides the most insight. You want a dashboard that is as flexible and dynamic as possible. You want an intuitive interface that allows you to pass filters and change values to view different angles of the data at a moment’s notice. The best principle for design is to keep views of the data as simple as possible.

At the end of the day the dashboard is only going to be as good as the data it reflects. So get the buy-in, create some enterprise standards, and you’re on your way to dashboard success.




Achieving design that is for the business, by the business

By Sara Ricketts. Principal Performance Management Consultant, Creative Computing

What’s most important to a developer in Business Intelligence (BI) implementation: Data accuracy? Performance? Look and feel?

The truth is, all three are important, but disproportionate weight is often given to the first two. This is a very common mistake in approaching BI design. As developers and modelers, we need to remember that the end result we’re aiming for is a satisfied, self-sufficient business user, not just a data model and some associated reports/dashboards.

A major key to success is ensuring the BI project lead has a clear understanding of how stakeholders will use the information, even before any data modeling or design takes place. To facilitate this, you should embed targeted business stakeholders within the project team at an early date, albeit in a very limited, part-time capacity. Their initial role is to offer input into the analysis phase of the project, describe the business process and delineate current challenges.

It takes more than just examining current reports and assuming that will be enough. What is the business value of simply replicating what they already had, but in another tool? Very little, if we’re being honest. What we need to strive for is a better experience for the end user — but only people involved with the business can define “better.”

Our role as developers is to be the bridge between the business and the data, both in terms of the assets that we build, and the conversation around requirements. It also means understanding the business process, albeit at a high level: More often than not, business users aren’t fully aware of all of the capabilities of the BI tool, so asking “What should this report look like?” almost guarantees that you will end up with a design that does not offer the true value of the BI solution.

The better approach is to listen to the business stakeholder describe how they would use that information. With that input you can translate your knowledge of the business process into the BI design. For example, “Instead of a long list of those orders, would you like to be able to click on a bar chart for that off-target revenue and see all the detail in a popup window?”

Most likely you’ll hear something like “Wow! It can do that?” Excited business users, who have played an active role in the design, can become future BI champions. In the end, that serves our ultimate goal: delivering a high value Business Intelligence offering as the single, actionable version of the truth.


Using Prompt Token Syntax to Create Efficient Reports

Helpful Technical Tips for Cognos 8 Users

Users appreciate flexible reports, and so we provide them with all sorts of prompts and calculations.  A frequent request is to let users choose which data to include, to group or sort the report in various ways, or even to decide which columns the report should include.  If you used to be an Impromptu report writer, you might have done this with If-Then-Else calculations that determined the sort value based on the user’s selection.
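The old-style approach the author alludes to might look something like this (a sketch; the parameter name sort_choice is invented for illustration):

```
// Every candidate column is referenced in the expression,
// so every underlying table ends up joined into the query:
if (?sort_choice? = 'Product') then ([Product line])
else if (?sort_choice? = 'Manager') then ([Level 2 manager])
else ([Order method])
```

The drawback is that all three query items appear in the expression regardless of the user’s choice, which is exactly the inefficiency the token technique below avoids.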

Let’s look at several report outputs:

Revenue by Manager

Revenue by Order Method

Revenue by Product Line

One of the great features of Report Studio is that even if there are a lot of items in a query, only the items that are actually included in a report are retrieved from the database (or cube).  It would be terrific if when a user chose to report on Manager, Order Method or Product Line, only the required field would be included in the query.

In Cognos 8, the prompt/token syntax can be used to do this very efficiently.  Most prompts return a data value – Acme Inc., 9/15/2008, etc. – which is then used in filters and calculations.  A token prompt doesn’t return data; it returns a report “object” – for instance, a query column.  When the user makes a choice from a list of token options, only the item they choose is included in the query.  That way, if the different values come from different tables, only the required tables are touched.

Let’s look at the query items for the above reports:

The first five items are pulled directly from the GO Data Warehouse (query) package.  Product line, Order method and Level 2 manager are not included in the list report.  Only the calculated column Dimension Column is included:

And here’s the magic – the prompt/token syntax:
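The expression was shown as an image in the original; based on the description that follows, the Dimension Column calculation in its simplest form is:

```
#prompt('dim_column', 'token')#
```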

This is the syntax in its simplest form.  Here, dim_column refers to a parameter that I’m making up – it could be called anything; including it in the prompt syntax automatically creates the parameter.  The word token, in single quotes, is a reserved word and must appear exactly as shown.  It tells Report Studio to substitute the object that the prompt returns.  In our example, we have included several dimension columns in the query – Order method, Level 2 manager and Product line.  To include them as objects, the values returned must be surrounded with square brackets – just as they would look if they were part of a calculation.  So, when prompted for a dim_column, if I type in [Product line] (exactly matching the query item – note the lowercase “l” in “line”), this report is produced:

Note that the column header says Dimension column – we’ll fix that in a moment.  First, to make the report friendlier for the user, a prompt should be defined – this could be on the report page or prompt page.  A value prompt will be created, with static choices.

Once the value prompt is dragged onto the page header, you are asked if you want to use an existing parameter.  I chose dim_column.

Then I just hit the Finish button to create the prompt.  The next step is to create the static choices (these could also be pulled from a query, but they must be in the correct format).  They should look like this:


Note the use of the square brackets in the Use column.  This refers to items that we’ve already added to the query.  It would also be acceptable to put in fully qualified items from the package, e.g. [Sales (query)].[Product].[Product line].   If that approach is used, the items don’t even need to be added to the query – although I find it simpler and more transparent to have the items included in the query.
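For reference, the static choices described above pair a bracketed use value with a friendly display value, along these lines (a sketch based on the query items named earlier; the display labels are my own):

```
Use value            Display value
[Product line]       Product Line
[Order method]       Order Method
[Level 2 manager]    Manager
```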

It’s also wise to add a default selection.  This can be done in the prompt syntax, but here it’s being done in the Report Studio GUI:

Finally, set the prompt type to “auto-submit”.  With that setting, as soon as the user changes the value, the report is rerun.  That’s especially helpful if the prompt is on the report page.  The report now looks like this:

Finally, let’s change the column header/title.  To do that, you must unlock the report using the lock/unlock icon.

Once it’s unlocked, you can simply change the heading from a Data Item Label to a Report Expression.  The expression is simply ParamDisplayValue(‘dim_column’) – it returns the display value, as defined in the static choices, that the user selected for our value prompt.

Now, when the report runs, it looks like this:

Note that the header shows Level 2 Manager – the same as the prompt value.  It will change dynamically when a different value is chosen.

A similar method can be used for sorting a report.  Our report query will be the same as the original report, including the Dimension Column.  The difference is that all of the columns will be included on the list report.

The report looks like this:

We’ll make two changes to the list report – forcing a sort on Dimension Column, and hiding that column.  Both are done in the normal Report Studio GUI.  When you set the column’s Box Type to None to hide it, be sure to do the same for the column title.

When this report is run, the list will be sorted by whichever column the user chooses:

Voila!!!

There are other ways to experiment with this syntax too.  The full syntax is:

#prompt('parameter_name','token','defaultvalue','prefix','','suffix')#

For instance, #prompt('adv_dimension','token','Product line','[','',']')#.  By adding a prefix and suffix, we can skip the opening and closing square brackets in the prompt values – which might also allow them to serve as filters – Product line instead of [Product line].  I’ve used suffixes to employ one prompt to drive several calculated columns.  For instance, rather than simply showing Product line for Product, let’s show Product line and Product type.  For Manager, we’ll also show two levels, Level 2 Manager and Level 3 Manager.

In this case, we’ll add items to the query and name them predictably – e.g. Product-1, Product-2, Manager-1 and Manager-2.  The static items in the prompt are just ‘Product’ and ‘Manager’.  The prompt syntax is:

#prompt('dim_column','token','[Product-1]','[','','-1]')#

Here, when the prompt returns Product, with the prefix and suffix added on, it becomes [Product-1].  We’ll add two of these calculated prompt columns; the second one is identical, but it points to the next level:

#prompt('dim_column','token','[Product-2]','[','','-2]')#

Which will point to [Product-2].

Here’s how the static choices look:
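The screenshot of the static choices is not reproduced; based on the description above, they would pair the plain use values (which gain the bracket prefix and level suffix from the macro) with matching display values, roughly:

```
Use value    Display value
Product      Product
Manager      Manager
```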

Since we’ve defined a default prompt value, it doesn’t have to be done in the Report Studio prompt GUI, but if you prefer to do it there, don’t forget to change the default selection in the GUI to Product.

Here’s the query:

Finally, here’s the report – note that the column headers haven’t been addressed yet.

Hopefully this gives you some idea of the power of this approach.
Good luck!

Josh Lobel
Creative Computing, Inc.
401-727-2400
www.mycci.com