JMP and SAS present the Innovators' Summit 2008: The knowledge exchange for analytic excellence. September 24-26, 2008 - Sofitel San Francisco Bay

Abstracts

Day 1 Plenary Speakers

Self-Sustaining Innovation: Ideas for Creating a Data-Driven Organization

Avinash Kaushik, Analytics Evangelist, Google

Innovation used to live in isolated pockets of the company, empowering a few people here or a team of folks over there. The Web and 2008 are changing that in every way imaginable, posing a fundamental assault on our old way of decision making. In this session, Avinash Kaushik presents insights on how to create a data-driven organization through a mix of cultural approaches, the destruction of some currently worshiped sacred cows, tools, skills and mindsets. As business evolves along multiple dimensions in a very complex world, the goal is to create pervasive and self-sustaining innovation across our organizations – innovation that permeates all levels, works for groups of all sizes and is always on.


Building Consensus: How Analytic Visualization Fosters Successful Innovation

Mike Cramer, Director, Operations Research for Worldwide Restaurant Innovation, McDonald’s

Successful innovation in large enterprise environments requires stakeholder consensus across organizational functions, hierarchies and learning behaviors. One key barrier to success is connecting diverse stakeholders in ways that are meaningful from their individual points of view.

Learn how McDonald’s Operations Research has applied a portfolio approach including analytic visualization to increase the success rate of reaching innovation consensus in one of the world’s largest retail enterprises.


Innovating the Future

Richard Antcliff, Director, Innovation Institute, NASA Langley Research Center

To be uncertain is to be uncomfortable, but to be certain is to be ridiculous.
~ Chinese Proverb 

This quote is one of a host of sayings that convince us that it is not only difficult to be certain about anything but, in fact, quite impossible. This mindset restricts our thinking about the future – it becomes unknowable. In fact, the future is quite predictable, based on the actions we have already taken. We need simply to look at the data and see what is already in motion. Will there be surprises? Of course. However, most of the future can be readily discovered. We will take a look at the future and see what is in store for us.

Day 1 Concurrent Speakers

Trading Performance for Profit 

Rob Reul, Founder & Managing Director, Isometric Solutions

Unless you work for a monopoly, your organization cannot help but benefit from a cultural orientation that strives to increase positive customer feedback. Six Sigma’s strongest link to business relevance and sustainability might be its ability to assure that your business makes products or delivers services that your customers actually want. 

When Six Sigma initiatives evolve to the Voice of the Customer (VOC) frontier, they mature through two key phases. The first phase is assuring that products are made “right” and that services are provided as intended. But stopping here won’t sustain an organization in a competitive environment; it misses the second phase, which is possibly the biggest Six Sigma impact to date: assessment of the most prized features and characteristics of a product or service. In this area, Six Sigma professionals can bring great value to their organizations by researching the ideal combination of attributes and articulating those benefits in economic terms that characterize marketplace preference.

This frontier draws upon a core Six Sigma methodology: design of experiments (DOE). The sustaining difference is that here DOE is deployed on marketplace preferences among competing alternatives. In the research discipline, these marketing experiments are known as conjoint analyses. Six Sigma professionals who master marketplace choice experimentation will increase their organizations’ competitiveness when introducing or optimizing product or service offerings.

In this session, we will introduce three methods of conjoint marketplace experimentation. Participants will learn how to construct a marketing choice experiment, when such an experiment is appropriate and why it should be used.
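
The construction of a simple choice experiment can be illustrated in a few lines of code. The sketch below is only a hypothetical example, assuming three made-up product attributes; it is not the method presented in the session.

    # A minimal sketch of building choice tasks for a marketplace (conjoint) experiment.
    # The attributes, levels and grouping scheme are illustrative assumptions only.
    import itertools
    import random

    attributes = {
        "brand":    ["A", "B", "C"],
        "price":    ["$99", "$129", "$159"],
        "warranty": ["1 yr", "3 yr"],
    }

    # Full factorial of candidate product profiles (3 x 3 x 2 = 18).
    profiles = [dict(zip(attributes, combo))
                for combo in itertools.product(*attributes.values())]

    # Group the profiles into choice tasks of three alternatives each; respondents
    # pick one alternative per task, and those choices feed the conjoint model.
    random.seed(1)
    random.shuffle(profiles)
    choice_tasks = [profiles[i:i + 3] for i in range(0, len(profiles) - 2, 3)]

    for task_id, task in enumerate(choice_tasks, start=1):
        print(f"Task {task_id}:")
        for alternative in task:
            print("   ", alternative)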


Solving Practical Problems from Earth to Mercury

Manuel Uy, Principal Professional Chemist, Johns Hopkins University Applied Physics Laboratory

The Johns Hopkins University Applied Physics Laboratory (APL) is a not-for-profit center for engineering research and development. Since its founding in 1942, APL has solved practical problems that present critical challenges to the nation. In this presentation, we will describe how we solved a problem with the MESSENGER spacecraft, a NASA-sponsored mission now on its way to the planet Mercury; the problem had threatened its timely launch from Cape Kennedy. We also will describe a practical testing protocol, conducted at Dugway Proving Ground in Utah, to evaluate a battery of ground-based sensors for the government. In both cases, we will describe how statistical planning and analysis overcame classic one-factor-at-a-time engineering resistance as well as other managerial constraints.


Statistics in Natural Language Processing

Emmanuel Roche, Vice President of Research and Development, Teragram

It’s an interesting paradox: although the amount of text available to run experiments has increased by several orders of magnitude in the past 15 years, the statistical methods remain surprisingly similar in scope to those used before the development of the Internet. This is not to say that the computational linguistics community remained idle during that period. Indeed, new techniques proved very successful in some instances. However, I think many statisticians who have followed these developments would agree that the results are nowhere near what we might have expected given the explosion in the availability of data. In this talk, we will explore why this is the case and might still be the case 10 years from now, when we will be able to play with 10 trillion documents rather than 10 billion. On a more hopeful note, this presentation also will highlight the value of statistics in natural language processing when the scope of the experiments is very precisely delimited by the general knowledge of syntax.


How Systematic Innovation Beats The Lone Genius: Breakthroughs in Sales, Marketing, Finance, Health Care

Scott Lasater, Director of Lean Six Sigma Enterprise Institute, TQM Network

Far too many people see innovation as something that occurs primarily in the design of products or manufacturing operations. They often fail to see how it applies to such areas as document, patient, or information flow in sales, administrative, financial or health care operations. In reality, all organizations are transactional and service operations – some also happen to have manufacturing attached.

Many people also mistakenly believe that creativity and innovation are the result of a sudden, fortuitous flash of brilliance in the mind of the gifted, visionary, creative genius. "Some have the gift, some don't," is a common misconception. Actually, markets and shareholders are not so impressed with those who generate only the one-time innovations that might occasionally come from such sporadic epiphanies. They tend to reward those organizations that know how to use a process to consistently and systematically produce innovative outcomes.

Come see the scientific approach to systematic innovation at work in transactional processes – generating enormous impact on internal and external customers, market share and the bottom line. Again and again, these methods have been proven to fill essential gaps in detective-work capabilities, making good detectives into great, innovative detectives, and good employees into great employees.
In the words of Box and Hunter, "With this democratization of the scientific method, many more people are being found with creative ability and unrealized aptitude for problem solving and discovery."

Come see what you've been missing!


Financial Analysis and Return on Investment Modeling

Tom Little, President, Thomas A. Little Consulting

This presentation lays out the essential components of a financial analysis and the detailed return on investment (ROI) determination that are critical for every improvement activity and for making valid, data-driven business decisions.

It covers the basics of a financial analysis, the types of data required, how soft-dollar activities such as cycle time can be modeled for cost and benefit determination, sensitivity modeling for cost justification, target and goal setting, and final ROI determination.
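
For readers unfamiliar with the arithmetic, a basic ROI calculation with a one-variable sensitivity sweep can be sketched as follows; the cost and benefit figures are hypothetical and not taken from the presentation.

    # A minimal sketch of ROI determination with a simple sensitivity sweep.
    # All dollar figures are illustrative assumptions.
    def roi(benefit, cost):
        """Return on investment expressed as a fraction of the cost."""
        return (benefit - cost) / cost

    project_cost = 250_000   # hypothetical cost of the improvement project ($)
    base_benefit = 400_000   # hypothetical annual benefit, including soft dollars ($)

    # Sensitivity: how does ROI change if the realized benefit is 60%-120% of plan?
    for factor in (0.6, 0.8, 1.0, 1.2):
        benefit = base_benefit * factor
        print(f"benefit ${benefit:>9,.0f}  ->  ROI {roi(benefit, project_cost):6.1%}")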


A Comprehensive Credit Assessment Framework (CCAF) – Merging the Best of Art & Science to Optimize Credit Granting

Clark Abrahams, Chief Financial Architect, SAS

Recent events in the mortgage market have raised questions about the way the lending process works.  This talk describes a framework (CCAF) that combines judgmental factors with analytical methods to achieve a more complete and accurate credit assessment.  CCAF ensures that all relevant considerations and business contexts are brought into play early on in the process so that lenders can quickly evaluate risk and render sound decisions. A dozen important features of any credit granting system are identified and the notion of an underwriting “gap” is explained.   CCAF effectively narrows the gap, which enables users to qualify more creditworthy consumers, while helping to avoid financial market crises. 

CCAF applies both to developing markets, where data may be scarce, and to mature markets, where certain data may be over-emphasized while other data is ignored. In the latter case, risk may be overstated using typical lending approaches, which increases the number of undeserved credit denials and instances where the borrower is overcharged. Conversely, risk may be understated, which increases the likelihood of loan default and loss for the consumer, lender, and any purchaser of the loan, including investors and other capital market participants.

Day 2 Plenary Speakers

When Function Trumps Fashion, Data Visualization Takes Center Stage

Stephen Few, Principal, Perceptual Edge

Data visualization is uniquely qualified to put business intelligence on the path to fulfilling its promise to produce innovative and intelligent organizations. It takes advantage of the tremendous but subtle power of visual perception to spot and make sense of patterns. Not to be confused with the latest fashions of gratuitous 3-D and superfluous decoration, real data visualization should open our eyes and minds to see and understand in new ways. That functionality will change the way we do business.


Probability Management: A Cure for the Flaw of Averages

Sam Savage, Consulting Professor, Stanford University

Consider a statistician who drowns while fording a river with an average depth of three feet. His error: replacing uncertainties by single average numbers. That’s the flaw of averages. It’s not pretty. However, it is pretty common. The flaw of averages affects business, government and military planning. But it doesn’t have to affect you. The answer: probability management.
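
The point can be made concrete with a few lines of simulation. The sketch below is a generic illustration of the flaw of averages, with made-up demand and cost figures, not an example from the talk.

    # A minimal sketch of the flaw of averages: for a nonlinear payoff, the profit
    # at average demand is not the average profit. All figures are illustrative.
    import random

    random.seed(0)
    stock = 100                    # units produced in advance
    unit_profit, unit_cost = 5, 3  # profit per unit sold, cost per unsold unit

    def profit(demand):
        sold = min(demand, stock)
        return unit_profit * sold - unit_cost * (stock - sold)

    demands = [random.randint(50, 150) for _ in range(100_000)]  # average demand ~100
    avg_demand = sum(demands) / len(demands)

    print("profit at the average demand:", profit(round(avg_demand)))      # about 500
    print("average profit under uncertainty:",
          round(sum(profit(d) for d in demands) / len(demands)))           # about 400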


Statistical Story-Telling Through Graphics

John Sall, Co-Founder and Executive Vice President, SAS

When we deliver statistical results, graphs should be at the heart of them. Graphs should be produced by default, not by request. Graphs should tell a story relating the forces and balances of the optimization that produced the estimates -- i.e., visualization is not so much what you see, but the picture of forces you imagine behind what you see, the conceptual model. Graphs should provide serendipity power, making you notice patterns and outliers that lead to discovery. Graphs should be multivariate, showing as many dimensions as are useful. Graphs should be adaptable to scale, turning black clouds into density contours when there are too many points or collapsing dimensions when needed. Graphs should be dynamic, responding to clicking, brushing, filtering, sliding, slicing, etc. Graphs should be ambitious, showing as much perspective on the data and model as is valuable.

Day 2 Concurrent Speakers

In the Wine Country – Changing Tradition

John Cunningham, Senior Manager, R&D and Process Improvement, G3 Enterprises

For thousands of years, the wine industry has flourished. Traditions developed over these many years are strong. G3 Enterprises serves this traditional industry in a number of ways.  The corporation’s focus is supplying wineries with quality closures, labels, process consulting and testing services.  Among its most crucial offerings are products designed to protect their customers’ wines. Not surprisingly, one critical aspect of protecting the wine is the package closure – be it natural cork, plastic stopper or screw cap.  An improperly designed or applied closure can turn a great wine into garbage in a relatively short time.

Although the industry is still heavily influenced by the art of wine production, increased competition is persuading decision makers to look more at the science. In order to serve their customers best, G3 has chosen a scientific approach to modeling, understanding and controlling their processes. By using design of experiments (DOE) to understand these processes, they determine which variables are vitally important for economically producing high-quality products. Introducing this approach was not easy, and it met several forms of resistance.

In this presentation, John Cunningham describes how he introduced DOE into the G3 corporate culture and how successful experiments led to product improvements. These improvements are helping transform the way the company embraces analytics and creates new products.


Exploratory Data Modeling: The First Step for Successful Data Mining

Dick De Veaux, Professor of Mathematics and Statistics, Williams College

The sheer volume and complexity of data available to most organizations has created an imposing barrier to its effective use. These challenges have propelled data mining to the forefront of making profitable and effective use of data.

Clearly, the success of data mining (as opposed to query-based methods like OLAP) is due to the predictive model that it produces. But with hundreds or even thousands of potential predictor variables, how does the analyst start building the model? In traditional statistics, we always start with exploratory data analysis (EDA) before proceeding to model building. You may have been taught the first three rules of good data analysis -- namely: 1) Make a picture, 2) Make a picture, 3) Make a picture. But with 500 variables, that would create thousands of bar plots, histograms and scatterplots, making the analyst feel a little like the sorcerer’s apprentice.

Instead, we’ll reverse the process and start with a preliminary model just to narrow the set of potential predictors. Modeling before exploring or exploratory data modeling (EDM) seems to be at odds with standard statistical practice, but in fact, it’s simply using models as a new exploratory tool. In this talk, I will show via several case studies how EDM works and how effective it can be as a data mining strategy.
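
One way to picture the EDM step is a quick screening model over a wide predictor set. The sketch below uses a random forest on synthetic data purely as an illustrative stand-in; it is not Professor De Veaux's specific workflow.

    # A minimal sketch of exploratory data modeling: fit a rough screening model to
    # hundreds of candidate predictors, then explore only the most important ones.
    # The synthetic data and the choice of model are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n, p = 1_000, 500                     # 1,000 rows, 500 candidate predictors
    X = rng.normal(size=(n, p))
    y = 3 * X[:, 7] - 2 * X[:, 42] + X[:, 123] + rng.normal(size=n)  # only 3 matter

    screen = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

    # Rank predictors by importance and keep a short list worth plotting in detail.
    top = np.argsort(screen.feature_importances_)[::-1][:10]
    print("candidate predictors to explore first:", top)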


Business Breakthrough Using Traditional Six Sigma

Strong S. Huang, Six Sigma Master Black Belt for Worldwide Six Sigma Program Implementation, Beckman Coulter Inc.

Traditional Six Sigma philosophy is a disciplined way of life that enhances creativity and innovation. The goal is to achieve excellence in Quality as defined by The ZOPEC Philosophy. With knowledge of the ten foundation areas of Six Sigma, its true power can deliver breakthrough business results across all industries and systematically help make a positive difference in the world.


Warning for the Next Earthquake: Rapid Data Analysis Before the Ground Shakes

Richard M. Allen, Associate Professor, Seismological Laboratory, UC Berkeley

The Berkeley Seismological Laboratory operates a network of geophysical instrumentation across northern California. Data from the network stream continuously to UC Berkeley, where they are processed to detect and analyze earthquake activity and to issue notifications. Using a breakthrough in our understanding of earthquake rupture, we are now developing and testing an earthquake warning system that can rapidly detect earthquakes as they initiate, predict the forthcoming ground shaking, and warn those in affected areas a few seconds ahead of the shaking. This requires reducing more than 1,000 data streams from across the state, each providing roughly 100 data values per second, to a few parameters, including earthquake time, location, magnitude and shaking distribution, all within a fraction of a second. Tracking uncertainties through the processing system is crucial to distinguishing false detections from real earthquake hazard. In this seminar we will discuss the challenges in real-time data analysis, the performance of the system in recent earthquakes, and the prospects for future earthquake warnings.
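
To give a flavor of the per-stream reduction involved, the sketch below implements a short-term/long-term average (STA/LTA) ratio trigger, a common single-station detection heuristic in seismology. It is offered only as an illustration and is not the Berkeley system's actual algorithm; the window lengths, threshold and synthetic signal are assumptions.

    # A minimal sketch of an STA/LTA trigger on one simulated data stream.
    # Window lengths, threshold and the synthetic signal are illustrative only.
    import random
    from collections import deque

    def sta_lta_trigger(samples, sta_len=100, lta_len=1000, threshold=4.0):
        """Yield sample indices where the short-term/long-term average ratio crosses the threshold."""
        sta, lta = deque(maxlen=sta_len), deque(maxlen=lta_len)
        for i, x in enumerate(samples):
            sta.append(abs(x))
            lta.append(abs(x))
            if len(lta) == lta_len:
                ratio = (sum(sta) / sta_len) / (sum(lta) / lta_len + 1e-12)
                if ratio > threshold:
                    yield i

    # Example: quiet background noise followed by a sudden, much stronger arrival.
    random.seed(0)
    stream = [random.gauss(0, 1) for _ in range(3000)] + [random.gauss(0, 20) for _ in range(500)]
    print("first trigger at sample:", next(sta_lta_trigger(stream), None))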

 