Discovery Summit 2011. Exploring data. Inspiring innovation. Denver, CO, September 13-16.

Abstracts

  • Breakout Presentations
  • Posters

Wednesday, Sept. 14, 1:30 p.m. – 2:30 p.m.

Colorado A & B

Definitive Screening Designs

Bradley Jones, PhD, JMP Principal Research Fellow – SAS
Chris Nachtsheim, PhD, Frank A. Donaldson Chair in Operations Management, Carlson School of Management – University of Minnesota

Screening designs are attractive for assessing the relative impact of a large number of factors on a response of interest. Experimenters often prefer quantitative factors with three levels over two-level factors because having three levels allows for some assessment of curvature in the factor-response relationship. Yet the most familiar screening designs limit each factor to only two levels. We introduce a new class of designs that have three levels, provide estimates of main effects that are unbiased by any second-order effect, require only one more than twice as many runs as there are factors, and avoid confounding of any pair of second-order effects. Moreover, for designs having six factors or more, our designs allow for the efficient estimation of the full quadratic model in any three factors. In this respect, our designs may render follow-up experiments unnecessary in many situations, thereby increasing the efficiency of the entire experimentation process.
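A sketch of the construction makes the run count concrete. Following the authors' published description, for m factors the design uses

    \[ n = 2m + 1 \]

runs: m mirror-image pairs \( \mathbf{x}_{2i-1} = -\mathbf{x}_{2i} \), in which factor i is set to its middle level (0) while every other factor is at \( \pm 1 \), plus one overall center run \( \mathbf{x}_{2m+1} = \mathbf{0} \). The fold-over pairing is what keeps the main-effect estimates unbiased by second-order effects.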

Colorado C & D

Meta-Modeling of Computational Models – Challenges and Opportunities

Cy Wegman, Section Head; Bill Worley, MS, Technology Leader – Procter & Gamble

Complex computer simulations are critical to the development of predictive models for many products and processes today. These models have their own challenges: they can become computationally expensive and time-consuming for expert users as the number of simulation “experiments” grows in a one-at-a-time simulation environment. Design of experiments and meta-modeling are being used to find holistic solutions to these problems. One would think that, given the deterministic nature of computer simulations, creating meta-models would be as easy as falling off a log. Our experience is that it’s more like being hit with a log. Finding a model that best approximates the holistic behavior of the system has been very difficult, and conventional wisdom may not lead to the best overall meta-model. For example, a Gaussian Process model that fits all the computational data points exactly will give a “good” meta-model. However, close examination of the jackknife residuals versus actual plots shows that the residuals look more like the noise seen in stochastic experiments. So, in a case like this, a neural net or even a response surface approach may lead to a better overall approximation model than the Gaussian Process approach. This talk will give an overview of the challenges and opportunities that are a part of meta-modeling.
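As a hedged illustration (not the presenters' script; table and column names are hypothetical), both candidate model types can be launched from JSL on the same simulator output:

    // Fit a Gaussian process emulator to the deterministic simulator runs,
    // then a neural net on the same inputs for comparison.
    dt = Current Data Table();
    gp = dt << Gaussian Process( Y( :response ), X( :x1, :x2, :x3 ) );
    nn = dt << Neural( Y( :response ), X( :x1, :x2, :x3 ) );

Comparing jackknife (leave-one-out) residual plots across the candidates is then the judgment call the abstract describes.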

Colorado G & H

Let JMP® Work for You – Analyzing Complex Process Data with the Help of JMP Scripting Language

Katharina Lankers, PhD, Research and Development Scientist – SCHOTT AG, Germany

SCHOTT is an international technology company that manufactures and processes special glasses and other specialty materials, components and systems. During different process steps with many influencing variables, huge amounts of process data are gathered and can be used for process optimization – if engineers and operators are able to draw the right conclusions within a reasonable period of time. JMP Scripting Language (JSL) is of great help here. Data access, analysis and visualization can be automated according to specific requirements, customized user interfaces help manage these procedures quickly and easily, and well-arranged output reports allow for timely and appropriate interpretation and decision making. We show examples of JMP scripts that are of great benefit in troubleshooting, optimizing and stabilizing a manufacturing process. With a few mouse clicks, all relevant process parameters are extracted from a database, preprocessed and analyzed, and the results are presented in a customized summary report saved as a useful basis for further discussions. In another application, we have developed a tool to help us find – from a huge pool of potential influences – which process parameters might be relevant for quality changes. For this purpose, we use a combination of data mining methods like partitioning, clustering and modeling. Thanks to these JMP scripts, our engineers save lots of time every day and enjoy optimal support in data analysis – a substantial contribution to the ongoing improvement of quality and yield.
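A minimal sketch of this kind of automation, assuming an ODBC source and hypothetical table, column and file names (not SCHOTT's actual script):

    // Pull process data from a database, chart a key parameter and
    // save the chart for the customized summary report.
    dt = Open Database( "DSN=ProcessDB", "SELECT * FROM furnace_log", "furnace_log" );
    cc = dt << Control Chart(
        Sample Size( 1 ),
        Chart Col( :temperature, Individual Measurement, Moving Range )
    );
    Report( cc ) << Save Picture( "temperature_chart.png", "png" );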

Colorado I & J (30-minute talk)

Image Isn't Everything, but It Sure Is Something in JMP® 9

John Ponte, JMP Software Developer – SAS

In the 1990s, the phrase “Image is everything!” became famous. With the release of JMP 9, images aren’t “everything,” but they are available in JMP. The new image functionality allows images to be used to improve visualizations, add context around data, and help you to extract data and information that were previously inaccessible in JMP. I’ll show you how to bring images into JMP and access the image-processing functionality through the user interface. I’ll also share some real-life examples of images and show how JMP Scripting Language can be used to extract the data for analysis. Come learn how JMP can improve your image and help you get to your data.
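A rough sketch of the JSL side of this workflow; the message names here are assumptions rather than confirmed API, so treat it as illustrative only:

    // Load an image from disk, pull pixel values into a matrix for
    // analysis, and display the image in a window (path hypothetical).
    img = New Image( "C:/photos/sample.jpg" );
    pix = img << Get Pixels;          // assumed message returning pixel data
    New Window( "Inspect", img );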

Colorado I & J (30-minute talk)

Using JMP® to Develop Glass Formulations for High-level Nuclear Waste Conditioning: Data Visualization, Statistical Analysis and Predictive Models

D. Perret, PhD, Research and Development Scientist – Atomic Energy Commission, France

Long-term storage of radioactive waste requires its stabilization into a form that will neither react nor change for extended periods of time. Glass formulation for the vitrification of high-level nuclear waste elements has been under investigation at the French Alternative Energies and Atomic Energy Commission (CEA) for many years. Besides the complexity of its formulation, nuclear glass also needs to meet requirements specific to the industrial vitrification process. Viscosity, density, electrical and thermal conductivities and, of course, long-term durability of the glass are properties that have to be perfectly understood and controlled. As a consequence, we continuously have to deal with large amounts of data, including formulation data (glass compositions), physical and chemical properties, and data related to the vitrification process. JMP has recently been adopted in our R&D environment by the teams who develop nuclear glass formulations. The graphical platform provides very useful and easy-to-use tools, such as the Scatterplot Matrix and Mixture Profiler platforms, that enable visualization and analysis of large amounts of formulation data. Beyond these convenient features, the main reason for using JMP is its powerful statistical analysis platform, which facilitates the comparison of glass composition domains with a high degree of complexity; for this purpose, the PCA and Cluster platforms are very relevant. Our studies also focus on building property-to-composition predictive models, a task carried out efficiently with the Fit Model and Stepwise platforms. Finally, considering the complexity of the glass formulation in question, we necessarily have to use a design of experiments (DOE) approach. Although the Mixture Design platform does not exactly meet all of our specifications, JMP does provide the statistical information (PRESS, hat values, leverage plots) needed to analyze the results of our designed experiments.
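As a hedged sketch of the property-to-composition modeling step (the oxide and property names are placeholders, not CEA data):

    // Stepwise selection of composition terms for a glass property.
    fm = dt << Fit Model(
        Y( :viscosity ),
        Effects( :SiO2, :B2O3, :Na2O, :Al2O3 ),
        Personality( "Stepwise" ),
        Run
    );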

Wednesday, Sept. 14, 2:45 p.m. – 3:45 p.m.

Colorado A & B

Profiling the Probabilities of Preference

Bradley Jones, PhD, JMP Principal Research Fellow – SAS
Rob Reul, MS, Managing Director – Isometric Solutions LLC

Discrete choice experiments are useful in marketing to define a product or service that has the highest perceived utility for target customers. Credit cards are a service that banks want their customers to accept and use. To do this, they may offer various incentives to their customers. Finding the most powerful incentives was the goal of a choice experiment performed for a banking client. This study evaluated 8,232 different credit card incentives to find the one with the most appeal. The choice experiment was performed using a Web-based survey of more than 1,000 prospective customers. Each respondent answered a number of demographic questions followed by a six-minute choice experiment that presented eight successive choice sets of three offers. In each set, participants picked the best offer. An aggregation of results from the 100 hours of discrete choice experimentation was used to construct a choice model. The fitted model for the respondents as a group predicted that the best offer would be preferred to the worst offer by 16 out of 17 people. Compared to the current offering, the best offer would be preferred by eight of nine people. Two of the 37 demographic variables collected were effective for segmenting the respondents into roughly equal groups. In the target group, the previously quoted odds ratio of 16:1 improved to 48:1. Using just these two subject variables, the bank knew exactly how to target the market for its new card!
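The quoted preferences follow directly from the fitted choice model. Under the standard multinomial logit formulation, the probability that offer j is chosen from a choice set S is

    \[ P(j \mid S) = \frac{\exp(\mathbf{x}_j'\boldsymbol{\beta})}{\sum_{k \in S} \exp(\mathbf{x}_k'\boldsymbol{\beta})}, \]

so a head-to-head preference probability of 16/17 (about 0.94) for the best offer over the worst is exactly the 16:1 odds quoted above.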

Colorado C & D

Discovering Partial Least Squares with JMP® 

Ian Cox, PhD, Solutions Manager for JMP Sales and Marketing – SAS
Marie Gaudard, PhD, Consultant – North Haven Group

This talk provides an introduction to partial least squares (PLS), an approach that can be used to find relationships between sets of variables, typically a set of predictors and a set of responses. PLS exploits the correlations within and between the sets of variables to reveal underlying or latent structures and, unlike regression, PLS can be used when the predictors outnumber the observations. Although a favorite with chemometricians, PLS is a versatile technique that can find application in many areas. We illustrate PLS using an example from computational chemistry (Umetrics, 1995). In this example, the goal is to measure the biological activity of a compound indirectly, using relatively inexpensive chemical measurements. The data set contains 33 observations and 15 predictors.
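A minimal sketch of the launch for data shaped like this example (column names hypothetical):

    // PLS with one response and many correlated predictors; unlike ordinary
    // regression, this works even when predictors outnumber observations.
    pls = dt << Partial Least Squares(
        Y( :activity ),
        X( :m1, :m2, :m3 )  // ... through :m15 in the example data
    );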

Colorado G & H

Particle Counts Mislead Scientists: A JMP® and SAS® Integration Case Study Investigating Modeling

Kathleen Kiernan, MS, Technical Support Statistician; Diane Michelson, PhD, Statistical Trainer; Annie Zangi, MS, Research Statistician – SAS

This paper will address the consequences of modeling choices in real-world applications and will demonstrate methods and techniques for modeling non-normal data using the integrated capabilities of JMP and SAS. Most commonly used statistical tests and modeling techniques are based on certain assumptions. What if the data we are analyzing are non-normal or heteroscedastic? JMP offers some methods and techniques for modeling non-normal or heteroscedastic data, while SAS offers a full range of procedures that can be used. The Box-Cox power transformation is often used to transform non-normal or heteroscedastic data. Another approach is to model a non-normal distribution directly. This paper will present two semiconductor case studies. During semiconductor processing, wafers are kept as clean as possible, because particles in the active area of a circuit can ultimately cause a defective chip. The first example explores a case where scientists tried to reduce particle counts occurring on wafers and found the model less effective at identifying the source of the problem than originally thought. Semiconductor chips are fabricated on large wafers, each of which can contain hundreds of chips, and wafers are processed in batches. In the second example, we will investigate misleading results that can occur when trying to find the important factors in the variance components of the process in a traditional linear models setting. We will present solutions for both of these examples using JMP and SAS integration, which can be applied to other scenarios as well.
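For reference, the Box-Cox family mentioned above transforms a positive response y as

    \[
    y^{(\lambda)} =
    \begin{cases}
    \dfrac{y^{\lambda} - 1}{\lambda}, & \lambda \neq 0, \\
    \log y, & \lambda = 0,
    \end{cases}
    \]

with \( \lambda \) typically chosen over a grid to make the residuals as close as possible to normal with constant variance.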

Colorado I & J (30-minute talk)

How We Used JMP® to Find the Needle in the Haystack (or “How We Caught Our Black Swan”)

Greg Flexman, Principal Process Development Engineer; Jyoti Srivastava, Senior Process Development Engineer – Talecris Biotherapeutics

The Talecris Process and Risk Analysis group uses JMP daily for data analysis to support the manufacturing of life-saving protein therapies. Recently, the analytic capabilities of JMP were crucial in addressing a manufacturing issue that was profoundly affecting the business, inspiring the largest troubleshooting investigation in the company’s history. The search turned out to be a hunt for the proverbial needle in a haystack, at extraordinarily long odds. This case study will describe how JMP analysis (data transformation, linking data from disparate sources, table operations, scripting and regression) led to the discovery of a root cause that had been eluding teams of scientists, biochemists, engineers, and various internal and external experts. Step 1 in finding the needle was selecting the proper “field” in which to search; the exercise started by evaluating potential causes in the supply chain. This presentation illustrates how the issue was identified using a combination of logic supported by data analysis in JMP. The discovery required the use of a “swim upstream” supply chain analysis across two combination (blending) steps. Step 2 was picking out the right “needle” and confirming there wasn’t more than one. The culmination of the analysis involved a simple regression executed on a grand scale (scripted iteration of thousands of linear fits). This completed the extension of the supply chain “swim upstream” analysis across three combination (blending) steps and identified a root cause that was literally a one-in-a-million scenario.
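A hedged sketch of what “scripted iteration of thousands of linear fits” can look like in JSL (lot and column names are hypothetical, not Talecris data):

    // Fit one simple regression per upstream lot, then close each report.
    Summarize( lots = By( :lot_id ) );          // unique lot IDs, current table
    For( i = 1, i <= N Items( lots ), i++,
        biv = dt << Bivariate(
            Y( :impurity ), X( :upstream_attribute ),
            Where( :lot_id == lots[i] ),
            Fit Line, Invisible
        );
        // harvest the slope and R-square from Report( biv ) here
        biv << Close Window;
    );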

Colorado I & J (30-minute talk)

A System for Analyzing Drug Safety Data Using JMP®

Peter Mroz, MS, Manager and Statistical Programmer – Johnson & Johnson

A system has been developed that uses JMP 9 to analyze drug safety data. Using enhanced and easy-to-understand data manipulation and visualization techniques, the system allows medically trained personnel to explore safety data and identify adverse events of interest for a given drug. Several new features in JMP 9 provided significant advantages, including add-ins for application distribution, cell colorization and separate windows for each object. The advantages of using JMP over a previous system include computational speed, user-friendly interfaces and interactivity, for example, linking between objects, drilling down to underlying data and data filtering. This talk will provide a general overview of the system, followed by some of its interesting technical details. These details will include how to make a table cell act like a hyperlink, how to link two tables that are not normally linked, and how to visualize many different adverse events over time with a heat map grid chart.
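One of those details, linking two tables that are not normally linked, can be sketched in JSL roughly as follows (table and column names hypothetical):

    // Join adverse-event case reports to drug records on a shared key.
    joined = cases << Join(
        With( drugs ),
        By Matching Columns( :case_id = :case_id ),
        Drop Multiples( 0, 0 )   // keep every match on both sides
    );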

Wednesday, Sept. 14, 4:00 p.m. – 5:00 p.m.

Colorado A & B

Setting Robust Process Specifications Using Design of Experiments and Monte Carlo Techniques

William D. Kappele, President – Objective DOE

Setting specifications on process parameters can be a difficult job. Setting robust specifications that won't need constant adjustment may seem almost impossible. This presentation will show how a combination of design of experiments and Monte Carlo techniques can be used to set robust process specifications in a straightforward, objective manner. In an example using published data for a rubber manufacturing process, you will see stages of the process, from designing an experiment for optimizing the process specifications to the use of a Monte Carlo simulator to study the responses to specification changes. You will see these results used to optimize the process specifications. You won't need to delve into the math – JMP will handle that. You will leave this presentation with a new paradigm for setting robust process parameter specifications.
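The core idea can be sketched in a few lines of JSL, assuming a fitted transfer function and illustrative spec-window numbers (all values hypothetical):

    // Propagate assumed input distributions through the fitted model
    // and estimate the fraction of out-of-spec responses.
    nsim = 10000;
    bad = 0;
    For( i = 1, i <= nsim, i++,
        x1 = Random Normal( 150, 2 );    // e.g., cure temperature
        x2 = Random Normal( 40, 0.5 );   // e.g., additive level
        y = 12 + 0.8 * x1 - 1.5 * x2 + Random Normal( 0, 1 );
        If( y < 100 | y > 140, bad++ );
    );
    Show( bad / nsim );                  // estimated out-of-spec rate

JMP's Prediction Profiler offers a built-in Simulator for the same purpose; the loop above just makes the mechanics explicit.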

Colorado C & D

Converting Data into Information – The Application of Multivariate Statistical Techniques in Agrochemicals Production

Alan Brown, PhD, Principal Technical Expert – Syngenta

Syngenta is a world-leading agribusiness providing crop protection products (herbicides, insecticides and fungicides), plants and seeds for farmers and growers. Manufacture of the chemical active ingredients is undertaken at a number of sites using a combination of batch and continuous processing. A characteristic of these sites is the production of large amounts of data. This presentation illustrates the use of the modeling and multivariate capabilities within JMP Pro to turn such data into information that enables improved process understanding. Three examples of increasing complexity are presented. The first utilizes principal components analysis (PCA) of six variables to investigate the reason for performance differences between two production units. The second considers 34 variables (for example, flows, temperatures, pressures, etc.) from a continuous process and the use of classification techniques, such as recursive partitioning, boosted trees, PLS-DA and neural networks, to predict the polymorphic form of a product. The same set of input variables is used to model the amount of a group of related chemical impurities using PLS and neural networks. The third example uses PCA/PLS and recursive partitioning to examine the differences between three parallel production streams. Data on 14 process variables were collected for 30 batches, 10 for each stream. Approximately 400 observations, at one-minute time intervals, were captured per variable per batch.
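As a one-line hedged sketch of the classification step (column names hypothetical), the recursive-partitioning launch is simply:

    // Predict the polymorphic form from process measurements.
    prt = dt << Partition( Y( :polymorph ), X( :flow1, :temp3, :pressure2 ) );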

Colorado G & H

Discovering Information in Functions Using JMP® and R

Kevin Anderson; Theresa Utlaut, PhD

Analysts frequently fit curves to data in an attempt to produce information. Regardless of the domain that generates the data, whether time, frequency or others, the analyst’s quest is the same: to fit a function to the data and create useful information to make decisions. Functional data analysis is a set of techniques for analyzing those functions fit to data. These relatively new techniques can be useful for examining sensor, economic, environmental and other types of data generated in serial domains. The fitted functions can themselves be analyzed to discover patterns in characteristics such as their velocity and acceleration, which may allow previously unappreciated details to be discovered, utilized and admired. This presentation will explore some aspects of functional data analysis in JMP 9. It will discuss some theory, and demonstrate some implementations in JMP 9 and JMP Scripting Language (while interfacing with R software) to analyze functional data.
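The JMP-to-R handoff mentioned above can be sketched as follows; the fda call is illustrative, not the presenters' code:

    // Send the data to R, build a B-spline basis with the fda package,
    // and pull a result back into JSL.
    R Init();
    R Send( dt );
    R Submit( "library(fda); bs <- create.bspline.basis(c(0, 100), nbasis = 20); nb <- bs$nbasis" );
    nb = R Get( nb );
    R Term();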

Colorado I & J (30-minute talk)

Uncover Associate Employees’ Tenure Pattern Using Survival Analysis in JMP®

Libo Yu, Senior Manager of HR Strategic Analytics and Budgeting – Walmart

The world’s largest private employer, Walmart runs 4,000 US stores employing 1.2 million store associates. Terminations and hires happen every day, yet there is no formal tool for tracking these events, because attempting to predict the right volume of hires or terminations with conventional methods is “mission impossible.” This paper shows how to leverage survival analysis and develop tenure models to project the termination likelihood for store associates. Furthermore, it extends the model to proactively project the size of terminations, comparing the forecast with actual results to determine projection accuracy. The forecast forms another reference framework for store operations. The paper also examines competing risks to uncover the underlying factors that drive terminations. In addition to highlighting the benefits of using survival analysis in JMP for operations, this paper reviews the limits and conditions of using this technique, as well as the potential pitfalls. Techniques such as life distribution, survival curves and parameter estimation will be discussed. The project moved through multiple phases – prototype, testing, pilot program and final rollout – making it a very good case study for applying survival analysis to operations research in the retail industry.
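A minimal sketch of the survival launch for tenure data (column names hypothetical; the censor column flags associates still employed at the study cutoff):

    // Nonparametric tenure curves by group, plus a distribution fit.
    surv = dt << Survival(
        Y( :tenure_weeks ),
        Censor( :still_employed ),
        Grouping( :store_format )
    );
    ld = dt << Life Distribution( Y( :tenure_weeks ), Censor( :still_employed ) );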

Colorado I & J (30-minute talk)

Discriminant Analysis of High School Student Mathematics Class Placement

Simon King, Teacher – Cary Academy/Texas A&M University

A crucial decision that affects the academic future of many middle and high school students is the level of mathematics class in which they are placed. This placement effectively determines whether they can take high school calculus or statistics as seniors. School mathematics departments place students into math classes on the basis of teacher recommendation. The purpose of this study was to explore high school student mathematics class placement through discriminant analysis. Standardized testing scores and class grades were used to assess teacher placements of students into 7th grade and 9th grade mathematics classes (Algebra I, Geometry, Geometry Honors and Algebra II Honors). A logistic discriminant analysis was used. Data exploration visually demonstrated a large number of students who “overlapped” two potential class placements. Exploring individual students’ logit probabilities and visual representations of the data made it clear that many decisions are marginal. While standardized testing helps paint a useful picture of a student and his or her potential future academic success, it is unlikely to paint a complete picture. The missing piece is the teacher’s knowledge of the recommended student, as well as other variables that affect that student’s success and mathematics placement; these extra variables include, but are not limited to, work ethic, commitment to learning and communication skills. Additionally, JMP visuals were used to present information to parents, teachers and school administrators with limited statistical backgrounds.
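A hedged sketch of the placement model (column names hypothetical):

    // Logistic discriminant: predict class placement from scores and
    // grades, then save each student's fitted placement probabilities.
    fit = dt << Fit Model(
        Y( :placement ),                 // Algebra I, Geometry, ... (nominal)
        Effects( :standardized_score, :math_grade ),
        Personality( "Nominal Logistic" ),
        Run
    );
    fit << Save Probability Formula;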

Thursday, Sept. 15, 10:30 a.m. – 11:30 a.m.

Colorado A & B

Analyses of El Niño and La Niña Events and Their Effects on Local Climate Using JMP® 

Rich Giannola, PhD, Atmospheric Scientist – Johns Hopkins University Applied Physics Lab

El Niño and La Niña (EN/LN) are quasi-periodic ocean warming and cooling events, respectively, that originate in the equatorial Pacific Ocean. Along with associated changes in atmospheric pressure and wind patterns, these phenomena influence the general circulation of the atmosphere and affect short-term regional climate. Meteorologists characterize EN/LN events through various physical indices that are used for correlations with observed climate conditions such as temperature and precipitation. JMP is ideally suited for visualization and analyses of these data. First, indices for EN/LN events are available for each month since 1950. Monthly averaged data from US weather stations are also available for the same 60-year period. Thus, there is a large data sample available for statistical analyses. Second, several JMP platforms – including Distribution, Time Series and Bubble Plot – can be used to process and visualize the data. In this paper, we propose to investigate the EN/LN phenomena interactively using JMP. We shall use historical EN/LN indices, temperature and precipitation data from selected US cities, and climatic normals (means) from the National Oceanic and Atmospheric Administration. The temporal behavior of EN/LN events and their effect on local climate will be presented. Comparisons will be made between the results determined from JMP and known climatic impacts of these phenomena. Increased familiarity with JMP processes and an understanding of the EN/LN phenomena are anticipated takeaways for JMP users.
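Of the platforms mentioned, the animated Bubble Plot launch can be sketched as follows (column names hypothetical):

    // Animate station-level climate anomalies through time.
    bp = dt << Bubble Plot(
        X( :longitude ), Y( :latitude ),
        Sizes( :precip_anomaly ),
        Time( :month ),
        ID( :station )
    );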

Colorado C & D

JMP® Degradation

Christopher Gotwalt, PhD, Director of JMP Software Development – SAS
José G. Ramírez, PhD, Industrial Statistician – Amgen

When we use the word “degradation” in the presentation title, we’re not talking about the age of JMP; it’s only 21 years old. Nor are we insinuating that JMP is falling apart. Actually, we are referring to the software’s capabilities for the design and analysis of degradation studies. Product reliability studies are usually based on product failures. In some situations, however, greater insights can be gained by studying how a material or product degrades over time. In this talk we will use data from a degradation study on a dielectric to show how the JMP Degradation platform can be used to model degradation data and to generate pseudo-failure data that can be analyzed with the Fit Life by X platform. We will also show how the results of the pseudo-failure analysis can be used as inputs for the Accelerated Life Test Design platform to design a follow-up experiment. If time permits, we will discuss using PROC NLMIXED in SAS to fit a nonlinear model for the degradation data.
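The pseudo-failure idea is simple to state. If a unit's fitted degradation path is, say, linear,

    \[ \hat{y}(t) = b_0 + b_1 t, \]

and failure is defined as crossing a threshold \( D_f \), the unit's pseudo-failure time is

    \[ \hat{t}_f = \frac{D_f - b_0}{b_1}. \]

The collection of pseudo-failure times is what the Fit Life by X platform then treats as ordinary failure data.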

Colorado G & H

Invasive Pest Safeguarding Using JMP® to Profile Sample Design, Expert Opinion and Pest Movement

Ned Jones, MS, Statistician

Planning programs to eradicate exotic pests that have entered the country requires the ability to predict pest movement. Developing boundaries around new interceptions is critical. JMP 9 provides map-based visualizations of these interceptions. JMP models annual movements by looking at the yearly differences in interception locations. Boundaries can be developed around new interceptions by extending the results to the Profiler and Simulator.

Colorado I & J (30-minute talk)

Choosing Between Several Designed Experiments Based on Multiple Design Performance Criteria

Christine Anderson-Cook, PhD, Scientist – Los Alamos National Laboratory

When selecting a best design for an experiment, it is beneficial to consider what the specific objectives of the experiment are and to use matching criteria to quantify differences between the potential design choices. JMP provides multiple options for creating designs to be compared and a rich set of design summaries that allow an experimenter to understand the trade-offs between candidate designs to make a judicious choice appropriate for that experiment. The talk demonstrates the process of constructing and evaluating designs for a specific screening design example and shows how this process could be generalized to other settings.
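One summary commonly used in such comparisons is D-efficiency. For a design with model matrix X, n runs and p model parameters (factors coded to \([-1, 1]\)),

    \[ D\text{-efficiency} = 100 \cdot \frac{|X'X|^{1/p}}{n}, \]

though the talk's point is precisely that no single number of this kind should decide the choice on its own.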

Colorado I & J (30-minute talk)

Optimization of Therapeutic Protein Production in an Aquatic Plant Expression System Using DOE

Lynn Dickey, PhD, Vice President of Research and Technology Development; Vincent Wingate, Associate Director of Upstream Process Development and Plant Biology – Biolex Therapeutics

Biolex Therapeutics uses a novel biomanufacturing platform (the LEX System℠) based on the aquatic higher plant Lemna minor, genetically engineered to produce recombinant biopharmaceutical proteins. We use a small-scale process model to optimize critical factors to maximize the yield and quality of recombinant biopharmaceutical proteins. Using the design of experiments (DOE) capabilities in JMP, we will discuss what information is required prior to setting up a DOE, how to set up a custom DOE, and how to analyze and visualize the data. We will demonstrate the JMP DOE platform’s powerful design, analytical and graphic capabilities, which have allowed us to optimize critical parameters and identify critical parameter interactions and optimal ranges (both for environmental and media components) that affect the yield and quality of several different therapeutic proteins. A designed experiment will be presented that optimizes carbon dioxide levels, nitrogen concentration and environmental factors to maximize the yield of a therapeutic protein. It is well known that two of the major nutritional components required for increasing plant biomass are nitrogen and carbon dioxide. It is not yet fully defined in the LEX System℠ what combination of nitrogen, carbon dioxide and environmental factors (e.g., light intensity and temperature) is required for maximal expression of a recombinant protein. Using our small-scale process model, we will show how we have used custom DOE experiments with these critical parameters to optimize the expression of a vaccine antigen recombinant protein.

Thursday, Sept. 15, 1:30 p.m. – 2:30 p.m.

Colorado A & B

Building Effective Response Models for Direct Mail Using JMP® 

Tim Troutman, Manager of Mass Marketing – Charlotte Rescue Mission

Charlotte Rescue Mission, a residential recovery program for homeless men and women, uses JMP to increase response rates for direct-mail campaigns, which constitute a primary means of fundraising. Our donations come from a large number of donors in small amounts. This contrasts with other non-profits that receive government or United Way funding and consequently receive and process fewer, but larger, donations. The result is that Charlotte Rescue Mission, and organizations like it, have higher fundraising costs. We use JMP to build predictive response models and assign scores to potential donors. Using this information, only those with a high likelihood of giving to a specific campaign are solicited for donations. This reduces fundraising costs and allows more of the money raised to be used for our programs. Additionally, JMP offers us data visualization tools that, for example, let us track donation trends by geographic area. Data exploration tools in JMP have enabled Charlotte Rescue Mission to better classify donors according to their giving potential, enhancing the organization’s ability to allocate development resources in a way that minimizes overhead costs. This presentation will explore data mining capabilities available in JMP, especially the Partition platform and logistic regression models, and will walk through the design and execution of a predictive response model. It will also show examples of other areas where the JMP toolset has improved our analytics, for instance, better understanding donor behavior through geomapping and data exploration. Any industry that could benefit from data-driven predictions and customer/donor classification will benefit from this presentation.
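A hedged sketch of the scoring workflow (column names hypothetical, not the Mission's data):

    // Grow a partition model for response and save the scoring formula
    // so every prospect gets a predicted probability of giving.
    prt = dt << Partition(
        Y( :responded ),
        X( :recency, :frequency, :avg_gift, :zip_region )
    );
    prt << Split Best;                // repeat to grow the tree
    prt << Save Prediction Formula;   // writes the score into the table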

Colorado C & D

JMP® Applications in Photovoltaic Reliability

D.C. Jordan, PhD, Senior Reliability Engineer – National Renewable Energy Laboratory
Chris Gotwalt, PhD, Director of Statistical Research and Development – SAS

The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two significant cost drivers are the efficiency with which sunlight is converted into power and how that efficiency changes over time. Accurate knowledge of the power decline over time, also known as the degradation rate, is essential to all stakeholders – utility companies, integrators, investors and scientists alike. Outdoor testing plays an important part in quantifying degradation rates of different technologies in various climates. Due to seasonal changes, however, several complete cycles (typically three to five years) traditionally need to be completed to obtain reasonably accurate degradation rates. In a rapidly evolving industry, such a long time span is often unacceptable. Advanced time series modeling – such as Autoregressive Integrated Moving Average (ARIMA) modeling – can be used to decrease the required time span, with accuracy comparable to non-linear modeling. In addition, we will demonstrate how the JMP 9 map feature was used to reveal important technological trends by climate.
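For reference, an ARIMA(1,1,1) model of the kind mentioned above can be written with the backshift operator B as

    \[ (1 - \phi_1 B)(1 - B)\, y_t = (1 - \theta_1 B)\, \varepsilon_t, \]

with seasonal extensions adding matching factors in \( B^{12} \) for monthly performance data.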

Colorado G & H

Combining On-Field Performance Analysis with Expert Ratings for a Data-Driven Decision Process in Professional Baseball

Sig Mejdal, MS, Director of Amateur Draft Analytics – St. Louis Cardinals

The use of analytics to measure, evaluate and predict both the performance and value of athletes has become more prevalent in baseball since the publication of Michael Lewis’ acclaimed book, Moneyball, in 2003. While change is occurring in this tradition-steeped sport, using analytics along with scouting evaluations is not standard practice. One challenge is that decision makers are often skeptical, and rightfully so, of analytical methods – especially ones that appear to be a “black box.” The philosophy at the St. Louis Cardinals is to leverage both the scouts’ expert ratings and cutting-edge quantitative analysis of data from many sources so that the two methods complement each other to provide maximum predictive ability. This paper will describe generally how the St. Louis Cardinals use JMP and other analytics to promote a data-driven decision process and address decision makers’ concerns about statistical information. The use of data visualization, in combination with compelling storytelling, can help direct decision makers to optimal baseball decisions. The paper will focus on the use of analytics to evaluate amateur players seeking to enter the major leagues. It will provide an update on this work since my 2009 keynote speech at the JMP Innovators’ Summit in Chicago.

Colorado I & J (30-minute talk)

Beyond JMP® Journals: Using JMP as Interactive and Dynamic Presentation Software for Real-Time Evidence-Based Decision-Making

John Salmon, MS, Research Engineer – Aerospace Systems Design Laboratory

JMP “slide-interfaces” offer capability not found in other presentation software, such as suppression of analyses for different types of audiences, live zooming or filtering actions for multi-scope investigations, and on-demand adjustment of assumptions, constraints and presented examples. These JMP features respond to studies suggesting that static slides, or slides with minimal interaction, may not always be the most effective way to present and communicate information essential for decision making. Because decision making is inherently a process of defining future actions, uncertainty is always present. Thus, decision making often requires testing key assumptions or hypotheses under different scenarios in order to gain insight into the expected outcomes of different decisions. Traditional slides with static graphs do not allow manipulation of assumptions; as a result, decisions are often made on a subset of all possible scenarios and on potentially limited data not shared with the decision maker. Using JMP as both presentation software and data analysis tool concurrently with “slide-interfaces” can significantly improve decision making by immersing participants in the data during the presentation – thereby allowing real-time trade-offs, data mining, “what-if” games and decision space exploration. This paper summarizes some initial findings and applications of JMP as presentation software developed at the Aerospace Systems Design Laboratory (ASDL) at the Georgia Institute of Technology.
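A minimal sketch of such a “slide-interface”, assuming hypothetical column names:

    // One "slide": narrative text above a live graph that the audience
    // can filter and rescale during the discussion.
    nw = New Window( "Slide 3: Trade Study",
        V List Box(
            Text Box( "Cost vs. performance under baseline assumptions" ),
            dt << Graph Builder(
                Variables( X( :cost ), Y( :performance ) ),
                Elements( Points( X, Y ) )
            )
        )
    );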

Colorado I & J (30-minute talk)

Menu Customization and Add-Ins in JMP® 9

Eric Hill, MS, Software Developer – SAS

With JMP 9, the development team completely overhauled menu and toolbar customization and created a new concept called a JMP add-in. JMP add-ins create an unprecedented opportunity to customize JMP and share analytical methods and discoveries with the entire JMP community in your organization. This paper will discuss how menu and toolbar customization is different in JMP 9; what JMP administrators need to know about managing custom menus in large JMP deployments; and different approaches to using JMP add-ins to customize the JMP environment and empower JMP experts in your organization to share their expertise with others.

Thursday, Sept. 15, 2:45 p.m. – 3:45 p.m.

Colorado A & B

Gage Repeatability and Reproducibility vs. Evaluating the Measurement Process: What Is the Difference?

José G. Ramírez, PhD, Industrial Statistician – Amgen
Laura Lancaster, PhD, Research Statistician Developer; Jon Weisz, MS, Senior Director of JMP Sales and Marketing – SAS

Gage repeatability and reproducibility (GR&R) studies (e.g., AIAG’s Measurement System Analysis) are traditionally used to assess the ability of a measurement system to give consistent readings when a measurement is repeated (repeatability) and when it is taken under different conditions (reproducibility). A GR&R study helps classify a measurement system as good, marginal or unacceptable. An Evaluating the Measurement Process (EMP) study goes beyond a GR&R by providing visual tools for learning about and improving the measurement process so it operates at its full potential. In this session, we show how to analyze the data from a traditional GR&R study using the Variability/Gauge Chart platform, as well as the EMP analysis using a JSL plug-in with SAS/QC® and an early view of the new EMP analysis planned for JMP 10.
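A sketch of the launch for a typical crossed gauge study (operator, part and measurement names hypothetical):

    // Variability chart for a crossed gauge study; the Gauge R&R and
    // EMP-style analyses are then options within the platform.
    vc = dt << Variability Chart(
        Y( :measurement ),
        X( :operator, :part )
    );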

Colorado C & D

Using JMP® for Advanced Warranty and Reliability Analytics

Sameer Vittal, PhD, Technical Leader – Analytics, GE Energy
Rob Reviere, Technical Architect – SAS

Most manufacturing companies face common challenges in understanding part returns/failure data and in predicting the financial impact of these losses. A wide variety of techniques has been developed, from reliability life data analysis to more complex actuarial simulation tools. In this paper, we explore a variety of methods available in JMP, from basic life data analysis (e.g., Weibull analysis) to proportional hazards models (both parametric and non-parametric) that can incorporate part configuration and condition monitoring/usage data. We also develop models that incorporate survival life/reliability as a function of usage, condition, configuration, geospatial and seasonality variables. In addition, the paper will provide examples of how these models can be combined with financial loss data to generate actuarial loss models via Monte Carlo techniques. All examples are based on real-life industry experience; the data have been scaled or standardized appropriately to avoid divulging proprietary information.
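Two of the building blocks named above, stated for reference: the Weibull reliability function

    \[ R(t) = \exp\!\left[ -\left( \frac{t}{\eta} \right)^{\beta} \right], \]

with shape \( \beta \) and scale \( \eta \), and the proportional hazards model

    \[ h(t \mid \mathbf{x}) = h_0(t)\, \exp(\mathbf{x}'\boldsymbol{\beta}), \]

in which usage, configuration and environmental covariates \( \mathbf{x} \) scale a baseline hazard \( h_0(t) \).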

Colorado G & H

Spotting the Cheetah with JMP® 

Sky Alibhai, DPhil, Zoë Jewell, VetMB, Project Co-Leaders – WildTrack

As a follow-up to last year's Discovery Summit talk about visualizing data (specifically animal footprints), we'd like to take this year's attendees on a trip to Namibia to see JMP in action. One new project for us is monitoring a global icon: the cheetah. This is a story of groups from different disciplines and geographic regions working together to provide solutions that not only help save Africa's most endangered cat, but have many applications in medical imaging, forensics, earth sciences and industry. In our talk, we'll demonstrate how we've helped introduce new image-processing capabilities into JMP toward automated extraction of footprint images from similar substrates – a huge challenge in computer vision. We'll showcase a new and robust cross-correlation discriminant analysis model built with JMP Scripting Language that allows us to identify individual animals from their footprints. We'll show how, once individuals have been identified, the JMP 9 mapping capabilities help us monitor their ranging patterns. The whole process will illustrate the flexibility and power of JMP to rise to challenges in a number of different fields. Of course, we will show JMP live for maximum impact. And we'll also illustrate key points in this presentation with video footage gathered in May 2011 at the first WildTrack workshop for carnivore monitoring, in Namibia.

Colorado I & J (30-minute talk)

Yield Modeling and Patterns Prediction Using Data Mining: A Preventive Approach

François Bergeret, PhD, Statistician – Ippon Innovation
Vincent Barec; Alexandre Couvrat, Yield and Defectivity Engineers – Soitec

For complex process manufacturing, yield and defects depend on a huge number of factors. Yield enhancement tools already in production help Soitec detect most process issues (a quick, reactive approach). This paper presents two new data mining tools, created with JMP and JMP Scripting Language (JSL), that are now used for a preventive approach. Yield Model detects the main factors related to yield. These factors are then included in a model to provide accurate predictions. Simulations — based on manufacturing volume, process tool loading and throughput — are available for the user to predict variations in yield and defects. Yield Model is a very interactive tool, and its advanced JSL capabilities will be demonstrated live during the presentation. For the next SOI (Silicon on Insulator) generation, within-part thickness uniformity is the main yield driver. Pattern Prediction uses the JMP 9 Neural platform to analyze and predict thickness uniformity, with very promising results. The user trains the neural net on a learning sample, and the neural net automatically predicts the pattern for future parts with good accuracy, as assessed on a validation sample. This paper covers the advanced JSL features used in Yield Model and demonstrates the power of neural nets to accurately predict complex patterns. The two tools presented here are preventive tools, helping Soitec save time and improve quality.
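A hedged sketch of the train-and-validate step in the Neural platform (column names hypothetical):

    // Train on the learning sample, assess on the held-out sample.
    nn = dt << Neural(
        Y( :thickness_uniformity ),
        X( :tool, :recipe, :radial_position ),
        Validation( :val_set )   // column marking training vs. validation rows
    );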

Colorado I & J (30-minute talk)

Enabling Powerful Analytics with JMP® Scripting for SAS® and R Users

Matthew Flynn, PhD, Director, Claim Research – The Travelers Companies Inc.

Analytics are viewed as increasingly important for competitiveness in the Property and Casualty (P&C) insurance marketplace. Researchers in insurance utilize the depth of SAS Analytics and data integration daily. Adding JMP to the toolkit enhances the researcher’s intuition and communication of analytic results by allowing easy visualization of SAS data with JMP. Scripting in JMP also facilitates the rapid development of tools that allow one to “share well with others” – that is, to enable business users to access powerful analytic tools. This paper will highlight new features in JMP 9 that utilize JMP integration with SAS and R, aiding the solution of business problems in P&C research analytics. JMP has always provided a number of ways to interact with other systems – for instance, connecting to local and remote SAS servers, executing SAS code and exchanging data programmatically. In this paper, we will explore multiple methods that we have found useful in our analytic work for interacting with the statistical package R from within JMP via JMP Scripting Language. Example methods include batch file-based communication, Windows OLE/COM communication, socket communication, and using the Load Dll functionality to call R functions. I will also highlight the power of the new JMP 9 functionality for accessing R and show examples of extending JMP by accessing the Microsoft Windows API via the JSL Load Dll function. Examples include accessing Windows environment variables; controlling and positioning JMP windows; defining hot keys; importing and exporting data to Microsoft Excel; and GUI automation testing via sending keystrokes.
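Hedged sketches of three of the integration paths mentioned; connection details, and the exact Declare Function signature, are environment-specific assumptions:

    // R: run a model in-process and pull results back.
    R Init();
    R Submit( "m <- lm(dist ~ speed, data = cars); cf <- coef(m)" );
    cf = R Get( cf );
    R Term();

    // SAS: after connecting to a SAS server (details omitted), submit code.
    SAS Submit( "proc means data=sashelp.class; run;" );

    // Windows API via Load Dll (declaration shown is an assumption).
    k32 = Load Dll( "kernel32.dll" );
    k32 << Declare Function( "GetTickCount",
        Convention( STDCALL ), Alias( "GetTickCount" ), Returns( UInt32 ) );
    ticks = k32 << GetTickCount();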

Posters

Discover posters featuring forward-looking applications.

Poster 1:

Supply Chain Risk Mitigation in Medical Device Manufacturing: Using JMP® to Analyze Complaints, Improve Design, and Control an Assembly Process

Joel Phillips, Director of Quality - Incisive Surgical

Poster 2:

SAS® BI and JMP® Integration for Signal Detection of Adverse Drug Events in the FDA/MAUDE Database

David Olaleye, PhD, Statistician and Software Developer - SAS

Poster 3:

Pricing Analytics Application in Product Cost Using JMP® Scripting

Emma Johnson, Business Analyst - Cypress Semiconductor

Poster 4:

Using JMP® to Teach Business Statistics: Cases and Applications

Marlene Smith, PhD, Associate Professor of Quantitative Methods - The Business School, University of Colorado Denver

Poster 5:

Having Fun with SPC Using JMP® Scripting Language

Wenyu Su, Applied Statistician - The Dow Chemical Company

Poster 6:

Automated Reporting: JMP® Graphics and JSL for Benchmarking Dashboards

Erin Vang, Principal Pragmatist - Global Pragmatica
Karen Copeland, PhD, Principal Statistician - Boulder Statistics

Poster 7:

Method of Forming a Self-Damping Sine-Wave Spring System with an Exponential Load Range

Walker Lee Guthrie, Owner and Developer - Pre-Stress Products, Quantum Spring Division

Poster 8:

Variance Budgeting and Monte Carlo Techniques to Aid Specification Setting

Ronald Andrews, Senior Process Engineer; Joseph McGee, Principal Scientist - Bausch + Lomb

Poster 9:

JMP® and Visual Six Sigma: A Recipe for Optimal Healthcare Outcomes

William Baum, Consultant - Cactus Consulting
Philip Ramsey, PhD, Consultant - North Haven Group

Poster 10:

Stochastic Simulation of Laundry Cold Water Temperature in North American High-Efficiency Washers

Mark G. Cipollone, PhD, Senior Scientist - The Procter & Gamble Company

Poster 11:

A System for Analyzing Drug Safety Data Using JMP®

Peter Mroz, Manager and Statistical Programmer - Johnson & Johnson

Poster 12:

Hey! Measurement Variation! Get Off My Manufacturing Floor!

Spiro Atieh, Quality Engineer - Specialty Silicone Fabricators

Poster 13:

Role of Statistics in Quality by Design (QbD) Implementation in Product Development

Inna Ben-Anat, QbD Strategy Leader - Teva Pharmaceuticals USA

Poster 14:

Did It Have an Impact or Not? Using JMP® for the Design and Analysis of an Experiment in a Vaccine Product Root Cause Investigation

Janet Alvarado, Senior Project/Process Statistician - Merck & Co.

Poster 15:

Moving from SPSS to JMP®: A Transition Guide

Jason Brinkley, PhD, Assistant Professor of Biostatistics - East Carolina University

Poster 16:

Comparison of K-means, Normal Mixtures and Probabilistic-D Clustering in Segment Profiles for B2B Markets

Goutam Chakraborty, PhD, Professor; Satish Garla, Student - Oklahoma State University
Gary Gaeth, PhD, Professor - University of Iowa

Poster 17:

You Can’t Experiment on That! The Use of Stochastic Modeling and Experimentation to Optimize Processes

Kurt Johnson, Director of Office of Metrics - LDS Church

Poster 18:

Using SAS® and JMP® to Monitor Energy Conservation Efforts

John Mills, Principal Analyst; Sarah Thornton, Analyst - Progress Energy

Poster 19:

Reliability Parameters Z-score Monitoring

HanCheng Ong, Staff Engineer - Seagate Technology International

Poster 20:

Using Dynamic JMP® Dashboards to Provide Decision Analysis Support for Critical Staffing Metrics

Brad Radichel, Program Manager - US Department of Veterans Affairs
Jay V. Schindler, PhD, Health Informatician - Northrop Grumman

Poster 21:

Using Stepwise Regression to Find Possible Interactions in a Non-Interaction Model DOE

Jed Campbell, Quality Manager - US Synthetic

Poster 22:

JMP® Data Server for Multi-User Organization with Diverse Data Sources

Neal Verzwyvelt, Staff Software Engineer - Atmel Corporation

Poster 23:

Automating Process Capability Analysis Reporting for Multi-Product Parameter Sets

Thomas D. Steinmetz, Senior Automation Analyst; Byron A. Wingerd, PhD, Process Engineer - Merck Manufacturing Division, Viral Vaccine Technology and Engineering

Poster 24:

Linkage Map Optimization in JMP® Genomics to Improve Agronomic Crops

Eric Jackson, PhD, Research Molecular Biologist - Agricultural Research Service (USDA)
Rebekah Oliver, Research Scientist - Agricultural Research Service (USDA)
Kelci Miclaus, PhD, Statistical Geneticist, JMP Life Sciences - SAS

Poster 25:

Data Mining with JMP® - Product Performance and Customer Behavior

Peter Wiebe, PhD, R&D Senior Quality Engineer — Abbott Laboratories

Poster 26:

Adventures in Solid Phase Optimization - How I Learned to Stop Worrying and Love DOE

Steve Figard, PhD, Research Scientist - Abbott Laboratories