Presentation and Poster Abstracts
- Tuesday Breakouts
- Wednesday Breakouts
- Poster Sessions
Tuesday, September 14, 1:20 p.m.
Bradley Jones, PhD, JMP Director of Research and Development – SAS
Lou Valente, JMP Product Manager – SAS
There is a general perception that designed experiments are esoteric and a tool only for the advanced user. We believe this perception is false and counterproductive because it provides a rationalization for not doing experiments. In this presentation we will design, perform and analyze an experiment to show how straightforward the process of experimentation can be.
Steve Figard, PhD, Senior Scientist – Abbott Laboratories
Most software packages that create graphs or otherwise present data graphically provide today’s users with an unprecedented (and sometimes confusing) plethora of options to use and graphs to make. Data analysis and presentation are greatly facilitated by the proper use of graphics. Indeed, the Graph Builder platform of JMP® 8 is the greatest thing since sliced bread for the rapid visual presentation of quantitative information. With the great power of these and other graphics packages comes not just great responsibility (thank you, Spidey), but also an increased probability of unclear, erroneous and outright abusive graphs born of simple ignorance of the principles of good graph construction. The software can only provide so much protection, for as Rick Cook (science fiction author, The Wizardry Compiled) has so well observed, “Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning.” The objective of this presentation is to provide the audience with principles gleaned from such giants as Cleveland and Tufte (along with anyone else I find in the interim), within the context of JMP, in an effort to combat graphic entropy.
A JMP®-Based, User-Friendly Analysis and Presentation System for Consumer Test Evaluation and Interpretation in the Food Industry
Jeff Stagg, Principal Scientist – Kraft Foods
David Rose, Implementation Consultant – SAS UK
Kraft Foods promotes statistical thinking and best practices globally with the development of software tools that execute complex data handling and calculation. The choice of software platform depends on both software capability and customer requirements to determine the optimal cost/quality solution. JMP software was selected as the platform to provide a CONSUMER TEST EVALUATION tool for the global Sensory/Consumer function in Kraft Foods. It replaces a set of Kraft Foods-developed data analysis macros written in various other software packages. JMP scripts combine data handling and calculation requirements with novel high-quality graphical output tailored for import into PowerPoint presentations. Products are assigned names, labels and multiple color profiles that are automatically used in all graphical routines. Statistical analysis methods within the Kraft Foods application include principal components analysis (PCA), regression, clustering and multiple range tests; graphical output includes customized histogram, frequency, stacked, scatter and star plots, as well as both 2D and 3D PCA charts. Importantly, all native JMP software capabilities remain available to users of the package, but the Kraft Foods application product names and color profiles are not tied to these procedures. The CONSUMER TEST EVALUATION tool has halved analysis time, significantly reduced training time, minimized calculation error, ensured and increased use of Kraft Foods best practices, and reduced software costs. Its success results from a close collaboration between Kraft Foods U.K. Banbury and SAS Professional Services in Marlow. The application is now deployed globally under International and U.S. Master License Agreements, which enables flexible deployment to an evolving customer base.
Peter Wiebe, PhD, Senior Quality Engineer – Abbott Laboratories
Krista M. Birch, ASQ CSSBB; Moria Trautman; Tony Orzechowski, MS, ASQ CSSBB; Co-Authors – Abbott Laboratories
The JMP Scripting Language (JSL) allows a user to harness the powerful analytics and graphing capabilities of JMP to generate dynamic customized applications. Three critical elements of an application are the user interface, the analysis and the output. The design of these elements depends on the business need of the application; however, JSL offers the flexibility to tie them together while maintaining the performance intrinsic to JMP software. As an example, we have developed a JMP software application that uses historical performance data to predict future process capability. Unlike the built-in functions in JMP, we perform this using a bootstrapping simulation method. The user has the option to use a random normal distribution with a specified mean and CV, or to reference any table of historical values, so that no assumptions regarding the distribution are required. We have added the ability to dynamically add other sources of variation, either through the user interface or through a referenced data table. The analysis is performed using matrix algebra, and the output is created with existing JMP graphing and analysis tools. Outputs that previously required hours of repetitive manual analysis now take a matter of seconds. The objective is to emphasize that with JSL we can create applications that have the power and feel of features intrinsic to JMP, yet meet a specific business need.
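The bootstrapping idea described above can be sketched in a few lines. The following Python sketch is illustrative only — the function name, specification limits, lot size and data are invented for the example, not taken from the Abbott application — and shows how resampling a table of historical values with replacement yields a distribution of future capability (Cpk) without any distributional assumption:

```python
import random
import statistics

def bootstrap_capability(history, lsl, usl, n_boot=2000, lot_size=30, seed=1):
    """Resample historical values with replacement to estimate the
    distribution of future Cpk -- no distributional assumption needed."""
    rng = random.Random(seed)
    cpks = []
    for _ in range(n_boot):
        lot = [rng.choice(history) for _ in range(lot_size)]
        mean, sd = statistics.fmean(lot), statistics.stdev(lot)
        cpks.append(min(usl - mean, mean - lsl) / (3 * sd))
    cpks.sort()
    # report the 5th percentile and median of the bootstrap distribution
    return cpks[int(0.05 * n_boot)], cpks[n_boot // 2]

# invented stand-in for a table of historical performance values
gen = random.Random(7)
history = [gen.gauss(10, 0.5) for _ in range(200)]
low, mid = bootstrap_capability(history, lsl=8, usl=12)
```

Reporting a lower percentile of the bootstrap distribution rather than a single point estimate conveys how much the capability of a future lot could plausibly vary.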
David Meintrup, Professor of Mathematics, Statistics and Operations Research – University of Applied Sciences, Germany
Using the JMP Scripting Language (JSL) for graphs joins two of JMP's best capabilities: the flexibility of adding features that the scripting language offers, and the interactive visualization that makes JMP graphs indispensable. We will start with a short introduction to the different options for scripting graphs. The main focus will be a wide range of scripted graphical applications identified as important through customer interactions. We will start with an easy example script for graphical clustering and then present an adverse event-relative risk map (drug development), wafer maps (semiconductor) and a visual tool for testing significant differences between the partial areas under two ROC curves (diagnostic medicine). We will end with a presentation of the concept of a row-based graphical object and show its applications, including choropleth maps and visualization of gene expression.
Tuesday, September 14, 2:40 p.m.
José G. Ramírez, PhD, Industrial Statistician – W.L. Gore and Associates Inc.
Most, if not all, industrial experiments face restrictions in randomization due to hard-to-vary factors or different sizes of experimental units. These situations induce what is called a split-unit, or split-plot, structure. This structure must be taken into account when designing the experiment, to reflect how the experiment will actually be run, and in the analysis, to avoid drawing the wrong conclusions. JMP’s Custom Design is a very flexible and powerful platform that allows us to easily tailor the design to the experimental situation at hand. In this tutorial, I will give an overview of important considerations when designing an experiment, such as treatment and unit structure, as well as tips for easily recognizing a split-unit structure. I will also walk you through the design of a split-split-plot experiment, exploring the unique Design Diagnostics of JMP’s Custom Design platform and some of its options, including Optimality Criterion, Simulate Responses and Advanced Options. If time permits, I will cover the analysis of split-unit experiments using the Fit Model and Graph Builder platforms.
Philip Ramsey, PhD, Consultant and Educator – North Haven Group and University of New Hampshire
Visual Six Sigma (VSS) emphasizes the use of dynamic data visualization to accelerate the process of transforming data into sound business decisions. The analytic and graphical tools generally used in Six Sigma today have changed little since the introduction of Six Sigma in the late 1980s. In contrast, VSS seeks to leverage the powerful dynamic visualization and enhanced modeling capabilities in JMP in support of sound and timely improvements to business processes. In this talk, three case studies – drawn from health care, aerospace and manufacturing – will be used to demonstrate how VSS, paired with JMP 9.0, can make the process of identifying root causes and formulating effective solutions lean. In addition to illustrating the powerful interactive visualization capabilities in JMP 9.0, new features in the Partition and Neural Net platforms will be demonstrated.
A Fresh Take on an Old Problem: Using Analytics to Gain New Insights into Ways to Reduce Fatal Motor Vehicle Accidents
Janel Nixon, PhD, President and Founder – Integrative Engineering LLC
Each year, 40,000 Americans die in motor vehicle crashes, making them the leading cause of death among Americans between the ages of 3 and 36. According to the U.S. Department of Transportation, the total societal cost of these crashes exceeds $200 billion annually. Despite these staggering statistics, many research efforts to date have failed to draw the sort of definitive conclusions that could lead to beneficial policy changes or new regulation. Researchers have attempted to detect trends involving the driver’s age or gender, vehicle speed, time of day or month, or the weight of the vehicle. The primary shortcoming of these studies is that they all tend to view the vehicle in a vacuum – they look at the data associated with individual vehicles without accounting for the interaction between the vehicles that crashed into each other. This paper explores the hypothesis that individual vehicle characteristics are less important than the compatibility of the vehicles involved in the accident. It is not so much the weight of a vehicle that matters, but rather the discrepancies in weight, center of gravity and bumper height between the involved vehicles. This paper demonstrates how JMP is used to test this hypothesis and, finally, to detect a trend that provides new and valuable insight into this problem. It also stands as a case study for how analytics can be used across organizations to guide policymaking and new regulation in a way that delivers the maximum benefit to society.
Takenori Takahashi, PhD, Professor – Keio University
Nowadays, robust design is an indispensable tool for manufacturing, and until recently the Taguchi method has been the primary approach to it. This paper proposes Hyper-structure Optimization for Prospective Engineering (HOPE). The signal-to-noise ratio (SNR) solution drawn from the Taguchi method is included as a special case within HOPE theory. Using the same crossed array experiment data, HOPE theory produces multiple reasonable candidate solutions, whereas the Taguchi method yields a unique SNR solution. All of these candidates are considered for the final solution, and therefore their reproducibility has to be examined by confirmation runs. The final solution is selected from the confirmed candidates whose reproducibility is clearly proven. HOPE software, a JSL application for JMP 8 developed in cooperation between Takahashi and SAS Japan, supports both methods and enables us to perform robust design and confirm reproducibility easily and quickly. Many advantages of HOPE theory are demonstrated with paper helicopter designs, an educational exercise used worldwide. HOPE theory is being taught and implemented in many Japanese enterprises, and the presentation will demonstrate use of this software in real, successful cases.
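For readers less familiar with the Taguchi half of this comparison, the signal-to-noise ratio that HOPE generalizes is a simple summary statistic computed for each control-factor row of a crossed array. A minimal Python sketch of the nominal-the-best form (the helicopter flight times are invented for illustration):

```python
import math
import statistics

def snr_nominal_the_best(y):
    """Taguchi nominal-the-best signal-to-noise ratio: 10*log10(ybar^2 / s^2).
    y holds one response per outer (noise) run for a single inner-array row."""
    ybar = statistics.fmean(y)
    return 10 * math.log10(ybar ** 2 / statistics.variance(y))

# invented flight times (seconds) of one paper helicopter design,
# measured under five noise conditions of a crossed array
times = [2.9, 3.1, 3.0, 2.8, 3.2]
print(round(snr_nominal_the_best(times), 2))  # → 25.56
```

HOPE, as described above, treats the settings that maximize this unique SNR optimum as just one member of a larger family of candidate solutions.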
Tuesday, September 14, 4:00 p.m.
Mark Johnson, PhD, Professor – University of Central Florida
Bradley Jones, PhD, Co-Author – SAS
Custom Design in the DOE platform of JMP produces nonregular designs with desirable projection properties (Loeppky, Sitter and Tang, 2007). Jones and Montgomery (2010) further investigated nonregular designs to identify designs that allow for unambiguous estimation of main effects and a few two-way interactions. Adoption of these partially confounded designs by a statistical community accustomed to fully confounded designs could be enhanced by demonstrating their classical design features. For example, the recommended nonregular design for six factors in 16 runs can be shown to arise from a 2^4 full factorial in factors A, B, C and D with additional factors introduced via the generators E = ½(AC+BC+AD-BD) and F = ½(-AC+BC+AD+BD) to yield the principal fraction. The remaining 26 two-level, nonregular orthogonal designs for six factors in 16 runs will be examined from the classical design perspective. Finally, an example involving the design of an experiment to assess arson fire patterns will be described, in which the aforementioned nonregular design is ideally suited.
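The generator arithmetic quoted above is easy to verify directly. This short Python sketch (a check written for this summary, not part of the paper) builds the 16-run design from the 2^4 full factorial and confirms that E and F, despite their half-weight generators, come out as genuine two-level columns orthogonal to everything else:

```python
from itertools import product

# Build the 16-run design from the 2^4 full factorial in A, B, C, D,
# adding E and F via the half-weight generators quoted in the abstract.
rows = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = (a * c + b * c + a * d - b * d) // 2
    f = (-a * c + b * c + a * d + b * d) // 2
    rows.append((a, b, c, d, e, f))

cols = list(zip(*rows))
# E and F are genuine two-level columns ...
assert all(v in (-1, 1) for v in cols[4] + cols[5])
# ... and all six columns are mutually orthogonal (zero inner products)
assert all(sum(x * y for x, y in zip(cols[i], cols[j])) == 0
           for i in range(6) for j in range(i + 1, 6))
```

Running the script confirms that the result is an orthogonal two-level array, which is what gives the design its classical flavor despite the partial confounding.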
Martin Owen, Manager, Chemical Development – GlaxoSmithKline
"It's not what the community can do for us, it's what we can do for the community." My intent is that this presentation be a continuation of a conversation rather than a monologue. At the 2009 Discovery Conference and Innovators’ Summit in Chicago, my world was illuminated by the quality of the “big ideas” within the JMP community. But ideas are not enough. Innovation needs to be Developed, and others Empowered to Apply these ideas in a Sustainable way to add value. I intend to discuss “light-switch moments” and how they relate to the experimental design activities I’ve been involved with in the past and the Quality by Design activities that I am developing for GSK’s future. I will take you through the IDEAS journey – describing how to Innovate, Develop, Empower, Apply and Sustain – sharing the challenges of how we are using JMP software and working with the JMP community to accelerate our path along this journey.
Zoë Jewell, VetMB; Sky Alibhai, DPhil, Project Co-Leaders – WildTrack
2010 is the United Nations International Year of Biodiversity. We rely on the diversity of life to provide us with food, fuel and medicine, yet our activities are causing biodiversity to vanish at unprecedented rates. Twenty-five to 36 percent of mammal species are now in danger of extinction, and with them these resource-rich ecosystems. WildTrack (www.wildtrack.org) is working with JMP software to develop a footprint identification technique (FIT) that is helping to address this crisis. It’s a whole new approach to wildlife monitoring. Based on the ancient art of animal tracking, FIT distills traditional animal tracking skills into a robust scientific tool for endangered species conservation. We have more than 20 different species and project partners around the world. Our data have three levels of complexity:
- Each species has unique foot geometry.
- Each individual within a given species has a unique footprint.
- Each time an individual places its foot it leaves behind a unique print.
Using JMP software, FIT extracts and distills data collected from footprints to identify animals at the species, individual, age-class or sex levels, with high accuracy. We will present the inspiration behind FIT and describe data collection in field situations ranging from work with the black rhino in the African bush to the polar bear in the Arctic tundra. We will show new developments in image-capture developed with tigers in North Carolina, and new techniques in image-manipulation in JMP. We will demonstrate statistical verification of FIT using multidimensional scaling and bagged prediction models in JMP.
Greg Flexman, Process and Risk Analysis – Talecris Biotherapeutics
As paper measurement and recording systems are replaced by advanced monitoring and electronic databases, there has been a proliferation in the amount of data generated and collected. The evolution and application of analytical tools to review this accumulation of data has in some cases lagged behind the volume of data collection. As the complexity of the technology required to archive and manage these volumes of data increases, process owners can be left without a clear window into actionable understanding. Specialized software tools capable of consolidating and analyzing data from such reservoirs can require significant capital investment, and their interpretation can be daunting for the non-analyst. This example JMP scripted application demonstrates how these issues were addressed in a cost-effective manner by developing a graphical SPC dashboard linked via ODBC to a data collection system. This enables the broad distribution of basic process monitoring tools in the organization, allowing production and quality personnel to monitor operations more efficiently and frequently. The graphics were designed to be rapidly and visually interpretable, and they utilize a combination of SPC charts. The tool was developed without capital investment or external resources, and it may be applied across all of the attributes archived in this system. The presentation will review not only the overall scripting architecture and graphical features employed, but also the business factors related to process monitoring and the deployment of a scripted analytic tool within an organization.
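To give a concrete flavor of what such a dashboard computes, here is a minimal Python sketch of individuals-chart control limits, one of the standard SPC charts this kind of tool displays. The function name and data are invented; the d2 = 1.128 moving-range constant is the standard value for subgroups of size 2:

```python
import statistics

def individuals_limits(values):
    """Center line and 3-sigma limits for an individuals (I) control chart,
    estimating sigma from the average moving range (d2 = 1.128 for n = 2)."""
    center = statistics.fmean(values)
    mrbar = statistics.fmean(abs(b - a) for a, b in zip(values, values[1:]))
    sigma = mrbar / 1.128
    return center - 3 * sigma, center, center + 3 * sigma

# invented process readings, as if pulled from the database via ODBC
data = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7, 10.0, 10.4]
lcl, cl, ucl = individuals_limits(data)
```

Publishing limits like these on a dashboard, rather than raw tables of values, is what lets non-analysts read process health at a glance.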
JSL to the Rescue: Coping Strategies for Making Sense of Real-Time Production Data in Vaccine Manufacturing
Andrea Coombs, MS, Operations Research Analyst – Emergent BioSolutions
Emergent BioSolutions Inc. is a biopharmaceutical company that manufactures BioThrax® (Anthrax Vaccine Adsorbed), the only vaccine approved by the U.S. Food and Drug Administration for the prevention of anthrax infection. BioThrax® production includes a two-stage fermentation of a non-virulent strain of Bacillus anthracis. In-line turbidity, temperature and pH probes are used to monitor the fermentation process, and real-time data can be viewed remotely and analyzed. JMP has been used to create a customized workflow that summarizes data from multiple production trains. A “super” journal is the command central of this workflow, containing JSL scripts to access, process, track and analyze data. Central to the workflow are scripts that process real-time data into meaningful process-critical variables, verify those variables to ensure accuracy, and dynamically handle and visualize the data. As a result, an overwhelming amount of real-time production data has been summarized into the key variables used to monitor, characterize and improve the fermentation process.
Wednesday, September 15, 1:20 p.m.
Bradley Burch, Founder – NFL Data Visualization (NFLDV)
The National Football League (NFL) is more popular than ever before. NFL fans and analysts thirst for more information about the performance of their favorite teams and players. There are hundreds of websites that specialize in providing users with tabular data, all similar in format and content. But the NFL fan deserves more. NFL Data Visualization uses JMP to analyze large amounts of NFL historical data for the purpose of visually displaying correlations between offense, defense and other categories normally consumed in tabular formats. Sophisticated outputs such as bubble plots, contour maps, overlay plots, tree maps – and many other JMP visualizations – allow NFL fans to discover trends and performance measures in new ways. JMP has revolutionized the way the non-statistician NFL fan can now consume football statistics and analyses. Now that sounds like fun!
Information Visualization and Visual Data Mining with a Focus on New Graphic Tools Using Both JMP® and SAS® for Mortgage Fraud
Anders Hasselrot, PhD, Data Mining Manager – Lloyds TSB
Daniel Keim, PhD, Professor and Head of the Data Analysis and Visualization Research Group, Department of Computer Science – University of Konstanz
Matthias Schaefer, MS, Co-Author, Department of Computer Science – University of Konstanz
Never before in history have data been generated at such high volumes as today, and exploring and analyzing these vast volumes becomes increasingly difficult. Information visualization and visual data mining can help to deal with the flood of information. The advantage of visual data exploration is that the user is directly involved in the data mining process. A large number of information visualization techniques supporting the exploration of large data sets have been developed over the last decade. In this paper, we propose a classification of information visualization and visual data mining techniques based on the data type to be visualized, the visualization technique, and the interaction and distortion technique. We demonstrate the classification using a few examples, most of them referring to techniques and systems presented in the literature.
Identification of Rogue Tools and Process Stage Drift by using JMP® Software Visualization and Analytical Techniques
Jim Nelson, Engineering Data Specialist – Freescale Semiconductor
The making of integrated circuits (computer chips) requires hundreds of very exacting steps, carried out on hundreds of different pieces of equipment. Process drift and yield drops are a feared fact of life in the semiconductor industry. Identifying which process steps, tools or combinations of tools contributed to a process drift or yield drop is a time-consuming and complex process, taking up a considerable amount of engineers’ time within Freescale. A combination of analytical tools delivered with JMP software, Freescale-developed statistical methodologies, and Freescale-developed analytical and visualization JMP scripts is being used to solve these issues. This paper describes the use of partitioning, a Freescale-developed methodology called Step Origin of a Drift Analysis (SODA), and two Freescale-developed JMP software scripts. The scripts allow the charts and graphs to be ordered based on statistical evaluation, along with the ability to move from chart to chart very quickly. This methodology couples analytical guidance with tools that give engineers visual evidence to match their knowledge and experience in deciding what needs to be done to bring the processes back into alignment.
Gillette – The Best a Man Can Get. Let’s Discover How JMP® Helps You Get the Best Shave in the Morning.
Elisa Bonvini, PhD, Statistician – Procter & Gamble
Everyone knows Gillette and its famous slogan – The Best a Man Can Get. But how do we make sure we deliver on the promise? Although one might not see it, there is a lot of science and technology behind those blades. And a lot of variables, too. Together they play an important role in delivering a close and comfortable shave to men all over the world. At Gillette we are increasingly using JMP and the design of experiments (DOE) platform to model both technical data and consumer responses. We start with technical DOE, where we optimize our settings to achieve optimal performance. Once we identify our best players, we then use DOE to understand consumers’ reactions and preferences. This continuous, iterative process is used throughout the product development journey as we make our products better and better. Case study: the latest augmented design to understand consumers’ responses to various blade geometry options.
Kim Mitchiner, PhD, Principal Member of the Technical Staff – Sandia National Laboratories, Albuquerque, NM
Arthur P. Heath, MS; Thomas K. Mack; Larry D. Miller, MS; Carmella A. Varoz, MS, Co-Authors – Sandia National Laboratories, Albuquerque, NM
JMP and design of experiments (DOE) have been successfully applied to security system technologies ranging from sensors to communication and display systems. In all cases, the technologies have been complex enough to warrant a statistical determination of significant factors and/or the generation of predictive models. For the sensors, the task was calibrating a fiber optic intrusion detection sensor (FOIDS) with 32 adjustable settings. In addition to the numerous settings, the FOIDS also had two software processors for detecting different types of alarms. The problem was made more complex when the different types of alarms occurred on the wrong processors, causing nuisance alarms. JMP’s ability to optimize several predictive models simultaneously with the Prediction Profiler’s flash files was an important factor in producing field solutions. For the Communications and Display testbed system, numerous hardware and software network components had been integrated to build a functional system. Although the components of the system had been tested individually, the system’s performance could not be evaluated piecewise. Through the application of JMP’s design of experiments and data mining capabilities, it was possible to test some of the factors affecting the system’s performance and to differentiate between some of the software and hardware contributors. This paper will discuss design of experiments and the JMP tools applied to the solutions for both security systems.
Wednesday, September 15, 2:40 p.m.
Making the Most of the New Mapping Features in JMP® 9
Xan Gregg, Software Development Manager – SAS
After touching on map visualizations prior to JMP 9, which involve custom graphics scripts, we explore the new mapping features in JMP 9 in depth. We cover the basic uses for maps that JMP 9 supports: providing context for geographic data and showing a measure that varies by map region (choropleths). Finally, we demonstrate JMP 9's ability to import ESRI SHP files and use custom maps. Along the way, we invite discussion on the role of mapping visualizations in general. Where do they provide insight, and where do they lead to distortion?
Julia O’Neill, MS, Principal Engineer – Merck & Co. Inc.
Vaccine manufacturing is a complex biological process composed of many steps carried out over months or even years. Hundreds of raw material characteristics and process variables are monitored for every lot. Although the final vaccine product is well characterized and controlled, identifying the root causes of variation in the intermediate bulk material is extremely challenging. Teams of engineers, statisticians and scientists have begun to apply and develop data mining techniques to overcome these challenges. CUSUM sequence plots, partial least squares (PLS) regression and random forests have proven extremely valuable in recent projects. These data mining methods have set a new standard for vaccine root cause investigations within Merck. Their effectiveness will be illustrated with a case study.
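Of the data mining methods mentioned, the CUSUM is the simplest to illustrate. This Python sketch (parameters and lot values are invented, not Merck's) shows a tabular CUSUM flagging a small sustained upward drift:

```python
def tabular_cusum(values, target, k, h):
    """One-sided tabular CUSUM pair; k is the allowance (slack) and h the
    decision interval, both in the units of the data. Returns the indices
    at which either cumulative sum crosses the decision interval."""
    hi = lo = 0.0
    signals = []
    for i, x in enumerate(values):
        hi = max(0.0, hi + (x - target - k))   # accumulates upward drift
        lo = max(0.0, lo + (target - k - x))   # accumulates downward drift
        if hi > h or lo > h:
            signals.append(i)
    return signals

# invented lot measurements that drift upward after the fifth lot
lots = [100, 101, 99, 100, 100, 103, 104, 103, 105, 104]
print(tabular_cusum(lots, target=100, k=0.5, h=4))  # → [6, 7, 8, 9]
```

A sustained small shift like this can evade a point-by-point chart for a long time, while the cumulative sum makes the drift visible within a few lots — which is why CUSUM sequence plots are so useful in root cause investigations.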
Craig Burkhart, PhD, Global Materials R&D – The Goodyear Tire & Rubber Company
Philip Brown, PhD, Senior Applications Developer – Predictum Inc.
Goodyear has used nonlinear modeling methods for tire rubber compound property prediction for more than 20 years, but their use was limited. The roadblocks preventing widespread use of such models have been due, in no small part, to the lack of DOE (design of experiments), analysis, data mining and visualization tools integrated with these models. Integrating JMP with SAS has allowed Goodyear to replace an old, obsolete system with a modern, standards-based, full-featured system. Goodyear materials scientists have thousands of formulary ingredients to consider when designing a new rubber compound. With the new JMP and SAS system, a friendly and dynamic user interface allows compounders to import compound formulations, develop variants of previous compound recipes, model them under SAS and evaluate the properties of these models using JMP's modeling and visualization tools. All of this is done in the virtual world, allowing Goodyear compounders to fully consider potential recipes before producing them in the lab for physical evaluation. Goodyear now explores more possibilities in less time and with fewer resources, thereby accelerating and enhancing compound development by improving speed-to-understanding.
Vincent Wingate, Associate Director of Upstream Process Development and Plant Biology – Biolex Therapeutics
Lynn Dickey, PhD; Jeff Regan, Co-Authors – Biolex Therapeutics
Biolex Therapeutics uses a novel biomanufacturing platform (LEX SystemSM), the aquatic higher plant Lemna minor, genetically engineered to produce recombinant biopharmaceutical proteins. We use a small-scale process model to optimize critical factors to maximize the yield and quality of recombinant biopharmaceutical proteins. Using the design of experiments (DOE) capabilities in JMP, we will discuss what data are required before setting up a DOE, and how to set up a custom DOE and analyze the data. We will show how the small-scale model was used to identify and optimize the critical parameters and their optimal ranges (both environmental and media components) that affect the yield and quality of several different therapeutic proteins. We will also describe how Failure Mode Effects Analysis (FMEA) was performed to determine the effect of potential failures in critical upstream parameters on therapeutic protein yield and quality, and to assess the relevance of the small-scale process model to the large-scale production process and equipment.
Daniel Arneman, PhD, Environmental Analyst – University of North Carolina at Chapel Hill
James McAdam, PE, Co-Author – University of North Carolina at Chapel Hill
Building energy efficiency is receiving more attention as a cost-effective way to cut fuel consumption and greenhouse gas emissions, but efficiency advocates face one small problem: measuring what didn’t happen. Savings and return on investment calculations are based on avoided cost, and it’s difficult to know just how much energy a building might have consumed if left to business as usual. Standard methods often compare to previous billing cycles, but ignore the effects of weather or other mitigating factors. More nuanced engineering models are time-consuming, and are often too complex for the average stakeholder to implement and understand. This paper establishes a simple weather-based model in JMP that can predict building energy consumption from easily obtained monthly billing data. The models are highly visual, easily understood and can be used to quickly calculate cost, energy and carbon savings. The University of North Carolina at Chapel Hill is now using JMP to monitor and verify its Energy Conservation Program for an 18 million square foot campus, with a projected savings of more than $3 million and 50,000 tons of greenhouse gases per year.
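The weather-based model described here is, at its core, a regression of monthly energy use on degree days. A minimal Python sketch under that assumption (all billing and degree-day numbers are invented):

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x: a is the weather-independent
    baseline load, b the energy use per degree day."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    return ybar - b * xbar, b

# invented monthly heating degree days and metered energy use (kWh)
hdd = [700, 600, 450, 250, 100, 20, 10, 15, 120, 300, 500, 650]
kwh = [8200, 7300, 6000, 4200, 2900, 2100, 2050, 2100, 3100, 4700, 6500, 7800]
a, b = fit_line(hdd, kwh)
baseline_use = a + b * 400  # expected use for a 400-HDD month, pre-retrofit
```

Avoided energy is then the gap between this weather-normalized prediction and the metered use after a conservation measure, which is exactly the "measuring what didn't happen" problem the abstract describes.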
Wednesday, September 15, 4:00 p.m.
Patrick Biltgen, PhD, Senior Principal Systems Engineer – BAE Systems
The JMP Neural platform has evolved significantly since its introduction in JMP 6 and is used heavily in the aerospace and defense simulation community to approximate high-fidelity complex systems simulations. This paper will review and compare neural network models across common data sets created in the last three versions of JMP and comment on the improvements and advancements of this tool set for complex systems designers. By comparing goodness of fit statistics, ease of creation and flexibility to address multiple complex models, the author will evaluate the utility and flexibility of the latest neural platform and demonstrate how JMP 9 can be used to analyze and visualize the results of complex simulations. Examples will include large agent-based simulations and models with mixed continuous and discrete variables. The presentation will also demonstrate the use of the Prediction Profiler with multiple neural equations and the use of the Simulator to rapidly flood a large design space with many points to enable “inverse design” for capability-based analysis.
Marie Gaudard, PhD, Consultant – North Haven Group
This talk explores some of the new advanced predictive features of JMP Pro Version 9 in the context of a medical data mining application. Bootstrap forests and neural nets, as well as other techniques and the use of JMP Scripting Language, are employed in an effort to identify pulmonary embolisms using three-dimensional computed tomography (CT) data. The data we use for our training and validation sets formed the basis for the 2006 KDD (Knowledge Discovery and Data Mining) Cup competition. These data present interesting challenges: sparse and noisy data, multiple regions associated with a single pulmonary embolism, a spatial structure within patients, and non-traditional measures of sensitivity. We develop predictive models using the training data, choose a final model, and apply that model to the separate test set in order to assess its performance.
Theresa Utlaut, PhD; Georgia Morgan, MS, Statisticians – Intel Corporation
John Tukey said, “The greatest value of a picture is when it forces us to notice what we never expected to see.” Arguably, JMP’s preeminent feature is the ease with which its users can quickly create informative graphics that allow us to see what we may or may not have expected. The ability to create informative, decision-worthy graphics is especially important now, when data are growing in complexity and volume at a rate that is difficult to comprehend. An article in The Economist (February 2010) estimated that mankind will create 1,200 exabytes (an exabyte is 10^18 bytes) of digital data this year. The data deluge is in full swing! The visual tools provided by JMP facilitate our effort to extract information from data, but customization is often needed when faced with complex, high-volume data. With the additional benefit of the JMP Scripting Language (JSL), the visual analytics can be tailored to the problem at hand and our work made more effective. This paper will present examples that demonstrate the use of JSL to enhance JMP’s visual analytics and streamline the analysis process.
Brent Ginn, PhD, Senior Scientist – The JM Smucker Company
One of the most valuable features of JMP is the ability to visualize complex data and statistical analyses very efficiently. This visualization is valuable not only for the researcher, but also for management, as it helps the researcher translate results into relevant business terms. One area where this is highly valuable for CPG (consumer packaged goods) companies is the analysis and interpretation of sensory and consumer data. This paper presents the application of JMP to the analysis, interpretation and visualization of such data.
Zhou Wei, MS, JMP Principal Consultant – SAS Institute (Beijing) Co. Ltd.
Classic methodologies of design of experiments (DOE) are widely applied in design, manufacturing, quality management and beyond. However, classic DOE has a shortcoming: it neglects the propagation of error, which means that the supposedly improved process can still produce defects continuously. Visual tolerance analysis is a good way to overcome this obstacle; it effectively complements classic DOE and refines the final feasible solution. First, this paper briefly introduces the background and value of tolerance analysis. Then it highlights visual tolerance analysis, which features the Defect Profiler and the Defect Parametric Profile, two new visualization techniques. With the help of visual tolerance analysis, engineers and statistical analysts can work together to identify the key factors driving propagation of error and to reduce their impact significantly and cheaply. Finally, to improve analysis efficiency for statisticians and to ease understanding for non-statisticians, a worked example in JMP illustrates the power of this emerging engineering optimization approach in the DOE family.
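The defect-rate idea behind a tool like the Defect Profiler can be sketched as a generic Monte Carlo tolerance analysis (this is not JMP's implementation): propagate random variation in the input factors through a fitted transfer function and count how often the response falls outside specification. The transfer function, variation levels and spec limits below are all assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transfer function fitted from a designed experiment.
def response(x1, x2):
    return 10 + 2.0 * x1 - 1.5 * x2 + 0.8 * x1 * x2

# Factor set points with normally distributed manufacturing variation.
n = 100_000
x1 = rng.normal(loc=1.0, scale=0.10, size=n)
x2 = rng.normal(loc=0.5, scale=0.15, size=n)

y = response(x1, x2)

# Specification limits on the response (assumed for illustration).
lsl, usl = 11.0, 13.0
defect_rate = np.mean((y < lsl) | (y > usl))
print(f"estimated defect rate: {defect_rate:.4f}")
```

Re-running the simulation with one factor's variation tightened shows how much that factor contributes to the defect rate, which is the question the abstract's "key factor" analysis addresses.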
Discover Posters Featuring Forward-Looking Applications
Just as we hope you’ll spend time challenging our technical staff and planting seeds for future JMP features, we also hope you will spend time with our poster presenters, learning about their forward-looking applications of JMP.
Poster 1: Be Customer Wise or Otherwise: Combining Data Mining and Interactive Visual Analytics to Analyze Large and Complex Customer Resource Management (CRM) Data
Kam Tin Seong, Professor – Singapore Management University
Ji Junyao, Aditya Misra, Timothy Tan Chee Yong, Students – Singapore Management University
Poster 2: Using JMP® Statistical Discovery Software to Answer a Clinical Question
Edward Dinovo, IT Consultant – Sepulveda Research Corp.
Poster 3: Product Design Using JMP®: Solving a User Complaint
Chad Naegeli, Senior Design Engineer – Incisive Surgical Inc.
Poster 4: Product Shelf-Life Extension Based on Statistical Analysis of the Collected Stability Data
Inna Ben-Anat, Process Engineer – Teva Pharmaceuticals USA
Poster 5: Use of JMP® for High Explosive Formulation, Testing and Data Analysis
Paul Anderson, Chemist – US Army ARDEC Picatinny Arsenal
Poster 6: Multiple Comparisons with JMP®
Karen Copeland, Statistician – Boulder Statistics
Poster 7: Using JMP® Graphics to Explore Business Solutions
Wenyu Su, Applied Statistician – The Dow Chemical Co.
Poster 8: The Use of JMP® Control Charts for Corporate Customer Satisfaction Monitoring
Sarah Travaglio, Net Promoter Score Manager – Asurion
Poster 9: Risk and Return of Investments in Online Peer-to-Peer Lending
Ram Gopal, Professor and Department Head – University of Connecticut
Poster 10: Using Analytics to Reduce Carbon Emissions and Save Trees
Steve Fowler, Director of Continuous Improvement – First Solar Inc.
Poster 11: Fun with Dialog Boxes and the JMP® Scripting Language
Peter Mroz, Safety Data Programmer – Johnson & Johnson
Poster 12: Salesforce Effectiveness at Sears: An Analytic Approach to Goal-Setting
Jason Chavarry, Senior Analyst, Operations and Analytics – Sears Holdings Corp.
Poster 13: Measurement's Effect on Manufacturability, Morale, Customer Service, Progress
Spiro Atieh, Quality Engineer – Specialty Silicone Fabricators
Poster 14: The Role of Software in ‘International Statistical Standards’
Michele Boulanger, Senior Consultant – Johnson International Statistical Consulting Inc.
Laura Lancaster, JMP Software Developer – SAS
Poster 15: JMP® Analytics Applied in Diagnostic Radiology and Neurosurgery Trauma Research
Melvin Alexander, Operations Research Analyst – Social Security Administration
Poster 16: Energy Reduction in Glass Furnaces Using JMP®
Rui Abreu, World Class Manufacturing Coordinator – Saint-Gobain Glass Portugal
Poster 17: A Comparison of Oil Well Disasters in the Gulf of Mexico and Kuwait
Alissa Anderson and Kristen Immen, Students – NC State University
Poster 18: Is Population Weighting the Right ‘Fit’ for Air Pollution Trends?
Alissa Anderson and Kristen Immen, Students – NC State University
Poster 19: Emissions Uncertainty: Focusing on NOx Emissions from Combustion Sources
Emily Wisner, Alissa Anderson, Colin Geisenhoffer, Brody Heffner and Michael Shaw, Students – NC State University
Poster 20: Secondary Ozone Standard
Christine Wu, James Kniffen, Elena Beckman and Steven Crenshaw, Students – NC State University