Success Story

At Marelli, curiosity is the mother of innovation

Why the consolidation of big data management, analysis, data controls, prevention and prediction into a single, streamlined workflow is accelerating future automotive technology

Challenge
Embracing the opportunities afforded by a data-rich manufacturing environment also means confronting its challenges. At Marelli, this meant overcoming data access barriers and finding a way to deepen collaboration between data and domain experts.
Solution
Technical leadership at Marelli sought an analytics tool that could offer both sophisticated statistical analysis and a means of streamlining the entire manufacturing data workflow. By consolidating multiple steps into a single tool, JMP® statistical discovery software “offered a very powerful suite of tools… both from a statistical point of view and from a visualization point of view,” says Simone Cianchini, Worldwide Innovation and Improvements Quality Manager for Electronic Systems.
Results
Since its implementation, JMP has made data exploration a best practice within Marelli’s analytics workflows, enabling domain experts to quickly extract more value – and ultimately more insight – from their data. Furthermore, Cianchini says, this workflow has even helped to change the culture by empowering data and domain experts alike to approach data with a spirit of curiosity.

Marelli is one of the world’s leading global independent suppliers to the automotive sector, employing nearly 54,000 people around the world. With a strong, established track record in innovation and manufacturing excellence, the company provides technologies for several automotive product areas, including electronics and advanced interiors, automotive lighting and sensing technologies, e-powertrain and thermal management solutions for vehicle electrification, green technologies for exhaust systems, solutions for internal combustion powertrain and suspension systems, as well as motorsport.

The company’s unique technological vision, most evident in its fast-paced development cycles on connected, autonomous and shared electric mobility, is deeply aligned with a mature analytics culture Marelli leadership says is paramount to innovation. “The usage of data is key to accelerating decisions – and supporting those decisions with more robust arguments. To increase profitability, the key is to maximize this formula output: (speed x effectiveness)/cost. Data collection, visualization and analysis all support it,” says Simone Cianchini, Worldwide Innovation and Improvements Quality Manager for Electronic Systems at Marelli. 

A Six Sigma Master Black Belt with more than 25 years of experience in manufacturing operations, engineering innovation, integration, R&D and data science, Cianchini came to Marelli from semiconductor giant Vishay where he served as Senior Manager for Engineering and Integration. Agility with analytics, he says, is something that, over the course of his career, he’s come to see as the defining value of organizations driving meaningful innovation.

“In our modern environment, data science is critical. We can no longer make decisions based on intuition or abstract ideas,” he says. If organizations are to remain competitive, he argues, they must invest in strategic analytics enablement.

A streamlined and strategic data workflow is the cornerstone of successful analytics enablement

Well aligned with this ethos, Marelli has instituted an active Data-Driven Decision Making (3DM) program across its electronics business line. 3DM has standardized around a series of sequential data initiatives, Cianchini explains, providing a streamlined workflow that begins with big data management and carries through to analytics, data controls, prevention and prediction.

“Data alone doesn’t make things happen; rather, we as domain experts must make decisions based on insights we extract from our data,” Cianchini says. “Our goal is to leverage this information in every way we can to create opportunities, avoid mistakes and accelerate decision-making processes.”

To best deliver on the 3DM optimization promise, Cianchini argues one must first ensure that the data being collected is fit for purpose. To do so, he poses three key questions: Who is the customer for a given data usage? What is the customer’s expectation around this data usage? And what is the best method to communicate with the customer?

“First we have to think about the end user – be it engineering, operations, human resources, or even our CEO and CTO,” he reflects. “Everyone has different data management needs, so the voice of the customer is absolutely critical.” Furthermore, he adds, complexity doesn’t always add value. “Even a simple data visualization could generate a very interesting discussion. The data tell a story, every story can have different points of view, and every point of view can show you different information.” Working smarter, he says, often means working simpler. 

Dynamic visualizations cut through barriers and complexity 

Data visualization quickly unravels unnecessary complexity and has therefore become increasingly important at Marelli, where Cianchini says it both deepens collaboration and makes data insights more accessible to the domain experts who best understand the process, product or service. “If we work in isolation between these two functions and competencies, we lose out on speed – or we may arrive at the wrong conclusion altogether,” he says.

In his view, there are two ways to utilize data. First is a structured, methodical approach: the traditional scientific method proceeding from hypothesis to analysis to results. Second is an unstructured approach that values collaborative exploration. “These conversations don’t necessarily begin with a hypothesis – just curiosity,” he says, stressing that there is no perfect approach, as the only way to optimize decision making is to achieve the correct balance between structured and unstructured methods.

“In my experience, I’ve found that it’s always best to play with the data before moving on to a deeper analysis. And it’s especially valuable at this stage to bring in an outside perspective from someone who can pose disruptive questions that help domain experts gain a fresh perspective and move away from pre-packaged thoughts.”

Exploratory data analysis and visualization, he says, are some of the most underrated aspects of a robust analytics workflow. And having an analytics tool that makes exploration a standard step or best practice within that workflow can be transformative. “When I speak with my team, I tell them: always play with the click, be curious and dare.” And that’s where JMP® statistical discovery software comes in.

A consolidated analytics workflow in JMP® helps quickly improve product performance and uniformity 

The agility JMP brings to Cianchini’s team is perhaps best illustrated by a case in which engineers used the tool to swiftly improve product performance and uniformity. After observing actual product performance on the production line, the team first measured the product’s flatness at different X-Y positions, collected process output data from the line’s measurement system, and evaluated input information for each part.

“We then used a JMP table join to merge process input and output into one,” he explains. “This is a very powerful JMP feature that we have used in many different situations, particularly wherein there is a similar output file from the factory’s automated machinery.” The result, he adds, was a dynamic table that visualized both input and output data.
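JMP’s table join is a point-and-click operation, but for readers working outside JMP the same merge can be sketched with pandas. The column names and values below are hypothetical stand-ins for the two factory exports described, not Marelli’s actual data:

```python
import pandas as pd

# Hypothetical process-input export: one row per part.
inputs = pd.DataFrame({
    "part_id": [101, 102, 103],
    "input_1": [0.30, 0.35, 0.32],
    "input_4": [1.10, 1.05, 1.08],
})

# Hypothetical measurement-system export: flatness per part and X-Y location.
outputs = pd.DataFrame({
    "part_id":  [101, 101, 102, 102, 103, 103],
    "x":        [0, 1, 0, 1, 0, 1],
    "y":        [0, 0, 0, 0, 0, 0],
    "flatness": [0.12, 0.21, 0.11, 0.19, 0.13, 0.22],
})

# Rough equivalent of JMP's table join: match rows on the shared part
# identifier so each flatness reading carries its process inputs alongside.
merged = outputs.merge(inputs, on="part_id", how="left")
print(merged)
```

The key design point is the shared key column: as long as the automated machinery writes a similarly structured output file, the same join step can be reused across lines.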

“With some exploration, we were able to determine whether process input would be relevant,” he explains. “The speed to capture this information was absolutely key – perhaps the perfect illustration of the merger between what it means to be a data expert and process expert at Marelli.” 

  • A model showing how the team at Marelli measures object flatness.

  • The resulting merged table shows both process input and output data.

  • Classical data distributions provided inadequate insight into how the flatness issue might be resolved.

  • The team starts from a variability chart for a single part by its location.

  • It is evident that all parts have similar location flatness distribution.

  • A contour map in Graph Builder enables the team to filter for a single part.

Beyond facilitating problem-solving collaborations, a relatively simple data visualization, Cianchini contends, can make the difference between open questions and a solution. “When our starting point was a classical data distribution, the team saw only the degree to which the flatness had spread, making it impossible to resolve with limited evidence,” he says. But by experimenting with variability and attribute gauge charts, Graph Builder contour maps and heat maps, the team was able to paint a much more nuanced picture of the problem at hand.

It was immediately evident in the visualization that some of the dedicated part locations had very different flatness ratings. “The question then became: to what extent was this flatness difference the same for all parts?” he adds.

‘From unspecified problem to a detailed description, just like that’

Continued data exploration provided further insight. With just a few clicks – and in a handful of seconds – a follow-up visualization provided more information, enabling the team to see that flat areas were consistent in degree and location across all parts. “This contour map was very useful to our technical experts who have a deep understanding of the product and process,” Cianchini says. “And it didn’t even require any formulas or coding.”

With this single analytics workflow, the team quickly learned two key pieces of information: one, that all the parts had similar flatness distribution, and two, that there were specific regions – the left-bottom and center-top – where flatness values were consistently higher. Equipped with this information, Cianchini’s team could then precisely identify where the machine was introducing variation. “Just like that,” he says, “we moved from unspecified problem to a detailed description and ultimately, the solution to our problem.”
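Reading those two findings off a contour map amounts to aggregating flatness by measurement location across parts. A minimal sketch of that aggregation in pandas, using hypothetical location names and readings chosen only to illustrate the pattern:

```python
import pandas as pd

# Hypothetical flatness readings: three parts measured at the same locations.
df = pd.DataFrame({
    "part_id":  [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "location": ["left-bottom", "center", "center-top"] * 3,
    "flatness": [0.25, 0.10, 0.24,
                 0.27, 0.11, 0.23,
                 0.26, 0.09, 0.25],
})

# Mean flatness per location across all parts: a numeric stand-in for the
# contour map, exposing regions where flatness is consistently higher.
by_location = df.groupby("location")["flatness"].mean().sort_values(ascending=False)
print(by_location)
```

If the per-part rankings agree (as they do here), the variation is positional and points back to the machine rather than to individual parts, which is the conclusion the team drew.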

After a series of analyses, the team arrived at conclusions that were instrumental in identifying how the process could be improved. By tuning two specific parameters and introducing newly redesigned elements to their tool, Cianchini’s team comprehensively improved uniformity and reduced flatness-related defects. JMP, he says, “offers us a very powerful suite of tools for this kind of investigation, both from a statistical point of view and from a visualization point of view.”

“It’s an excellent example of the importance of analytics culture – and why it is imperative to be curious about your data,” Cianchini concludes. “Ask questions without constraints and be willing to play with the click. It’s the only way you can be sure to give the correct answer to the correct customer in the correct way. Curiosity means being able to continuously innovate and continuously improve.” 

  • A dynamic scatter plot provides ways of extracting information visually.

  • For each part produced, the team records the value of process inputs (1-2-3-4) and the relative output flatness. This information is used to determine whether there is any correlation between the input and output, and whether there are inputs of specific relevance to the change in output. One graphic contains a wealth of information, augmented by customizable colors and other visualization features: input and output distributions (matrix diagonal, top to bottom); correlation between the variables shown graphically (lower-left matrix); and correlation between the variables with a statistical index and color scale (top-right matrix).

  • A visual analysis with a color map merged with correlation values quickly provides a number of conclusions, including that the output (Flatness) and Input 1 have a strong positive correlation; the output (Flatness) and Input 4 have a strong negative correlation; Inputs 2 and 3 are not significant for the change in output; and Inputs 1 and 4 have a strong negative correlation and would therefore not be easy to change independently.
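The conclusions read off the color map correspond to a plain Pearson correlation matrix. A sketch with pandas, using hypothetical values deliberately constructed (not Marelli’s data) so the correlations mirror the findings above:

```python
import pandas as pd

# Hypothetical values built so that: flatness rises with Input 1, falls with
# Input 4, Inputs 1 and 4 move in opposite directions, and Inputs 2 and 3
# are essentially noise.
df = pd.DataFrame({
    "input_1":  [1.0, 2.0, 3.0, 4.0, 5.0],
    "input_2":  [3.0, 1.0, 4.0, 1.0, 5.0],
    "input_3":  [2.0, 7.0, 1.0, 8.0, 2.0],
    "input_4":  [5.0, 4.0, 3.0, 2.0, 1.0],
    "flatness": [1.1, 2.0, 2.9, 4.1, 5.0],
})

# Pairwise Pearson correlations: the numbers behind the color map.
corr = df.corr()
print(corr["flatness"].round(2))
```

The strong negative correlation between Inputs 1 and 4 is the practical caveat: two knobs that move together (in opposite directions) cannot be tuned independently, which shapes which parameters the team can actually adjust.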

  • Variability gauge charts enabled the team to draw several important conclusions. As a result of the new configuration and settings, they observed less flatness and better uniformity across the board. 


It’s a key piece of advice he says can transform an organization’s culture: “just play with the click.”

The results illustrated in this article are specific to the particular situations, business models, data input and computing environments described herein. Each SAS customer’s experience is unique, based on business and technical variables, and all statements must be considered nontypical. Actual savings, results and performance characteristics will vary depending on individual customer configurations and conditions. SAS does not guarantee or represent that every customer will achieve similar results. The only warranties for SAS products and services are those that are set forth in the express warranty statements in the written agreement for such products and services. Nothing herein should be construed as constituting an additional warranty. Customers have shared their successes with SAS as part of an agreed-upon contractual exchange or project success summarization following a successful implementation of SAS software.