Streamlining Workflows to Optimize Processes
Thomas Zelikman, Senior Consultant NNE Applied Manufacturing Group
The data lifecycle is a multistep process beginning with the acquisition and storage of data and ending with data disposal. With the ever-increasing size of today’s data sets, however, workflows can easily become messy or overly complex. To separate actionable information from the noise more quickly, analysts need better workflows and the right combination of analytic tools to streamline the process.
Flexibility, integration, standardization, and streamlining of statistical methods within the data lifecycle benefit the end user by optimizing work within the data management system and by making the analytic process easier and more repeatable.
In this video, you will learn to:
- Use data visualization to make data cleaning more intuitive.
- Combine desktop analytics software with open-source tools such as Python and R to build a high-functioning interface.
- Leverage custom scripts to save and re-execute graphical analyses.
- Package saved scripts into point-and-click add-ins that can be deployed as bespoke analytical routines to novice users.
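As a taste of the first objective, the sketch below shows how a quick visualization can make data cleaning more intuitive: plot the raw values, spot the obvious bad record, then flag and drop it programmatically. Everything here is illustrative — the sample data, the `yield_pct` column, and the 10-point deviation threshold are assumptions, not part of any specific product's workflow:

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Small illustrative dataset with one obviously bad record (batch 4)
df = pd.DataFrame({
    "batch": range(1, 9),
    "yield_pct": [92.1, 93.4, 91.8, 12.5, 92.7, 93.0, 91.5, 92.9],
})

# Visualize first: a simple scatter plot makes the outlier easy to spot
df.plot.scatter(x="batch", y="yield_pct")
plt.savefig("yield_check.png")

# Then clean: flag rows far from the median before any downstream analysis
median = df["yield_pct"].median()
outliers = df[(df["yield_pct"] - median).abs() > 10]  # threshold is illustrative
clean = df.drop(outliers.index)

print(f"dropped {len(outliers)} outlier(s), {len(clean)} rows remain")
```

The visual check and the programmatic filter reinforce each other: the plot tells you a rule is needed, and the saved script makes that rule repeatable on the next data set.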