Today, our sources of data are growing ever larger, a trend that is driving interest in a new class of data analytics methods. Depending on your starting point, these methods are variously described as big data, artificial intelligence, data science, machine learning, data mining, predictive modelling and Industry 4.0. They have become synonymous with the expectation of delivering superior insights from data to drive better decisions for “free” and with little human effort.
There is, however, a growing body of literature suggesting that the benefits of these methods are not as easily captured as proponents may claim. In her best-seller Weapons of Math Destruction, author Cathy O’Neil notes that the combination of large data sets and sophisticated analytics has frequently produced unforeseen negative consequences. While newer methods are indeed powerful – and of course more data is better than less – they are not infallible.
Instead, O’Neil argues, modern data analysis methods require due attention to the circumstances and purpose for which the data was originally collected. The root causes of many big data analytics mishaps relate to the omission of the fundamentals of data analysis that have been known for decades, such as paying close attention to data quality.
In this recorded seminar, the speakers will use industry case studies to discuss potential problems in exploiting newer methods of learning from bigger data. They will demonstrate the need to integrate an understanding of data limitations with methods that address those limitations, in combination with subject matter expertise. And they will explore solutions to help practitioners avoid “red herrings,” wasted time and money, and missed opportunities as they chase the wrong answer.