Social media and digital technology now allow researchers to collect vast amounts of varied data quickly. This so-called “big data,” and the practices that surround its collection, are all the rage in both the media and research circles. What makes data “big” is commonly described by four v’s: volume, velocity, variety, and veracity. Volume refers to the massive scale of the data that can be collected; velocity, to the speed at which streaming data can be analyzed. Variety refers to the different forms of data available, while veracity considers the bias and noise in the data. Although many would like to focus on these details, two other v’s, validity and volatility, also hold significance for big data. Validity considers the level of uncertainty in the data, asking whether it is accurate for the intended use. Volatility refers to how long the data can be stored and remain valid.
In her new book, Big Data, Little Data, No Data: Scholarship in the Networked World (MIT Press, 2015), Professor Christine L. Borgman, Presidential Chair in Information Studies at the University of California, Los Angeles, examines the infatuation with big data and its implications for scholarship. Borgman asserts that although the collection of massive amounts of data is alluring, it is best to have the correct data for the kind of research being conducted. Further, scholars must now consider the economic, technical, and policy issues related to data collection, storage, and sharing. In examining these issues, Borgman details data collection, use, storage, and sharing practices across disciplines, and analyzes what data means for different scholarly traditions.
By The MIT Press