2 Toward an Epistemic Continuity Anchored in the Cultural Sciences

So far we have established that digital data carry a potential epistemic value that remains to be confirmed, and that, by virtue of their technical format and quantitative character, they can also be computationally manipulated and processed. Because of the origin and semantic content of the data, derived as they are from traces of human (or presumably human) activity, the theoretical and methodological framework within which these computational processes are carried out is rightly that of the evidential paradigm, historically favored by the cultural sciences but applicable to any data conceived as a linguistic trace. For these sciences, data play the role of an observable that, in order to generate new knowledge, must be subjected to analysis in the form of computational processing. In this de jure configuration, the theory of the data and of their processing is that of the discipline governing the experiment as a whole: the observables are understood in terms of disciplinary norms that allow them to be placed within a theoretical framework. This framework also provides, or is compatible with, a theory of the instrument, so that the same norms govern observation, analysis, and the production of scientific results. Our epistemological paradigm will likewise have to ensure this epistemic continuity, while respecting two additional constraints: it must be based on the exploitation of digital traces (as opposed to other observables) and consider them as representations ...
