Data is used to predict the outcomes of trades, thereby stabilizing investors' risk. The robo-advisor, as a technology of anticipation, can be understood as a way of stabilizing certain profitable futures.
I use A. Mackenzie's notion of "predictivity" as a practice of prediction that takes into account the material, data, and techno-infrastructures which situate that practice within regimes of economic anticipation.
Data is stabilized and contested by the various forms of labour required to make it usable. These forms of labour are often precarious; at other times they depend on particular forms of expertise.
Most of the data these systems use to build a model of the financial markets is quantitative, such as historical market data. These systems treat the markets as an empirical field that is not governed by more "classical" economic theories.
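A minimal sketch of what such a purely empirical model can look like; the synthetic price series, the five-lag setup, and the use of scikit-learn's LinearRegression are my own illustrative assumptions, not any firm's actual method:

```python
# Minimal sketch: a data-driven "model of the market" fitted purely to
# quantitative historical data (here synthetic), with no economic theory involved.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 500)))  # stand-in for historical prices
returns = np.diff(np.log(prices))                           # daily log returns

# Predict the next day's return from the previous five returns.
n_lags = 5
X = np.column_stack([returns[i:len(returns) - n_lags + i] for i in range(n_lags)])
y = returns[n_lags:]

model = LinearRegression().fit(X, y)
next_return = model.predict(returns[-n_lags:].reshape(1, -1))[0]
print(f"predicted next-day return: {next_return:+.5f}")
```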
Techno:
Micro:
Macro:
Definitions:
During a workshop on AI in the financial markets, definitions of AI were discussed. Definitions are important because they relate to the governance of the financial sector through organisations such as BaFin.
Certain forms of automated trading are subject to different legal requirements, etc.
mobilization of definitions?
Definitions are mobilized by policy actors such as BaFin or the European Union. As a crucial part of regulatory practices, definitions are usually highly contested by industry interest groups and other non-governmental organisations.
Performance metrics/targets:
Performance metrics are generated by the fintech firms in order to rate their algorithms. These metrics are often calculated from hypothetical trades in the past ("backtesting"), based on historical market data.
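The sketch below illustrates how such metrics can be computed from hypothetical past trades on historical data; the toy moving-average strategy, the synthetic prices, and the choice of metrics (cumulative return, Sharpe ratio, maximum drawdown) are assumptions for illustration, not a reconstruction of any firm's reporting:

```python
# Illustrative backtest: hypothetical trades on (synthetic) historical prices,
# summarised by the kind of performance metrics fintech firms publish.
import numpy as np

rng = np.random.default_rng(1)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 1000)))  # synthetic price history

# Toy strategy: hold the asset whenever the price closed above its 20-day moving average.
window = 20
ma = np.convolve(prices, np.ones(window) / window, mode="valid")
position = (prices[window - 1:-1] > ma[:-1]).astype(float)          # signal known at the close
strategy_returns = position * np.diff(np.log(prices[window - 1:]))  # applied to the next day's return

# Metrics computed from these hypothetical trades.
cumulative_return = np.exp(strategy_returns.sum()) - 1
sharpe_ratio = np.sqrt(252) * strategy_returns.mean() / strategy_returns.std()
equity = np.exp(np.cumsum(strategy_returns))
max_drawdown = np.max(1 - equity / np.maximum.accumulate(equity))

print(f"cumulative return: {cumulative_return:8.2%}")
print(f"annualized Sharpe: {sharpe_ratio:8.2f}")
print(f"max drawdown:      {max_drawdown:8.2%}")
```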
mobilizations of performance metrics?
Performance metrics are part of the operational process of algorithmic trading systems. Although these systems trade fully autonomously, there is constant monitoring and intervention.
Datasets:
A variety of datasets are used to train the algorithmic trading systems. Among the most common methods are forms of supervised machine learning. To make data useful for these methods, it has to be cleaned and prepared with certain "tags" (labels), which requires manual labour.
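An illustrative sketch of the preparation this implies: the records, field names, and "buy"/"sell" tags below are invented, but the cleaning and labelling they require is the kind of manual work at stake here:

```python
# Illustrative sketch: supervised learning only works once raw records have been
# cleaned and given a manual "tag" (label); untagged or unusable records are dropped.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented raw records, as they might arrive from a data provider.
raw_records = [
    {"return_1d": " 0.012", "volume_change": "0.30", "tag": "buy"},
    {"return_1d": "-0.008", "volume_change": "0.10", "tag": "sell"},
    {"return_1d": "0.004",  "volume_change": "n/a",  "tag": "buy"},   # unusable field, needs cleaning
    {"return_1d": "0.001",  "volume_change": "0.05", "tag": None},    # still waiting for a manual tag
    {"return_1d": "-0.015", "volume_change": "0.50", "tag": "sell"},
]

def usable(record):
    """Keep only records with numeric fields and a human-assigned tag."""
    try:
        float(record["return_1d"]), float(record["volume_change"])
    except (TypeError, ValueError):
        return False
    return record["tag"] in ("buy", "sell")

clean = [r for r in raw_records if usable(r)]
X = np.array([[float(r["return_1d"]), float(r["volume_change"])] for r in clean])
y = np.array([1 if r["tag"] == "buy" else 0 for r in clean])  # the "tags" become training targets

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.01, 0.2]]))  # label predicted for a new, untagged record
```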
mobilizations of datasets?
Datasets are often provided by third parties and form part of the infrastructure needed for algorithmic trading and other AI applications to work. The labour this requires is often low-paid and externalized under precarious conditions.