Digital Market Manipulation
CALO R., Digital Market Manipulation, The George Washington Law Review, V. 82, i. 4, August 2014
|Topics| Business Model, Consumer, Transparency |
Behavioral economics – once it incorporates the opportunities that the digital revolution gives firms to influence consumers at the personal level – will enable us to understand when advertising personalization creates actual consumer vulnerability that deserves legal intervention.
--> Could data ownership also resolve digital market manipulation? It seems not. Calo notes, at p. 1014, that “[A]nother study shows, ironically, that giving consumers more apparent control over how their information is used will lead to more promiscuous disclosure behaviour (just as seat belts have been alleged to lead to more aggressive driving)”. This is an argument against data markets.
Libertarian paternalism or “nudging”: “exploiting irrational human tendencies in order to nudge citizens toward better outcomes” (p. 1001). --> We could use nudging (techno-social engineering, in Frischmann's terms) to foster circularity. While some people would see this as positive, those with a negative opinion of nudging would regard nudging-for-circularity as a negative use of (even anonymised) IoT data collected for circularity purposes: the IoT data would in this case be used to identify the irrational human tendencies which, when exploited, push people toward more sustainable behaviors; and the IoT products themselves would be designed to steer people toward such behaviors (think of a sort of circularity-by-default or circularity-by-design).
“Coasean filter”: technological mediation between consumer and market that “could screen out negative content in favor of relevant and helpful commercial messages” (p. 1005). --> This mediation could help filter in only those products which are circularity-friendly. Third parties could, for example, filter IoT products according to their level of sustainability; to perform this filtering, they could exploit the data collected by the products themselves, which requires third-party accessibility of IoT data. This would foster transparency, but would nonetheless be a form of nudging.
With the IoT, many techniques that were used in digital markets have filtered into the offline world (p. 1016). “Morphing”: a technique that dynamically alters the layout of, for example, an advertisement according to the cognitive style of the specific subject who sees it (p. 1017). This can be positive when it aims to ensure widespread accessibility of the advertisement, website, or other message. However, it can be negative when it tries to nudge people toward a specific choice, especially a purchasing one (since the success of morphing is measured not by the widespread accessibility of the message, but by the likelihood of a sale). --> We could use morphing to foster circularity.
“[D]igital market manipulation combines, for the first time, a certain kind of personalization with the intense systematization made possible by mediated consumption” (p. 1021). With big data analytics, we have “the systemization of the personal”.
Ryan Calo asserts that not all forms of technological mediation between consumer and market are harmful to consumers. “[R]egulators and courts should only intervene where it is clear that the incentives of firms and consumers are not aligned” (pp. 1022-1023). --> That would not be the case with regard to the use of IoT data for purposes of circularity, even when such data collection and use results in a sort of nudging of consumers' behavior towards circularity, or when providers withhold exclusive consumer control over the data gathered by the IoT product because the providers themselves use those data to implement circular behaviors. On the contrary, intervention would be warranted when IoT data collected for purposes of circularity are used for different aims (e.g., price discrimination or the like), or when the collected data are not publicly shared, so as to prevent third parties from entering the market and offering competing and/or additional circularity services to consumers.
Pp. 1028-1029: “[P]rivacy harm is comprised of two distinct but interrelated categories. The first category is subjective in that it is internal to the person experiencing the harm. In this context, subjective privacy harm is the perception of unwanted observation […]. The second element is objective in the sense of involving external forces being brought to bear against a person or group because of information about them. Thus, this category is the unanticipated or coerced use of personal information in a way that disadvantages the individual”. --> If properly designed and shaped, the use of consumers' data for purposes of circularity might not objectively harm them (e.g., checks aimed at sanctioning failures to recycle should be based not on the data collected – even when anonymised and used to find patterns – but on random selection); the subjective harm, however, could remain all the same.