Euromesh, Dataculture, and Private Property

A fictional public talk commissioned for an event about "tricking the digital"
Date
18 Nov 2019
Made For
Stereolux
Commissioned By
Stereolux
In Collaboration With
Design Friction
Role
Concept, Design of Fictional Products and Services, Worldbuilding, and Performative Public Speaking
For The Event
Duper le Numérique : Brouiller, Embrouiller, Se Débrouiller
At
Stereolux
Duration
OCT 2019 — DEC 2019
Location
Nantes, FR

On December 18, 2019, we were invited to do a public intervention at Stereolux on the theme of "tricking the digital." Under fictional identities, we attempted to give shape to a possible future for data work and data citizenship (which are quite intertwined here). It all took the form of case studies, supplied with speculative artifacts and services granting the presentation some degree of plausibility, while attempting (albeit in a friendly way) to lure the audience into accepting the fiction as fact — even the Q&A was conducted in diegesis, before it was all revealed.

This is a transcript of our talk, illustrated with the artifacts that gave life to our worldbuilding, and slightly edited for readability.

Nantes, Dec 18 2019.

“How many of you would agree to hand your personal data to a company, in return for nothing at all?” Amused silence. “How many are registered to a loyalty program of any kind — Airmiles, supermarket membership, Amazon Prime…?” Awkwardly, as if shrouded in sudden guilt, all hands rise. “Why subscribe to the very thing you so categorically reject?” we ask. “Out of convenience? For discounts? These benefits,” we argue, “are pure fiction — a self-fulfilling prophecy fueled by our collective compliance.”

Towards the Ethical Monetization of Data

The conference is held by Korantin Spiegel and Azhar Abassi, two fictional personas respectively assuming the roles of expert in digital markets and project lead of EXCEED — the new, equally fictional, European Executive Commission for the Ethical Exchange of Data. Its title: “Euromesh, Dataculture & Private Property — towards the Ethical Monetisation of Data.” Coordinated by Design Friction, and taking place at Stereolux, Nantes, it concludes a day of talks and workshops entitled “Tricking the Digital,” which, as one could expect, led to passionate debate on the topics of privacy, protection, and the politics of data.
	Throughout the day, and even more so at the start of this lecture, we questioned the reasons behind our efforts to protect our privacy — from whom, from what? A dictatorial government? Evil corporations? Do we fear malevolent, big-brothery surveillance, or simply negligence in the absence of an appropriate legislative framework — as seen with jogging app Strava which, during a marketing coup, revealed the running paths of users around the world and unwittingly exposed the location of secret military bases? Are we trying to escape the wrath of hackers, or do we revere them for their humor and revolutionary actions? The protection of privacy, we say, is a fundamental right; it cannot, however, be the sole driving force of progress, and we must avoid sinking into a form of digital conservatism. Instead, digital activists must use their knowledge and skills to offer realistic political alternatives.

Data for Social Good

With data steadily on its way to becoming digital gold, now seems like a critical time to assess its current implications and future trajectory. Could we, for instance, veer away from capitalist inclinations and reroute the data economy toward a new social model? And, while at it, put a cherry on the cake with new rights for digital citizens?
	As hands go down, we move on to a less trivial thought experiment. Who here would give their medical data, every kind, down to DNA sequencing, if it could save their own life? Hands are back up. The life of a loved one? No one budges. The life of someone you don’t know? Hesitation. Who would give their medical data to advance medical research? The audience is divided. A 2016 article published in Wired and titled with a quote from molecular and computational biology specialist Eric Schadt claims that “The Cure for Cancer is Data — Mountains of Data.” The text goes on to describe how sequencing DNA from 10 million people would help build datasets complete enough for AI to solve the issue for good. Although technocentric, such a prospect sparks hopes of lower mortality and reinvigorated economies — in 2015 alone, cancer had a worldwide economic impact of 895 billion US dollars. The spark is, however, not without concerns about eugenics and further discrimination, especially when speculating on new avenues for inequality such as genetic predispositions, or access to health insurance.
	Change of slide: “Data and Environment,” or how the harvesting of ecological and consumption-related data could make systems more adaptable and reactive, so that only resources that are truly needed would be produced — the best waste is the waste that isn’t produced, after all. We show GAIAI, part of a project we ran with the Potsdam Institute for Advanced Sustainability Studies, as a presumably successful example of ‘algorithmic environmental governance,’ and move on. “Data vs Culture:” can data about how we consume culture, and entertainment in particular, truly inform the production of tailored content, as opposed to plain viewing metrics, which depend more on marketing budgets than on any true measure of content quality? How customizable can this become? We leave the question up in the air. “Data vs Government.” Here, we discuss personal behavior as a political act and, with the example of the political cryptoparty developed within the bounds of a design fiction workshop in Mexico City, we look at how the machine and its statistical governance could be a better ruler than the usual, corrupt politicians. Can it really?
After this warm-up, we finally unveil our case for a new data economy, portraying it as a realistic approach to a true universal basic income — a concept already shifting from the hand of the interventionist state to that of the free market, with the idea of “return on investment” replacing that of public good. Somewhere at the convergence of these two worlds lies the possibility of a monetary compensation based on the ethical exchange of data, overseen by a strong intergovernmental body. Three main challenges must be addressed in order to achieve this organization. They are:

1

Access to truly free and open data, pushing digital citizens to become informed actors of the system rather than mere users.

2

A secure, inclusive, and widespread standard for datasets, striving for fast and easy interoperability. And

3

High-quality data that is readable, useful, contextualized, labelled, complete, and has received full and unequivocal consent from its source, the citizen.
“Good news, everyone,” we interject: this kind of legislation on standards is exactly what the European Union does best. And so we claim to have come up with solutions to these exact needs, and to have put together a framework encompassing both their technical and legal aspects.
The result of this speculative EU-based initiative culminates in a new status, that of the ‘European Digital Citizen,’ the cornerstone of a new and powerful model aiming to act as an alternative both to the American ultraliberal, market-driven data economy and to the authoritarian social scoring system made in China. It is designed to function, once fully implemented, as a zero-sum economy generating wealth through a system complex in depth, simple on the surface. To clarify, we pull up the example of a Rotterdam nightclub experiment built around an infrastructure harnessing electricity from the motion of dancers to power its lights.

2017—2019: The Pilot Project

The show goes on with more of the nitty-gritty on how, for nearly two years, a pilot project set in Eindhoven (NL), in collaboration with the Technical University, has been offering some 7,500 residents of the city and its surroundings the possibility to take part in an early implementation. The experiment offers various benefits based on three basic levels of involvement. ‘Basic’ users, who give the minimum consent for taking part (DNA sequencing and collection of biometric, visual, geolocation, interaction, and consumption data), receive compensation from local partners: food baskets fulfilling basic needs, free local transportation, a standard energy package (200 kWh/m²/year/person), privileged access to housing, and health insurance covering all medical needs as well as emergencies and paediatrics. Such digital citizens wear markers showing their affiliation with the EU-backed system. These facial wearables are in turn seen by infrared cameras and let supervision infrastructures quickly identify member-citizens while assessing their individual data collection permissions, turning a blind eye to non-members, whose personal information cannot be collected. In addition, members are given a new “digital European citizen’s passport.”
	‘Passive’ users go a step further by enabling additional data detection (environmental, psychological, social) without extra effort on their part, whereas ‘Actives,’ colloquially known as ‘Harvesters,’ go out of their daily routine to accomplish special missions for collecting, checking, or labelling data, becoming de facto data workers in their own right.
	Precise and efficient data collection calls for precise and efficient devices. To this end, EXCEED’s advanced data arsenal has been entirely conceived by designer Roland-Geert Van Den Boerderij, aka ‘RGB,’ an illustrious (fictional) Dutch designer known for his socially-engaged work in fashion at large and his “humanistic” stylistics, ensuring such new everyday objects would also bring joy to their users. The more time is spent working on data, the more benefits can be received, spanning telecommunications, maintenance, leisure, culture, or even luxury services and items.
Aside from markers, all citizens invested in active data collection are required to make their presence loud and clear to the public. With this in mind, the other tools developed by studio RGB comprise mobile, human-operated infrared cameras (the system forbids the use of fully autonomous capture devices, and incentivizes a human-based approach to data collection), sports wearables, gaze-tracking head mounts, galvanic response and pulse oximetry wristbands, a DNA sampling kit, biopills enabling citizens to collect and communicate inner-body biometrics, and more.
	This system, although complex in its management, is accessible through a simple front-end offering citizens the possibility to progressively develop specializations through a skill tree, unlock access to new devices and missions, and follow the system’s needs through a dedicated mission board. This dashboard provides them with a full overview of their capture devices and collected data, as well as complete and instant control over their various consent channels.

The Biggest Lie

Ignore it, tick it, forget it: the biggest lie on the internet, we argue, fits in a single sentence — and we are all guilty. “I have read and agree to the terms and conditions.” On the next slide, we demonstrate how our system is designed so that consenting to data collection becomes an explicit and transparent act, and how it defies the status quo by simply moving away from “not no = yes” to “not yes = no.” As trivial as this distinction may appear to most, it is the very basis for any data-driven model that is respectful of its citizens, and that seeks their trust. And beyond data collection itself, consent must be granted for data processing as well as crossbreeding — we illustrate with three examples of what we call ‘composite consent.’ In the first example, the merging of two different types of data may be needed for a specific use: consent must be made explicit for each data type. In the second, a single data type can be used for different applications: user consent is required for each of them. In the third, two previously authorized applications may be merged to generate additional benefits: consent must be granted explicitly for their crossbreeding. An additional example shows a route-planning interface offering citizens the option to pick a ‘data-reroute’ that replaces the shortest proposed path with a longer, often more convoluted one, optimized for the gathering of important environmental data. If picked, the longer route is registered as ‘data work,’ and adds to the citizen’s benefits.
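Purely as an illustrative sketch (the system itself being fictional), the composite-consent rules described above can be modeled as a default-deny registry, where the absence of an explicit "yes" counts as a "no." All class and data-type names here are invented for the example:

```python
class ConsentRegistry:
    """Default-deny consent registry: "not yes = no"."""

    def __init__(self):
        # Explicit grants, one per (data_type, application) pair.
        self.granted = set()
        # Explicit grants for merging two already-authorized applications.
        self.crossbreeds = set()

    def grant(self, data_type, application):
        self.granted.add((data_type, application))

    def grant_crossbreed(self, app_a, app_b):
        self.crossbreeds.add(frozenset((app_a, app_b)))

    def allows(self, data_types, application):
        # First and second examples: every data type needs its own grant,
        # and each application needs its own grant per data type.
        return all((dt, application) in self.granted for dt in data_types)

    def allows_crossbreed(self, app_a, app_b):
        # Third example: both applications must already be authorized,
        # AND their merge must itself have been explicitly granted.
        apps = {app for _, app in self.granted}
        return (app_a in apps and app_b in apps
                and frozenset((app_a, app_b)) in self.crossbreeds)


registry = ConsentRegistry()
registry.grant("heart_rate", "health_research")
registry.grant("geolocation", "health_research")
registry.grant("geolocation", "mobility_planning")
registry.grant_crossbreed("health_research", "mobility_planning")

# Merging two data types for one use: each type was consented separately.
assert registry.allows({"heart_rate", "geolocation"}, "health_research")
# No explicit "yes" means "no" by default.
assert not registry.allows({"heart_rate"}, "advertising")
# Two authorized applications may only be merged with a dedicated grant.
assert registry.allows_crossbreed("health_research", "mobility_planning")
assert not registry.allows_crossbreed("health_research", "advertising")
```

Note how the crossbreeding check requires both that each application was individually authorized and that their merge was itself explicitly granted, mirroring the third example above.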

Case studies

The following slides introduce five fictional citizens, accompanied by their digital portraits and diverse diegetic objects.

Park Bong-Cha

Park Bong-Cha, a Korean exchange student at the Philips High-Tech Campus, is our first case study. She passively records her social interactions through a clip-on module for glasses, which captures emotional response and engagement in both herself and her interlocutors — provided they are digital citizens as well — and her sleep patterns at night through her ‘Oniri,’ an EEG- and REM-tracking hat that lets her review and transmit her data in the morning. She also actively participates in data-sanitizing missions, taking on tasks such as semantic labelling, cross-referencing, duplicate removal, and the correction of false positives through a dedicated application.

Fatima Van Houten

Fatima Van Houten works as an on-demand driver for companies and individuals alike. Extremely concerned about environmental issues, she bought an all-electric vehicle years ago and, upon signing up for the European Digital Citizen program, equipped it with P4 particle sensors. These, coupled with a 360° video feed and location tracking, put her in an ideal position to create a highly accurate and up-to-date map of atmospheric pollutants, traffic conditions, and road quality. In order to know the quality of the air she breathes, she also wears a neck-ruff version of the P4 sensor. As part of the citizens-ride-free program, she has installed back-facing cameras in her car’s headrests, displaying real-time insights on her passengers’ emotional states and allowing her to adapt her driving and conversation accordingly. Finally, she keeps her valuables in a connected document-holder, which sends out her live geolocation while keeping her belongings secure.

Lieselotte Weij

Lieselotte Weij is a competitive sportswoman. She trains every morning and is highly interested in the way her body behaves during this routine. She has equipped herself with a biometric hypodermic implant no bigger than a two-euro coin, which simply requires the painless insertion of a small pin to monitor lactic acid levels in her muscles. She has also recently become active as a data counselor, intervening on suspected errors in data collection by arranging meetings with other digital citizens to clarify protocols, troubleshoot issues, and identify possible misuses — intentional or not. In doing so, she acts as a human interface between EXCEED and its users, and relays concerns as well as suggestions. One example of such a situation: Lieselotte once received notice of abnormal biometrics and, upon investigating, discovered that a lady had her cat wear sensors intended for her, supposedly out of “sheer curiosity.” Once the possibility of fraud was ruled unlikely, this event prompted a side-debate on the potential of data generated by animals, and on whether it posed the risk that some might breed animals for the sole purpose of cashing in on such docile data workers.

Henk Schenkelaars

Our fourth case study is Henk Schenkelaars, a retired widower living alone and in relative autonomy thanks to an ‘Elmer.’ Loaded with visual, infrared, audio, and pressure sensors, the autonomous vacuum cleaner doubles as a connected, voice-controlled assistant aiding with domestic tasks such as calling contacts, ordering supplies, or leading the way to the bathroom at night. It is in constant communication with Henk’s vitals-tracking bracelet, which can detect any health issue and immediately call for help. In such cases, it is also able to remotely disable the door’s smart lock and sound an alarm that will signal the emergency to Henk’s neighbors. The consistent data collection makes the Elmer not only an invaluable service, but also a free one.

Leendert-Jan Paap

Leendert-Jan Paap is an active citizen — one could say a data professional. Unemployed at the time of opting in, he sees in his new status an opportunity to generate revenue. Although not in the form of currency, this helps him live comfortably, improving much upon his former struggle to make ends meet. At the start of the day, Leendert-Jan reviews the system’s needs on the mission board and claims assignments for which he has the required skills, equipment, and time. Mostly, he picks those sending him out to collect visual data through an array of manned cameras on tripods or hanging from RGB’s signature dronebrellas. Moving around town, he observes citizens and the environment, checks on infrastructure, and deploys his toolkit to provide valuable data about urban life. With time, he has found himself becoming somewhat of an ambassador for the project — answering the questions of passersby, explaining the system and its core values. With such a considerable chunk of his day spent in public space, Leendert-Jan has already witnessed a few incidents, and no longer goes out without his first-aid kit.

About Privacy, Still

This model, although rooted more in mutualism than protectionism, isn’t blind to the issues of privacy and individual rights related to personal data. On the contrary, these are integrated at the very core of the system, with a personal trademark being provided for each citizen, protecting them in almost the same way corporate trademarks protect businesses: by giving them complete control over their image, its visibility, and its uses, as well as insurance against any unlawful appropriation of it. Private data is further protected by routing all transmissions through the Euromesh, a distributed ledger infrastructure akin to the ‘Hashmesh’ developed by the team of Rajesh Laghari at IBRE — an alternative to the blockchain rid of its iconic ‘proof-of-work’ (PoW) system, instead confirming transactions through principles of volumetric consensus and experience variability, thus lowering its energy impact while increasing both resilience to attacks and transaction speed. These two aspects (personal trademark and distributed ledger) are coupled with a complete ban on advertisement and rigorous principles of partner certification, which continuously enforce the duties and responsibilities of select third parties from the private sector who take part in this economy. The protection of digital citizen data is ensured by a strict policy of EU-partner exclusiveness. The EU has an unrestricted right to crawl and probe private databases at all times, and if digital citizen data — be it facial snapshots from CCTV cameras, biometric data, or basic details such as a phone number — appears in non-certified datasets, enormous fines are at stake for the careless collector, as well as the impossibility of ever applying for certification.

Welcome to The Dark Side, or: the Limits of Consent

Before wrapping things up, we bring to the forefront equally speculative imperfections to taint our polished pitch — a covert way of getting the audience riled up, ready to let loose their vitriol during the imminent in-diegesis Q&A.
	Of course, the ethical implications of such a system are numerous, and it would be arrogant on our part, as representatives of an EU initiative, to claim to have thought everything through — not a day goes by without a new challenge. We go on with a bleak example of a member who recently passed away and, as a dedicated biometrics harvester, left us with a load of data pertaining to his unfortunate passing as well as the few hours after it, all punctuated by a big question mark on the limits of consent. Can this invaluable data be used for medical research? Should it be deleted immediately? Should we implement a system to stop data transmission at the moment of death, or should we rather include an option allowing citizens to become posthumous data donors, the same way one can be an organ donor nowadays? Are there benefits for their heirs? In fact, questions such as these are not for us as system architects to answer. Rather, and by design, they are for the collective to raise and debate, and this is where the last brick of our constructed system lies: collaborative politics. As it stands, whenever a new issue regarding the system is raised, it is posted to the digital citizens’ dashboard to benefit from their scrutiny, collect their opinions, and foster healthy, critical thinking — a bit like a debate platform, or the back-end of Wikipedia. The consent-granting process — when citizens allow or deny third-party access to their data — is also monitored in this way. These are open tasks for whoever is part of the system and willing to peek. But then, should questions regarding the system at its core be initiated and moderated by people at all? Or can we — and should we — steer towards a more automated form of moderation?
One running much less on conscious opinions and considerably more on behavioral patterns, potentially eliminating suspicions of bias and corruption? Can we — and, again, should we — vote without voting?
On these questions, we end the lecture and give the audience time to ask their own. From who develops the tools used to collect data — and the possibility for citizens to take part in their design — to the issue of one’s disconnection through hyper-connection, we discuss the many aspects of our relationship with data and digital modernity. Could this system be made so seamless that it liberates us from our dependence on the connected world? Or would it, on the contrary, engulf us whole while accelerating the digital divide?
	—“When will you give the answer to this travesty?”, one person rightfully asks, having noticed that our presentation failed to respect the graphic standards of real EU-initiated projects. We make up some poor excuse of an excuse in order to keep the fiction alive just a little bit longer, and skip to a slide displaying a link to a webpage for enrolling in the second wave of the social experiment which, we explain, will launch in 2020 in several regions of Europe, including Loire-Atlantique, where we are giving this talk.
	To those who follow the link, a single page reveals the fictional nature of the project, letting a few people in on the secret. We proceed with more of a taunt than a question about the bad publicity our system had presumably gotten after automatic doors failed to open for our registered citizens because of the “scary fines” should their data be captured by the doors’ sensors — something we’d planted in the audience. This is followed by a question about the risk that a large company like Philips might flout the rules and overtly abuse the wealth of personal data acquired through this system, which we assure is rendered impossible by the system’s watermarking and oversight of the data, and by the severe penalties inflicted if such watermarked data were to be found on third-party databases outside of our ecosystem.
	A few more questions and the curtain falls: none of this was true. We explain that, although most of the elements presented are fictional, the issues raised are only too real. Relocating them into an alternative reality, we continue, allows us to address core aspects from a different perspective. Making the hypothetical real for an hour, we took our audience on a journey beyond the usual tropes of privacy and surveillance, navigating together the troubled waters of a project both desirable — in its socially-aware agenda, an attempt at a post-capitalist economy — and bleak — a form of incentivized data communism opening the gate to new forms of privilege without questioning the old.
	This elaborate reductio ad absurdum was designed to keep our audience on the fence, undecided as to which side of the story they would like to embrace, and therefore questioning the very premises of the issue.