A P P A R E L v1.0 — Background and Technicalities

Background information on the speculative fashion project
Date: 5 Feb 2016

After a first experiment labeled Version 0.9a, the A P P A R E L project is back for a full-fledged experience of data-infused AR fashion.

::::::::::::::::
A bit of context
::::::::::::::::

A P P A R E L is, and has been, an ongoing research project in speculative fashion. It is built around the core idea that in a digital age, and a post-industrial economy, clothing wouldn't have to be about the prefabricated identities we buy at H&M, Nike, or Calvin Klein. It would be about us, projecting who we are, how we're feeling, or rather how we want to be seen. In its most speculative form, this idea relies on the continuous adoption of means of overlaying reality with the digital realm — mobile devices, Google goggles, Microsoft's holo-headsets, lens-based displays, retinal projection devices. Just as we elaborated on the question in N O R M A L 0 0 1, manufactured things might well be partly or fully made of pixels at a relatively close point in the future. We've already dematerialized a good chunk of our world, and almost all industrially made things transit through a digital stage at production. Seeing labels on products or the colors of shoes go the way of the encyclopedia is, in fact, hardly any speculation. So there.

This is where A P P A R E L starts — what do we do with this strange, variable, and ever-updatable matter we know to populate screens? And more to the point: how could this affect the way we exhibit ourselves in public? "Let's replicate the things we're already wearing, but then in 3D" — no.

::::::::::::::
Technicalities
::::::::::::::

In this iteration, developed with the agile hands of Julien "V3ga" Gachadoat, A P P A R E L has now taken the form of a unisize and unisex cloak, where basic clothing needs are covered and everything aesthetic has been shifted to the realm of augmented reality. There, you get to connect your Twitter account (Facebook may follow soon), which an algorithm scans for various sorts of semantic data. This data pool is then transmitted to shape engines, called "MODS", that react parametrically and modify the shape of A P P A R E L's 3D polygonal layer according to your word usage. Added to that are MOODS: just as parametric, but instead of slowly following your online activities, these variables are activated on demand to illustrate your… mood, yes. In essence, this is like wearing a personal infographic.
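To make the MOD mechanism a little more concrete, here is a minimal, purely illustrative sketch in Python. None of these names or numbers (word_usage, spike_mod, the amplitude constants) come from the actual A P P A R E L code; they only show how a pool of word-usage data could drive a parametric displacement of a polygonal layer.

    # Hypothetical sketch of a parametric MOD. None of these names exist in the
    # real A P P A R E L code; they only illustrate how word usage could be
    # turned into a displacement of the cloak's polygonal layer.
    import math
    from collections import Counter

    def word_usage(tweets):
        # Crude semantic-data pool: relative frequency of each word in the feed.
        words = [w.lower().strip(".,!?") for t in tweets for w in t.split()]
        counts = Counter(words)
        total = sum(counts.values()) or 1
        return {w: c / total for w, c in counts.items()}

    def spike_mod(vertices, usage, theme_words, max_amplitude=0.2):
        # One imaginary MOD: the more the wearer tweets about the theme words,
        # the further vertices get pushed outward from the body.
        activation = sum(usage.get(w, 0.0) for w in theme_words)
        amplitude = max_amplitude * min(1.0, activation * 20)
        displaced = []
        for x, y, z in vertices:
            length = math.sqrt(x * x + y * y + z * z) or 1.0
            nx, ny, nz = x / length, y / length, z / length  # rough outward direction
            displaced.append((x + nx * amplitude, y + ny * amplitude, z + nz * amplitude))
        return displaced

    # Example: three vertices of the layer, nudged by a "noise/speed" themed MOD.
    tweets = ["Prototyping the new cloak today", "more polygons, more noise, more speed"]
    print(spike_mod([(0.0, 1.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)],
                    word_usage(tweets), {"polygons", "noise", "speed"}))

The real MODS presumably work on far richer semantic features and an actual mesh, but the parametric idea is the same: data in, shape parameters out.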

A P P A R E L is made of two applications. There's of course the mobile one (iOS for now, forgive us Android users), which will allow you to experience the piece in real life if you come to an exhibition, or try it on a test target, or just without AR. And there's a desktop app for development and calibration, which we encourage you to look at should you be interested in building your own MODS for an upcoming version.
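We won't document the actual MOD format of the desktop app here, but if you're wondering what "building your own MODS" could look like, a hypothetical interface might be as small as this (again, every class and method name below is a placeholder, not the real API):

    # Placeholder sketch of what a custom MOD/MOOD definition could look like;
    # the real desktop app defines its own format, so treat every name below
    # as hypothetical.
    from abc import ABC, abstractmethod

    class Mod(ABC):
        """A shape engine: turns the semantic data pool into mesh parameters."""

        @abstractmethod
        def parameters(self, semantic_data: dict) -> dict:
            """Map word-usage statistics to named, normalized (0..1) parameters."""

        @abstractmethod
        def apply(self, vertices: list, parameters: dict) -> list:
            """Deform the polygonal layer according to the current parameters."""

    class Mood:
        """Unlike a MOD, a MOOD is switched on by the wearer rather than derived
        from the data pool; here it simply overrides some MOD parameters."""

        def __init__(self, name, overrides):
            self.name = name
            self.overrides = overrides

        def apply_to(self, parameters):
            return {**parameters, **self.overrides}

How such a definition would actually be registered and calibrated is exactly the kind of thing to check in the desktop app itself.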

This project has benefited from the help of the French National Center of Cinematography (CNC — DICRéAM).

::::::::::::::::::::::::::::::::::::::::::::

Grab a seat by a future catwalk

Download the iOS app (free)

Watch the documentary

Read the fiction

Listen to the soundtrack

Git your hands dirty with code

Check the dedicated project page