Digital Cloth: You’re Doing It Wrong

One of the areas of interest to the Fashion Research Institute is the development of an accurate digital representation of cloth that simulates how cloth moves and drapes.  It’s apparently the hot trend in the tech world too, judging by how many queries we receive and how many requests to ‘just look at’ whatever nifty new release some tech company has come up with.  Our usual response is “nice try, the gamers and techies will think it’s nice, but it’s not right.”

We’re fashion designers with a long history of creating garments. Our first interest and our true love in fashion is the high art of couture.  Couture differs from almost all other kinds of design except for Runway in that a single individual is focused on all aspects of dressing another individual. It’s true one-to-one work, and a large part of what we do as couturieres is to drape cloth on our customers or on physical representations (mannequins) of our customers.  A good couturiere can create trompe l’oeil effects and make a man’s legs look longer or a woman’s bust bigger or smaller.  We can slenderize a wearer’s silhouette or we can make him (usually) appear larger.  As we work in the field, we learn firsthand about cloth: its drape, its handle, what it wants to do on the mannequin.

The longer we work, the more specific textile knowledge we acquire, until after 30 years our fingertips are the ultimate augmented reality device and have forgotten more about cloth than most people ever know. We can look at a bolt of cloth and know what it will feel like before we ever touch it.  We can determine the fiber percentage by running a fingernail over the weave of the cloth and listening to the resonance of the sound it makes.  Polyester, for example, has a high singing note that is unmistakable, and the more poly, the higher the note.

This is real expertise, and you don’t get it in design school. You get it only by manipulating cloth all day, every day, to produce the precise results you want, until cloth becomes part of you.

This is why we’re always bewildered when our colleagues in the tech world want us to be thrilled about their digital models of cloth.  After all, they’ve just created a dancing model with swirling cloth; how fabulous is that?  Well, for us it’s not that fabulous.  We don’t really know how much hard work went into the model; we just know when we look at it that it’s not right, by which we mean it’s not accurate.  We’re not excited.  We look at the models dancing and we think, ‘it’s not right.’ We don’t think about the people who had to code the underlying program: how they had to figure out how the soft body/hard body deformations would work, how occlusions would work, how lighting and ray tracing and all the rest of that stuff would have to get wrapped up and made to work, how they got their simulated cloth to run in a semi-reasonable render time, preferably on a PC, how excited they are, and how this new digital cloth stunt is going to change the (gaming) industry. Fashion designers don’t think that way about cloth. We don’t (usually) know how much work went into the code; all we think is, ‘hm, ok,’ and then it’s back to reading Women’s Wear Daily. We don’t think ‘how can we use this?’ because frankly, we cannot.

Tech guys… and almost all of you are guys… has it occurred to you that you aren’t asking the right questions? Or that you aren’t asking the right people?  And if you keep getting the wrong answers from the wrong people, how do you expect to get the correct answer?

A colleague sent us this article to read on New Scientist: “Game characters to get authentically rumpled clothes.”  Of course we went and read it, and of course we watched the video, and of course we were disappointed.   And we got a chuckle out of these quotes in the article: “According to [Carsten] Stoll [of the Max Planck Institute for Informatics in Saarbrücken, Germany], the results are extremely realistic. When he and his team showed 52 people a video of a woman dancing in a skirt alongside a reconstruction that his software had produced, the majority of viewers said that the reconstruction was ‘almost the same’ as the original.”  We have to ask: did they bother to ask the actual experts who work in cloth every day? We think they didn’t, or the responses would have been very different. Mr. Stoll, we’re sorry. You’ve done beautiful work. You should be thrilled, and we’re sure game designers will be too.  Fashion designers… not so much. Your cloth model doesn’t look like cloth to our trained eyes. We can’t use it for anything real.

And then this, which we thought really summed up the whole thing but didn’t go far enough: “But to truly fool the eye, [Andy] Lomas [The Foundry, London] would like to see a more sophisticated version of the software reconstruct more challenging items of clothing, like buttoned jackets and well-tailored suits.”

We’re designers. The only way we want to fool the eye (trompe l’oeil) is by making people look better in their clothing through the cut, styling, and textiles we use in their garments.  And unfortunately, when we look at the digital models out there, no one is fooling our eye.

Lomas goes on to say, “Right now, no one is going to trust a computer graphics expert with no experience of fashion to design a virtual suit.”  And we say, ‘well said, Mr. Lomas, you nailed it.’ So we have to ask: why are all these tech folk and mathematicians trying to create digital cloth that looks like digital cloth without actually asking the real experts?

Digital cloth: guys, you are doing it wrong.

The ‘right’ answer is hard, and it is expensive. We know that. We know how to do it, we know how much it will cost, and we know how long it will take.  Unfortunately, there are no shortcuts to creating realistic, accurate digital cloth simulations.  And anything less is not good enough for real apparel design.