Why Companies Want to Harvest Digital Traces of Design Work
The design community is concerned that companies like Figma are using design resources their users store in the cloud to train AI models, often without explicit consent. This jeopardises designers' creative autonomy, as not only their data but also their working methods are collected and used. Our guest author Christopher Lueg writes about the pros and cons.
There's some concern in the design community about software companies, including Figma, aiming to harvest, for under-specified corporate purposes, the design resources that customers upload to corporate IT servers (aka "the cloud") in the process of using the design tools those companies offer. This harvesting comes in addition to the routine monitoring of any interaction happening on the site, which already generates vast amounts of data typically used for improving site quality (weeding out programming faults, improving usability, et cetera). The additional data those companies aim to harvest encompasses any data that customers might have stored on their servers. The stated business objective is to use customer data to train new "AI" models that would then provide more specific assistance for the tools at hand than the somewhat generic support offered by third-party AI tools. Figma, for example, reportedly aims to "unveil AI-powered design tools [to] challenge Adobe's dominance".
From PowerPoint to X
PowerPoint slide decks can serve as an over-simplified example of what it means when companies harvest any design resources that customers have stored on their servers, from final designs and rejected or approved prototypes to resources not used in any actual design. We all know how slide decks change over the course of preparing a presentation. Changes may include the topics covered, the flow of the presentation, presenter cues, and, most importantly, the information included, not included, or no longer included. Information may be excluded for reasons including confidentiality or simply not wanting to hint at potential future developments. Comments left by colleagues may record the reasons for not including information, which in turn may reveal much about internal reasoning.
Looking at discussions on social media, including LinkedIn and the platform formerly known as Twitter (now "X"), many customers are upset that they need to "opt out" of companies raiding the cookie jar, rather than the companies approaching customers for permission first, as one could reasonably expect, which is known as "opt in".
In the case of Figma, "opt out" means Figma's data harvest will commence at some point unless customers opt out or have special agreements in place that prevent the harvesting. Harry Brignull, an expert witness on dark patterns and deceptive user experiences, summarized on LinkedIn how Figma uses every trick in the book to make it less likely that customers will actually opt out: "[they] effectively take all the recommendations for effective email design and they do the opposite!"
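To see how little separates the two models technically, consider that the entire difference can come down to a single default value in the account settings. The sketch below (in TypeScript, a natural choice given Figma's TypeScript-based plugin ecosystem) is purely illustrative; `AccountSettings` and `allowAiTraining` are hypothetical names, not any vendor's actual API.

```typescript
// Hypothetical account model: one boolean controls AI-training consent.
interface AccountSettings {
  allowAiTraining: boolean;
}

// "Opt out": consent is presumed, so harvesting starts unless the
// customer finds the setting and disables it.
const optOutDefaults: AccountSettings = { allowAiTraining: true };

// "Opt in": no consent is presumed, so harvesting starts only after
// the customer explicitly enables it.
const optInDefaults: AccountSettings = { allowAiTraining: false };

console.log("opt-out default:", optOutDefaults.allowAiTraining); // true
console.log("opt-in default:", optInDefaults.allowAiTraining); // false
```

The ethical weight of the choice sits entirely in that default: under "opt out", the burden of action, and the cost of inattention, falls on the customer.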
Data processing on platforms
Designers are also concerned that opting out at any point after Figma has commenced harvesting their data will prevent future harvesting but does not mean that Figma will delete any of the data it has already captured. The likely reason is that the data simply cannot be removed once it has been incorporated into an AI model.
These are all valid concerns; however, they would apply to almost any data processing or sharing platform (think of social media site Facebook, photo sharing site Flickr, or event sharing site Eventbrite) if it were to change its terms in ways that weren't known when customers signed up to use the platform. What sets companies offering design-related tools apart is that the work processes that lead designers from early drafts and prototypes to completed designs are what we call (situated) work practices. Design artifacts are tangible representations of these work practices. This means that companies that harvest design resources aren't merely harvesting "data"; they are harvesting representations of design practices. Almost three decades ago, Suchman (1995) pointed out that "[t]he premise that we have special authority in relation to our own fields of knowledge and experience suggests we should have the ability to shape not only how we work but how our work appears to others." Harvesting the digital traces of design practices threatens to eliminate this notion of autonomy. Design artifacts shared on other data-intensive platforms, such as photos shared on Flickr, may also represent work practices, but the somewhat finished nature of those artifacts does not allow workflows and work practices to be reconstructed or "learned" as easily.
How AI is utilised
Customers may be OK with companies harvesting data generated in the process of using and applying their design tools, but they should be aware that this really is about harvesting work practices and deriving best practices. Companies harvest the data to have their "AI" learn how designers perceive, interpret, structure, and solve design problems (see Dorst 2004 on design problems and design expertise). Estefani (2024) offers a more detailed breakdown of which types of data might serve which "AI" purpose.
To avoid customers having to skirt around data harvesting practices (see Estefani 2024 for some recommendations), a human-centered-design-inspired approach would be to offer mechanisms that allow customers to determine whether selected projects may be harvested to train AIs. Best practices in surveillance camera design suggest the use of colored indicators to communicate whether or not a customer project is subject to data harvesting. Consensual, informed participation would in turn help companies understand which practices are even suitable to be implemented as services offered by their tools. More experienced designers might even be willing to demonstrate their work practices specifically to help train AI functionalities that would then assist less experienced designers. A minimal sketch of what such a per-project mechanism could look like follows below.
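The following sketch illustrates per-project consent paired with a camera-style indicator. It is a hypothetical design, assuming nothing about any vendor's actual implementation; `Project`, `aiTrainingConsent`, `indicatorColor`, and `harvestable` are invented names for illustration.

```typescript
// Hypothetical per-project consent model: each project carries its own
// AI-training flag instead of a single account-wide opt-out.
type ConsentState = "excluded" | "included";

interface Project {
  name: string;
  aiTrainingConsent: ConsentState;
}

// Surface consent as a colored indicator next to the project,
// analogous to a surveillance camera's recording light.
function indicatorColor(project: Project): "red" | "green" {
  return project.aiTrainingConsent === "included" ? "red" : "green";
}

// Only projects whose owners explicitly consented are ever eligible.
function harvestable(projects: Project[]): Project[] {
  return projects.filter((p) => p.aiTrainingConsent === "included");
}

const projects: Project[] = [
  { name: "Client pitch deck", aiTrainingConsent: "excluded" },
  { name: "Personal icon set", aiTrainingConsent: "included" },
];

for (const p of projects) {
  console.log(`${p.name}: indicator ${indicatorColor(p)}`);
}
console.log("Eligible:", harvestable(projects).map((p) => p.name)); // ["Personal icon set"]
```

Because the consent flag must be set explicitly for each project, this sketch is opt-in by construction, and the indicator makes the current state visible where the work actually happens.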
A notable exception to the rush to train AIs to improve products is the Australian company Savage Interactive, the maker of Procreate. Stating that they are "not chasing a technology that is a moral threat to […] human creativity", the company confirms that its award-winning software does not use generative AI and does not track customers' activities in Procreate apps.
References
- Dorst, K. (2004). On the problem of design problems – problem solving and design expertise. Journal of Design Research, Vol. 4, No. 2, pp. 185-196. https://doi.org/10.1504/JDR.2004.009841
- Estefani, J.N.A. (2024). How Figma AI Can See Your App – Here's How to Stop Them. RAW, July 22, 2024. https://raw.studio/blog/how-figma-ai-can-see-your-app-heres-how-to-stop-them/
- Suchman, L. (1995). Making Work Visible. Communications of the ACM (CACM), Vol. 38, No. 9 (September), pp. 56-64.