When was the last time you read all the terms and conditions (T&Cs) for an online service? Do you ever click through the emails that inform you about an updated privacy policy? Have you ever? Probably not. But, at the same time, do you trust the way your data is being used? There is a real disconnect between compliance with privacy regulation, operationalized through complex long-form texts, and our understanding of what actually happens with our data.

 

Innovative concepts such as the “Digital Buddy” project, conceived by IKEA’s innovation lab to imagine a privacy-first smart home, offer ways to make privacy positive and trustworthy. This is one of many topics that will be addressed at the first global Yes We Trust Summit, taking place on October 7th.

 


 

20 pages of Terms & Conditions: The growing opacity of online interactions

 

Is the phrase "I have read and agree to the terms and conditions" in fact the biggest lie on the internet? Often, the reality is that we have not even tried to read, let alone understand, the terms and conditions that apply. 

 

We all know we don’t read the fine print. Some companies have put this to the test, such as F-Secure, which buried a clause deep in the lengthy terms and conditions of its free WiFi service stating that, in exchange for access, “the recipient agreed to assign their first born child to us for the duration of eternity”. People still signed up.

 

Another example is Dima Yarovinsky’s art project, which aims to “visualize how small and helpless users are against large corporations,” as Designboom writes. The designer printed out the T&Cs of leading online services such as Facebook, Snapchat, Instagram and Tinder on colored rolls of standard A4-width paper. The result is striking:

 

 

Perhaps the logical conclusion to be drawn is that we just don’t care. But, actually, we do.

 

Recent Europe-wide research has shown that 85% of people across France, Great Britain, Ireland and Germany feel they understand what personal data means, and a high proportion are worried about how it is being used: 71% in Great Britain, for example.

 

If some 80% of us believe that transparency is important for trusting a company or brand, then why don’t we read the fine print? It’s not a difficult question to answer: excessive word counts, complicated legal language, and dry, impenetrable, inaccessible prose.

 

Plus, did we ever really have a choice to begin with? Often there is no option but to “agree,” “consent” or “accept” if we want to interact socially or work professionally online, leaving little opportunity to challenge brands’ data practices.

 

It would appear that we are stuck in a trap: a trap of feeling uncomfortable about how our data is used, but being unable to fully understand what exactly it is being used for until it is too late.

 

One need only recall the Cambridge Analytica scandal of 2018, and the recent WhatsApp controversy, as evidence of what is meant by “too late”.

 

To hear more about Cambridge Analytica and what we can learn from it, join the Yes We Trust Summit on October 7th, where Brittany Kaiser (Cambridge Analytica whistleblower) will be a keynote speaker on how the digital industry can gain the trust of consumers and companies. 

 


 

Yes We Trust Summit: Didomi Organises Worldwide Event On How Privacy Drives Business

In the internet age, trust is the single most important driver of success in business. Join us on October 7th for an exclusive day of workshops, networking sessions and keynotes by speakers such as Brittany Kaiser and Seth Godin. Together, let's learn how companies can thrive in a privacy-conscious world. 

 

Register now

 

The “Digital Buddy”: A companion to navigate a complex digital world 

 

The “Digital Buddy” project by Field Systems proposes to leverage artificial intelligence (AI) and augmented reality (AR) to create a 3D avatar to help protect your interests and privacy online.

 

 

The Digital Buddy would use NLP neural networks to analyse the complex text of T&Cs, condensing and translating it into more accessible language via a natural language generator such as GPT-3.
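For readers who like to see how this could work in practice, here is a minimal sketch of the summarization step described above, using an off-the-shelf model from the Hugging Face transformers library. The model choice (facebook/bart-large-cnn) and the sample clause are illustrative assumptions on our part, not details of the Digital Buddy project itself.

# Illustrative sketch only: condensing a T&C clause with an off-the-shelf
# summarization model. The model and the sample clause are assumptions for
# demonstration purposes, not the Digital Buddy's actual implementation.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# Hypothetical T&C clause, written for illustration only.
terms_clause = (
    "The Service Provider may collect, store and process information about "
    "your device, your approximate location and your usage of the Service, "
    "and may share such information with third-party partners for analytics "
    "and advertising purposes, subject to the applicable legal basis."
)

summary = summarizer(terms_clause, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])

A production-grade assistant would obviously need far more than this (legal-domain fine-tuning, faithfulness checks, and a way to flag clauses it cannot confidently simplify), but the basic building blocks are already commodity tools.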

 

 

The Buddy would communicate the T&Cs in a way that is easy to understand. You could ask the Buddy questions, such as “Are my messages being read?” or “Is my location being tracked?”. And finally, over time, your Buddy would also learn about your values and preferences, acting independently to review T&Cs and quickly highlight any serious concerns.
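The question-answering side could be sketched in a similar way: an extractive question-answering model pulls the relevant passage out of a T&C excerpt. Again, the model (distilbert-base-cased-distilled-squad) and the excerpt below are assumptions chosen for illustration, not the project’s actual stack.

# Illustrative sketch only: answering plain-language questions against a
# T&C excerpt with an extractive question-answering model.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

# Hypothetical T&C excerpt, written for illustration only.
terms_excerpt = (
    "We collect precise geolocation data from your device when the app is "
    "running in the foreground or background, and we retain message metadata "
    "for up to 24 months."
)

for question in ["Is my location being tracked?", "How long is message data kept?"]:
    answer = qa(question=question, context=terms_excerpt)
    print(question, "->", answer["answer"])

Learning a user’s values and preferences over time, as the project envisions, would layer personalization on top of building blocks like these.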

 

The Digital Buddy - Image taken from Everyday Experiments 

 

The value proposition? “Similar to how we now rely on Google Maps to navigate our physical world, we could embrace new and alternative types of digital assistants that can help us choose who and what to trust online,” suggests Field Systems.

 

It’s a good analogy. But, if we think deeper, what does this tell us about our society?

 

How have we got to the stage where compliance tools like cookie banners, T&Cs and privacy policies, which were built to protect and provide transparency, have been manipulated to further obscure the way our data is being used? Why do we believe that we’re being tricked when, in reality, these tools are meant to inform us?

 

Interested in discussing the Digital Buddy project further with globally renowned compliance experts?

Join us on October 7th for the Yes We Trust Summit, an exclusive day of workshops, networking sessions and keynotes, including a talk entitled “Yes We Trust in Compliance” that will use the Digital Buddy project as a starting point to question the role of compliance in a technology-driven world.

 

Register now

 


 

An uncertain future: Do we trust in compliance? 

 

Perhaps the internet that was supposed to “set us free” is an internet we have left behind. It’s time we reversed this cycle of distrust and opted for a citizen-led data industry.

 

The Digital Buddy seeks to educate and empower, and this is definitely a good thing. But wouldn’t it be more effective to solve the problem the Digital Buddy addresses at its source? Rather than creating AI companions to help protect our data, it would surely be better for companies to reverse their thinking and design all privacy interfaces with the user in mind.

 

We have collected a few examples of cookie banners, some from our clients and some from creative sectors like gaming, but we would be lying if we said we were impressed. Even on awwwards, you can see that most of the “Creative Examples of Cookie Consent Experiences” are more about visual design than about privacy by design.

 

Privacy policies could be made clearer thanks to a theoretical virtual digital assistant, but they could also be made clearer through an ambitious effort to invest in trust and transparency. Interestingly, it is a tech company, Apple, rather than privacy regulations like the GDPR and CCPA, that has designed one of the most impactful consent notices. Does this mean that compliance has lost to big tech?

 

Four-fifths of consumers consider transparency important for trusting a brand or company. Users want to be informed about why their data is being collected, and they appreciate companies with transparent data practices. 

 

Trust should be regarded as the single most important driver of success in business, and privacy professionals as real business enablers.

 

Consumer data has never been more valuable. How brands collect, use and protect it determines user experience and customer trust, which is why compliance professionals should think more like users, not lawyers!

 

How can your company thrive in a privacy-conscious world? 

Join us on October 7th at the Yes We Trust Summit for an exclusive day of workshops, networking sessions and keynotes by speakers such as Brittany Kaiser and Seth Godin. Together, let's learn how to inspire trust in a technology-driven world. 

 

Register now

 

Didomi is proud to be sponsoring the Yes We Trust Summit.