Author: Jessica Hart
Date: 16 November 2016

Is there such a thing as privacy in the digital age? In a nutshell: no. At least, that seemed to be the feeling at a roundtable event we hosted last week with privacy experts working across education, retail, technology and security. Over breakfast, we discussed whether we can legitimately expect privacy when taking advantage of modern technology such as smartphones and social media.

Digital progress is outpacing the education, law and corporate governance around it, and individuals are struggling to keep up. Fundamentally, there is a broad lack of awareness of what information people are giving away, whether it can be retracted, and whether it will exist – in the ether – forever. And there's confusion, too: do we care that our data is being stored if it is keeping us safe, or are we affronted that a basic human right to privacy is being infringed?

According to one of our experts, "nobody will tell you privacy doesn't matter". We know that our data, who has access to it and what they are doing with it all matter – but what are the implications for us? "Transparency" has been branded – and is consequently viewed – as a positive thing, while demanding "privacy" is increasingly met with suspicion: what have you got to hide? Our feelings towards privacy can be boiled down to risk: what is the risk of sharing something that cannot be taken back? What risk do we run if we do not sign up to transparency?

There are misconceptions about what data is ours and what is available to others. As another expert pointedly remarked, "people generally don't bother to read T&Cs" – they're long-winded, life is busy, and ticking the box is simply a means of accessing the service. But in doing so, we are signing up to surveillance: information on our whereabouts, where we live, what we do, who we socialise with. This kind of consumer insight is gold dust for businesses.
There is an onus on businesses, therefore, to raise awareness of the data they store in a more accessible way, and not to abuse the data they gather. Businesses should not trick people into sharing data and then proceed to erode their trust. The corporate data holder has a responsibility to behave well in a world where consumers' digital literacy lags far behind the technology. This is something the "well intentioned" General Data Protection Regulation (GDPR) is trying to address – legislation being introduced "to make Europe fit for the digital age".

What we do know is this: technology, and the sharing of data that comes with it, is only moving in one direction. As tech becomes increasingly wearable, alongside the smartphones so many of us carry in our pockets, data itself is becoming more portable. Currently, there is no sign of people relinquishing their devices and closing their social media accounts in protest. Accessing modern technology and the internet is a way of life for millennials across the globe.

Ultimately, the topic of privacy in the digital age may raise more questions than it answers. It is clear that data is useful as well as risky. It is also clear that, in our endless quest to comprehend – never mind keep up with – technology, ambivalence about our right to privacy is not the way forward. Only by seeking to understand what we are sharing, with whom and why can we even begin to anticipate how our data may define us in the future.