Solutions, websites and applications alike are all likely to use encryption in some form, and military and government departments have been using it far longer. It can be argued that encryption is in fact thousands of years old; an often-cited early example is the Caesar cipher, used by the Romans to conceal military messages.
With the explosion of data, especially from mobile devices, questions have been raised as to whether encrypted services should have a back door so that data can be accessed and policed by authorities.
Simply put, encryption is a method of protecting data by encoding it so that only authorized parties can access it; modern encryption usually makes use of a password called an encryption key.
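To make the idea concrete, here is a deliberately minimal sketch of key-based encryption in Python. It XORs each byte of the message with a repeating key, so the same key both encrypts and decrypts. This is a toy for illustration only, not a secure scheme; real systems use vetted algorithms such as AES.

```python
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric 'encryption': XOR each byte with a repeating key.

    Illustrative only -- a repeating-key XOR is trivially breakable.
    Applying the same key a second time reverses the operation.
    """
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"secret-key"
ciphertext = xor_cipher(b"meet at noon", key)   # unreadable without the key
plaintext = xor_cipher(ciphertext, key)         # same key recovers the message
```

The essential property this demonstrates is the one the article relies on: without the key, the ciphertext is not meaningfully readable, which is exactly what makes back-door access such a contested design question.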
There are different uses for encryption, many of which lie in the realm of data protection, such as encrypting backup data as it leaves the site. Given the growing number of threats to organizations and upcoming legislation such as the GDPR, encryption will be increasingly important for people and organizations looking to protect their data.
For many of us who use mobile technologies on a daily basis, encryption is at work in apps such as mobile banking and in messaging platforms, notably WhatsApp. WhatsApp uses end-to-end encryption, aiming to give its users privacy and security and to ensure "messages, photos, videos, voice messages… are secured from falling into the wrong hands".
So not even WhatsApp can read your messages, and from a user perspective that has great appeal: privacy is important, and nobody wants to feel they are being spied on. However, in the wake of terror threats, some have called for encrypted messaging applications to be regulated and policed more heavily.
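The reason the provider cannot read messages comes down to how the keys are agreed. A minimal sketch of the underlying idea is a Diffie-Hellman key exchange: two parties derive the same secret without that secret ever crossing the network, so a server relaying the traffic never learns it. The numbers below are tiny toy values for illustration; real deployments use very large primes or elliptic curves, and WhatsApp specifically builds on the Signal protocol.

```python
# Toy Diffie-Hellman key exchange with deliberately small numbers.
p, g = 23, 5              # public parameters, shared openly

alice_private = 6         # chosen secretly by Alice, never transmitted
bob_private = 15          # chosen secretly by Bob, never transmitted

alice_public = pow(g, alice_private, p)   # sent to Bob (safe to observe)
bob_public = pow(g, bob_private, p)       # sent to Alice (safe to observe)

# Each side combines its own private value with the other's public value.
alice_shared = pow(bob_public, alice_private, p)
bob_shared = pow(alice_public, bob_private, p)
# alice_shared == bob_shared, yet neither private value ever left its owner,
# so a relay server sees only p, g and the public values.
```

This is why "just let the provider read the messages" is not a configuration switch: the service was designed so that the decryption keys only ever exist on the users' devices.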
For WhatsApp and other encrypted messaging services to be policed and regulated, new laws would have to be passed to force developers to open up their back-end systems. Facebook, which owns WhatsApp, is unlikely to be open to modifying its code to allow access to messages, and above all else doing so brings security concerns of its own.
The protection of personal data is at the forefront of many people's minds at present, and with the General Data Protection Regulation (GDPR) set to give data subjects more control over their data, further questions can be raised as to whether the government and security services have the right to access users' data (messages, in this case).
Allowing access to view encrypted messages would require some form of ‘back-door key’ into the code and this presents a huge opportunity for cyber-criminals. Cyber-crime is a growing threat to organizations and users and with hackers looking to extort information from various sources, the option of a ‘back-door’ would be a gift.
While legislation could be put in place to regulate and police encrypted data in applications and services, it would have to protect the privacy and rights of the consumer/data subject. Processes would also have to be put in place to ensure data is only accessed by authorized personnel and not shared with third parties.
While opening up code gives hackers an avenue to exploit, the real question is around privacy of data. The majority of users represent no threat and the contents of their messages are harmless; some may argue, then, that there is no reason for this content to be private if giving it up protects society further. But if that is the case, why, as a consumer and the data subject, should you allow others to view your private conversations, especially when they could contain information that is "harmless" but still sensitive, such as religious, political or sexual views and opinions?
Whilst it may be necessary for some data and information to be monitored and policed, users are always likely to favour applications and services that choose to protect privacy.