Do you own a smartphone or a social media account? Have you ever used wearable devices? Do you use your face to pay for groceries?
If you answered yes to any of these questions, you have been sharing your personal information. Sharing our information does bring convenience: more tailored web pages, better predictions of traffic and weather, and fast, easy payments. Handing over our information to do daily tasks and engage with other people is simply part of life in today’s society.
However, it is full of risks. Your personal data reveals a lot about you: your shopping habits, your opinions, your sexual orientation and perhaps your whole life story. If it falls into the wrong hands, this data can easily be exploited to harm you; privacy violations, fraud and even threats to personal safety are just some of the potential consequences. According to a data breach report released by Risk Based Security, the number of data breaches and the amount of data leaked in the first half of 2019 rose by more than 50 percent compared with the same period in 2018. Of the 4.1 billion records exposed, 3.2 billion came from just eight breaches. Breaches are happening more frequently across the globe, and governments trying to protect individuals’ information face big challenges because of ever-changing technology.
Big data profiling
Technological breakthroughs in computing and the internet have led to exponential growth in the value of data, which has become an important resource in today’s society. Data mining is widely used in business, scientific research, government and other areas. It is the procedure by which large databases – so-called Big Data – are searched by algorithms for patterns of correlation in the data.
Mining data from a variety of people allows them to be categorised into different groups, generating highly predictable profiles of how each category of people behaves; this is known as group profiling. At the same time, personalised profiling mines the data of a single individual. For instance, profiling the keystroke behaviour of one particular person may enable a service provider to ‘recognise’ that this person is online, because his or her behavioural ‘signature’ lets the provider track his or her online habits and build up a very personal profile that can be used to offer specific goods and grant access to certain services. The profile can also be stored and sold to other interested parties, or requested by criminal justice or immigration authorities. Service providers mining this kind of data include companies like Google, Facebook and e-commerce platforms.
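The mechanics of group profiling can be illustrated with a minimal sketch. This is not any real provider’s system; the segments (age band plus region), the records and the numbers are all hypothetical assumptions, chosen only to show how aggregate statistics for a group become predictions about any individual who matches it.

```python
from collections import defaultdict

def build_group_profiles(records):
    """Aggregate a purchase rate per segment (age band + region)."""
    totals = defaultdict(lambda: [0, 0])  # segment -> [purchases, visits]
    for r in records:
        key = (r["age_band"], r["region"])
        totals[key][0] += r["purchases"]
        totals[key][1] += r["visits"]
    return {k: p / v for k, (p, v) in totals.items() if v}

def predict_purchase_rate(profiles, user):
    """A new user simply inherits the prediction of whichever group they match."""
    return profiles.get((user["age_band"], user["region"]), 0.0)

# Hypothetical mined records:
records = [
    {"age_band": "18-25", "region": "EU", "purchases": 3, "visits": 10},
    {"age_band": "18-25", "region": "EU", "purchases": 1, "visits": 10},
    {"age_band": "40-55", "region": "US", "purchases": 8, "visits": 10},
]
profiles = build_group_profiles(records)
# A brand-new user is scored purely on the behaviour of "people like them".
print(predict_purchase_rate(profiles, {"age_band": "18-25", "region": "EU"}))  # 0.2
```

The point of the sketch is that the individual being scored contributed none of the data; the prediction comes entirely from the group they are placed in.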
Perhaps the most successful commercial use of profiling is targeted advertising – the practice of service providers using highly specific characteristics, search-engine habits, cookies and a variety of statistics to tailor a promotion to a specific consumer. The main sources of data used to develop tailored ads are click-stream data, search records and purchase histories. With all this data, the service provider can build a picture of a potential customer’s interests, attitudes and hobbies, and construct ads targeted exactly to that consumer’s specific wants. This explains why web pages can be personalised to each of us; even the price we see may be based on our salary or needs.
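How those three data sources combine into an ad ranking can be sketched in a few lines. The signal weights below (purchases count more than searches, which count more than clicks) are illustrative assumptions, not how any actual ad platform weighs its signals.

```python
from collections import Counter

def interest_profile(clicks, searches, purchases):
    """Fold click-stream, search and purchase signals into one interest score
    per category. The weights 1/2/5 are purely illustrative."""
    profile = Counter()
    for cat in clicks:
        profile[cat] += 1
    for cat in searches:
        profile[cat] += 2
    for cat in purchases:
        profile[cat] += 5
    return profile

def rank_ads(profile, ads):
    """Order candidate ads by how well they match the inferred interests."""
    return sorted(ads, key=lambda ad: profile.get(ad["category"], 0), reverse=True)

profile = interest_profile(
    clicks=["shoes", "camera"], searches=["camera"], purchases=["camera"])
ads = [{"id": "a1", "category": "shoes"}, {"id": "a2", "category": "camera"}]
print([ad["id"] for ad in rank_ads(profile, ads)])  # ['a2', 'a1']
```

Even this toy version shows why a single purchase reshapes the ads a user sees far more than dozens of casual clicks.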
Algorithms play a major role in profiling. Not only do they determine our search results and the ads we see online; they can also predict what we will pay, whether we get a loan or a job, whether we have been defrauded and, most recently, whether we are paroled and how we are sentenced. Yes, algorithms have been brought into the criminal justice system too. Algorithms are either hand-designed by developers or trained on data, and both approaches can be influenced by human bias. Developers may argue that their algorithms are neutral, but they cannot escape the fallible context of biased data and improper use by society. Profiling is an activity in which subjectivity matters.
Algorithms are difficult to identify, let alone understand, which shuts users out of any grasp of the ethical implications of their use. As citizens, consumers or employees, we may find ourselves being profiled yet unable to understand how the algorithms work, leaving us ill-equipped to contest seemingly unfair results.
Personal data law around the world
Big data plays a pivotal role in many companies’ strategic decision-making. More and more companies are adopting data-driven business models and strategies to obtain and sustain a competitive ‘data advantage’ over rivals. Advertising is no longer just a matter for advertising companies but something every internet company must care about. Judging by revenue, online advertising has become the internet’s most important economic engine.
The appealing value of personal data is the reason major data breaches are on the rise. These audacious cyber intrusions raise questions such as whether (and to what extent) individuals can be empowered to control their personal information, whether developers bear responsibility for their algorithms in use, what firms are responsible for, and the normative grounding of that responsibility. Furthermore, as data-driven mergers continue to increase in number, so too do the risks of abuse by dominant tech firms. Data-driven exclusionary practices and mergers have significant implications for privacy, consumer protection and competition.
Personal information carries multiple kinds of value. Personal rights are inherent in it: rights of personality, privacy and personal freedom. But individuals also surrender personal information in social activities, sharing the data dividend with enterprises and jointly promoting social development, serving not only corporate interests but also the public interest. In addition, under certain conditions individuals must compromise some or all of their rights in personal information for the sake of the public interest, in the arenas of public safety and social governance, for example. Therefore, when deciding the basis on which an enterprise may process personal data, it should be clearly recognised that the protection of personal information must be balanced against these multiple interests. Yes, it is a hugely complicated problem, and we have to straighten it out.
Current situation of legislation and enforcement across the world
Globally, data protection laws have been growing in number as policymakers and regulators recognise how poorly personal data has been protected. Data protection has also become an issue of national security and sovereignty.
In the European Union, the General Data Protection Regulation (GDPR) is the new framework for protecting personal information. It provides that people, as data subjects, should be able to decide whether they want to share their information, who has access to it, for how long and for what reason, and to correct inaccurate data, among other rights. Several multimillion-euro fines have already been issued, such as a €50 million fine against Google in France for processing personal data without a legal basis and for breaching transparency and information duties. In Germany, a €14.5 million fine was imposed on a real estate company for operating a non-compliant archiving system.
At the same time, a growing number of complaints received by the European Commission involve mergers of online companies active in the collection and processing of Big Data. The emerging consensus is that privacy protection is a parameter of non-price quality competition. Thus, a competition agency would likely reject a deal if the merging parties would lower prices at the expense of privacy protection.
In the US, the California Consumer Privacy Act (CCPA) took effect on January 1, 2020, and businesses that are not fully compliant with its restrictions on the handling of consumers’ personal information face severe financial penalties.
In July 2019, the US Federal Trade Commission (FTC) slapped a $5 billion fine on Facebook in the wake of the Cambridge Analytica scandal and its other data leaks. The FTC also ordered the social network to make privacy-related changes to prevent such blatant data breaches in the future.
It is particularly important for large, multinational corporations to enhance compliance controls with the goal of reducing the risk of enforcement action on foreign soil. These internal controls include conducting privacy impact assessments, analysing the risks of cross-border data transfers and cultivating awareness of privacy.
But we cannot rely on the law alone to solve all of these problems. The three other key modalities – social norms, the market and online architecture – must also play their part in preventing further breaches of cyber data.
Arlene Zhang works at the Data Law Research Center in Shanghai, China