Besides being excellent joke fodder for late-night talk shows, the success of Pokémon Go has captured the business world's interest in the economic potential of augmented reality.
For the uninitiated, Pokémon Go is a game played from a smartphone app that requires users to find and capture digital characters (Pokémon) that appear overlaid on the physical world. The technology that superimposes the Pokémon onto your actual environment, creating one composite whole, is known as augmented reality.
But before your firm ships off a new product plan featuring augmented reality, make sure you have screened it for all privacy-related concerns. If Pokémon Go has taught managers anything, it's that this technology raises many concerns about the use, and potential abuse, of personal information.
Now just over a month old, Pokémon Go has already encountered the following privacy problems:
A request from a prominent consumer advocacy organization to the US Federal Trade Commission (FTC) for an investigation of the company’s data collection practices.
By initially granting itself "full access" to users' Google accounts, Niantic, the maker of the game, was "almost certainly" running afoul of earlier FTC "privacy decisions," the group alleged. Niantic acknowledged the mistake and changed the settings so that the app only asks for basic profile information.
The threat of a lawsuit from the Federation of German Consumer Organizations if Niantic does not change 15 of the clauses in its terms and conditions to comply with local privacy laws by 9 August. One concern cited is the provision of collected data (such as location) to third parties.
An open letter from US Senator Al Franken (D-Minn.) to Niantic’s CEO expressing concerns and asking questions about its data collection activities (including how parents provide meaningful consent for their child’s use of the app and how they are informed about use of children’s data).
A Canadian class action lawsuit filed on behalf of a woman living outside of Calgary, Alberta, who claims she is suffering an invasion of privacy because Pokémon Go players are visiting her house, which is the site of a Pokémon gym.
Heading Off the Headlines
If your company’s data privacy team carefully vets new products for privacy problems, you can deal with these issues before they hit the news. While we don’t know how Niantic prepared, CEB recommends conducting privacy impact assessments (PIAs) for initiatives that could risk compromising personal information.
The basic framework for doing this comprises five steps, and for members of the CEB Data Privacy Leadership Council, this case study from Target illustrates how the process works in practice.
Identify which projects or new processes may require a PIA review: Collect critical information about projects, including whether the project collects, uses, shares, or stores personal information.
This is often done by embedding a series of simple, yes/no questions into an existing product development or project management methodology. When project owners respond “yes” to any question, they are directed to complete the next step in the PIA process.
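As a thought experiment, the screening gate described above can be sketched in a few lines of code. This is an illustrative mock-up, not any vendor's actual tooling; the question list is hypothetical.

```python
# Hypothetical screening questions embedded in a project intake form.
SCREENING_QUESTIONS = [
    "Does the project collect personal information?",
    "Does the project use or analyze personal information?",
    "Does the project share personal information with third parties?",
    "Does the project store personal information?",
]

def needs_pia(answers: dict) -> bool:
    """Any 'yes' answer routes the project owner to the next PIA step."""
    return any(answers.get(q, False) for q in SCREENING_QUESTIONS)

# A project that stores personal data triggers the next step of the review.
answers = {q: False for q in SCREENING_QUESTIONS}
answers["Does the project store personal information?"] = True
print(needs_pia(answers))  # True
```

The point of the design is that project owners answer only simple yes/no questions; the routing logic, not the owner, decides whether the privacy team gets involved.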
Determine how extensive the review should be based on risk: Assess the sensitivity of data involved in the project and determine, at a high level, what categories of privacy risks might affect the project.
Then use these two criteria to determine whether the organization should conduct a PIA for the project, and, if so, prioritize when and who on the privacy team should perform the PIA. For example, a project that handles public data using apps already offered by IT may not require a PIA — or, at most, a brief assessment from a member of the privacy team — whereas a project handling highly sensitive data on customized apps may require the immediate involvement of the chief privacy officer and several other members of the team.
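The two-factor triage above might be expressed as a simple lookup. The tier names and factor values below are assumptions for illustration, not a prescribed taxonomy.

```python
def pia_depth(sensitivity: str, platform: str) -> str:
    """Map the two risk factors (data sensitivity, platform familiarity)
    to a hypothetical review tier."""
    if sensitivity == "public" and platform == "standard":
        return "none-or-brief"    # at most a quick look by one privacy analyst
    if sensitivity == "high" and platform == "custom":
        return "full-cpo-review"  # immediate CPO involvement plus team members
    return "standard-pia"         # routine assessment by the privacy team

print(pia_depth("public", "standard"))  # none-or-brief
print(pia_depth("high", "custom"))      # full-cpo-review
```

In practice the matrix would have more cells, but the principle is the same: the review's depth and staffing scale with the sensitivity of the data and the novelty of the platform.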
Identify the specific privacy risks in the project: Ask standardized questions that test the controls used in the project to identify specific, unmitigated risks.
Privacy should define a standard set of PIA questions. Many teams ask principles-based questions about topics such as notice, consent, and disclosure to third parties. Balance the desire to include enough questions to identify all privacy risks with the need to minimize the burden on business partners.
Explain the risks to the project owner and recommend and/or create appropriate controls to mitigate them: When explaining the information risks to the project owner, emphasize the underlying business risks they pose and then explain the recommended controls that address each identified risk.
The project owner must respond to each control recommendation: accept it, work with Privacy to develop a suitable alternative, make a case for a formal exception, or decide not to comply and formally accept the risk outright. Control selection is a balancing act between strictly enforcing privacy policies and allowing the business to take risks. Ultimately, Privacy must be comfortable with enabling risk taking when doing so makes sense to the business.
Monitor the new project or processes for compliance: Make sure you centrally track all of the risk decisions associated with the project against a standardized control framework.
The risk register serves two important roles: 1) for individual projects, the risk register serves as a central repository to capture risk decisions, track exceptions, and schedule subsequent decision reviews where appropriate; and 2) in aggregate across multiple projects, it serves as a dashboard to visualize the organization’s mitigated versus unmitigated risks. This information helps Privacy identify systematic risks and track risk patterns over time.