
Protecting Your Participants' Data: A ReOps-Approved Guide for Researchers

Keep your data and participants safe, without slowing down research

Research is scaling faster than ever, fueled by AI, democratization, and growing pressure to prove impact. Our State of User Research 2024 found that the ratio of people who do research (PWDRs) to dedicated Researchers was trending upward year over year: in 2024 there were 5 PWDRs for every 1 dedicated Researcher on average, up from 4:1 the year before and 2:1 in 2021 and 2022. This means more people in organizations are gathering more participant data than ever before.

While this opens up exciting opportunities, it also creates new risks. In many organizations, research operations (ReOps) is the steady hand on the wheel, keeping research on the rails by building flexible, scalable, ethical, and compliant research. 

User Interviews can help you put your research on the rails. Book a demo to learn more.

TL;DR: This is your go-to guide for building the guardrails that will keep your participants' data safe, without slowing down research.

Why ethical data handling is step one (not a checkbox later)

Research isn’t valuable if it’s not ethical. Ethical data handling ensures participants are respected, risks are minimized, and trust is protected, before, during, and after the research takes place.

What are the 3 basic data ethics? 

The Belmont Report sets the gold standard for ethical research, outlining 3 core principles:

  1. Respect for persons: Acknowledge autonomy and protect those with diminished autonomy
  2. Beneficence: “Do no harm” by maximizing good and minimizing bad
  3. Justice: Ensure research benefits and burdens are fairly distributed across diverse groups, including race, socioeconomic status, gender, disabilities, and sexual orientation
Learn more about the scope of ethical research in our Ethical Guidelines for Research Field Guide chapter.

Ethical data handling frameworks and standards to lean on (including ReOps best practices)

While the Belmont principles provide a basic foundation for research, there are other frameworks and best practices that you can lean on to ensure your data handling is ethical.

The 5 P’s of ethical data handling (from The Ethics of Managing People’s Data by Michael Segalla and Dominique Rouziès)

When working with human-provided data, focus on:

  • Provenance: Be transparent about what you're collecting and avoid collecting “dark data” you don’t use
  • Purpose: Make sure data is only used for its intended purpose, and get fresh consent if new uses emerge
  • Protection: Clearly communicate where data will live, who can access it, and when it will be anonymized or destroyed
  • Privacy: Balance protecting identities with maintaining data usefulness
  • Preparation: Clean data thoroughly by removing duplicates, fixing errors, handling outliers, and validating your results

The 5 C’s of ethical data practices (from DJ Patil, Hilary Mason, and Mike Loukides from O’Reilly)

The five framing guidelines focus on: consent, clarity, consistency, control, and consequences.

  • Consent: Get explicit agreement about what data is being collected and how that data will be used
  • Clarity: Ensure participants understand what they're consenting to
  • Consistency: Safeguard data in repeatable, predictable ways
  • Control: Give participants control over their data 
  • Consequences: Always ask, could this data cause harm?

Examples of ethical slip-ups (and what to do instead)

Cambridge Analytica: Harvested millions of Facebook profiles without transparency, influencing elections. What to do instead: Enforce strict data collection transparency and user control

Target’s Pregnancy Prediction: Sent maternity coupons to a teenager before her family knew she was pregnant. What to do instead: Evaluate predictive models carefully, anonymize data, and prioritize consent

A practical checklist for ethical data handling 

Download our Pre-Research Ethics Checklist template for a quick guide to scaling ethics in your team, including data handling. 

Protecting participant data in research

Protecting participants throughout the research journey is about honoring their dignity and being mindful about how consent is obtained and how their data is collected, shared, and stored.

Informed consent and beyond 

While protecting participants goes far beyond exchanging a signature, an informed consent form is usually the first formal agreement made between a participant and a researcher.

These forms should: 

  • Clearly explain study details, data collection, and participant rights
  • Be reviewed and signed before participation
  • Build trust and improve data quality

Managing data in regards to privacy laws like GDPR

Our User Researcher’s Guide to Data Privacy Regulations highlights 7 tips user researchers can use to stay in compliance:

  1. Ask for consent explicitly
  2. Inform participants about recordings and observers
  3. Only collect essential data
  4. Understand your role in data processing
  5. Let participants access or delete their data easily
  6. Avoid collecting sensitive info unnecessarily
  7. Stay organized with a participant management tool
Learn More: User Interviews and Data Privacy Compliance: How We Keep User Data Safe

Privacy vs confidentiality vs anonymity

As a researcher, you can also keep in mind the differences between privacy, confidentiality, and anonymity. Privacy focuses on participants' control over what personal data is shared. Confidentiality means keeping shared data secure, and anonymity means identity isn’t linked to responses at all.

Qualitative and quantitative considerations

You may also have different considerations based on the type of study you’re conducting. Qualitative research tends to produce richer, more identifiable data, which introduces bigger confidentiality risks. And even in quantitative studies, de-identified data can sometimes be re-identified (as Target’s case showed!).

ReOps red flags to watch for in participant protection

Be wary of over-collecting data “just in case” stakeholders might want to use it in the future. You’ll also want to stay in close touch with your legal counsel to make sure your consent forms and protocols are always kept up to date across the company.

Research data privacy: principles and practices

What is data privacy in research?

Data privacy in user research is about giving participants control over their personal information and protecting that control.

The principles of data privacy

We recommend following the foundational GDPR principles of lawfulness, fairness, and transparency, as described by the UK’s Information Commissioner's Office (ICO):

Lawfulness

For the processing of personal data to be lawful, you need to identify the specific grounds for processing. Processing might be unlawful if it results in:

  • Breach of confidence
  • Organization exceeding its legal powers or exercising its powers improperly
  • Infringement of copyright
  • Breach of enforceable contractual agreement
  • Breach of industry-specific legislation
  • Breach of human rights

Fairness

Fairness means you should only handle personal data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them. You should also ensure that you are treating individuals fairly when they seek to exercise their rights over their data.

Transparency

Transparency is about being clear, open, and honest with people from the start about who you are and how and why you use their personal information. You must also be transparent with participants about their rights during research. Without this exchange of information, people may lack trust and confidence.

Data ownership: Who owns what in a democratized research landscape?

In a democratized landscape, data ownership is typically not held solely by individuals or institutions; it’s a shared responsibility across various stakeholders.

How to ensure confidentiality in team-driven research

You can ensure confidentiality in team-driven research by restricting raw data access to only those who need it, establishing clear security policies, and vetting tools carefully.

Data anonymization techniques - what works and what doesn’t

We’ve touched on data anonymization a few times, but let's go deeper.

Anonymization techniques

There are four primary anonymization techniques (a small sketch of masking and generalization in practice follows this list):

  • Data masking: Hiding sensitive fields (e.g., blurring faces)
  • Pseudonymization: Replacing identifiers with codes (e.g., P1, P2)
  • Generalization: Using broader categories (e.g., "age 30-40" vs. "age 32")
  • Swapping, perturbation, redaction, tokenization, etc.: Replacing sensitive information with non-sensitive values (e.g., swapping a bank account number with a random string of characters)
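
To make these concrete, here's a minimal Python sketch of masking and generalization applied to a single hypothetical participant record. The field names and masking rules are illustrative assumptions, not a standard:

```python
# Minimal sketch: data masking and generalization on a hypothetical record.
# The field names and masking rules are assumptions for illustration only.

def mask_email(email: str) -> str:
    """Hide most of the local part of an email address (data masking)."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}"

def generalize_age(age: int, band: int = 10) -> str:
    """Replace an exact age with a broader band (generalization)."""
    low = (age // band) * band
    return f"{low}-{low + band - 1}"

record = {"name": "Jordan Smith", "email": "jordan.smith@example.com", "age": 32}

anonymized = {
    "email": mask_email(record["email"]),  # "j***@example.com"
    "age": generalize_age(record["age"]),  # "30-39"
}
print(anonymized)
```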

Anonymization vs pseudonymization

Anonymization means that data is truly untraceable: you're unable to identify the participant who took part in the study. Pseudonymization, on the other hand, means data can be re-identified if needed, with additional authentication. In previous studies I've worked on, pseudonymization meant keeping identifying participant data on a separately stored drive that only select individuals had access to (i.e., "extra keys kept separately"). The main file that others could access included unique identifiers rather than participants' actual data. A minimal sketch of this pattern is below.
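
Here's a rough sketch of that pattern, assuming a tiny in-memory dataset; in practice the key file would live on a restricted drive or in a secrets store, and the file name below is hypothetical:

```python
import csv

# Hypothetical raw responses that contain identifying information.
raw_responses = [
    {"name": "Jordan Smith", "quote": "The checkout flow confused me."},
    {"name": "Priya Patel", "quote": "I loved the new dashboard."},
]

key_table = {}    # code -> real identity (the "extra keys kept separately")
shared_rows = []  # what the wider team is allowed to see

for i, response in enumerate(raw_responses, start=1):
    code = f"P{i}"
    key_table[code] = response["name"]
    shared_rows.append({"participant": code, "quote": response["quote"]})

# Write the key file separately; it belongs behind stricter access controls.
with open("participant_key_RESTRICTED.csv", "w", newline="") as f:
    csv.writer(f).writerows(key_table.items())

print(shared_rows)  # safe to share: identities replaced with P1, P2, ...
```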

When to use which

There are different types of anonymization to give you flexibility across different use cases, business models, industries, and regulations. For example:

  • Financial services have some of the highest security standards; they’ll use various types of anonymization to remain compliant with regulations
  • Healthcare institutions are under strict regulations like HIPAA; anonymization empowers providers to conduct research without compromising patient safety
  • Edtech generates significant amounts of usage data and trends that can be used to improve methods and systems; because this data often involves children, it is held to the highest data privacy standards

Tools and resources for data anonymization

You can use tools such as Python libraries like pandas and Faker, spreadsheets with masking formulas, or platforms like User Interviews. A quick sketch with Faker is below.
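
For example, here's a short sketch of the swapping approach using the Faker library (assuming it's installed with pip install Faker; the participant fields are hypothetical):

```python
from faker import Faker

fake = Faker()
Faker.seed(42)  # reproducible fake values for this sketch

participants = [
    {"name": "Jordan Smith", "email": "jordan.smith@example.com"},
    {"name": "Priya Patel", "email": "priya.patel@example.com"},
]

# Swap real identifiers for realistic-but-fake values before sharing.
safe_participants = [{"name": fake.name(), "email": fake.email()} for _ in participants]
print(safe_participants)
```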

Learn why User Interviews is the secure solution for user research.

Data protection standards in research

The 5 data protection principles

Article 5 of the GDPR sets out key principles that lie at the heart of general data protection. These include:

  • Lawfulness, fairness, and transparency: Any processing of personal data should be lawful and fair. It needs to be transparent to individuals that personal data is being collected, used, consulted or processed
  • Purpose limitation: Personal data should only be collected for specified, explicit, and legitimate purposes and not further processed
  • Data minimization: Processing of personal data must be adequate, relevant, and limited to what is necessary
  • Accuracy: Personal data must be accurate and, where necessary, kept up to date; inaccurate data should be erased or rectified without delay, having regard to the purposes for which it is processed
  • Storage limitation: Data should be kept in a form that permits identification of data subjects for no longer than necessary, and time limits should be established for erasure

What IRBs, ethics boards, and your legal team care about

Institutional Review Boards (IRBs), ethics boards, and legal teams all focus on protecting the rights, safety, and well-being of human subjects involved in research, while also ensuring compliance with ethical standards and legal regulations. This includes having proper consent documentation for studies, storing data on encrypted servers rather than personal laptops, and having appropriate data retention and destruction policies.

Data protection in practice

You can visualize a standard study flow as:

Consent > Data Collection > Secure Storage > Analysis > Deletion
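
As a rough sketch of the final Deletion step, here's a small retention check; the retention window, study names, and structure are all assumptions, not policy:

```python
from datetime import date, timedelta

# Assumed retention window; real policies vary by company and regulation.
RETENTION = timedelta(days=365)

studies = [
    {"id": "usability-checkout", "collected_on": date(2024, 3, 1)},
    {"id": "pricing-interviews", "collected_on": date(2025, 1, 15)},
]

today = date.today()
due_for_deletion = [s["id"] for s in studies if today - s["collected_on"] > RETENTION]
print("Delete raw data for:", due_for_deletion)
```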

Who audits your data protection—and who should?

In many organizations, ReOps is the first line of defense for auditing your data protection. They collaborate with stakeholders from across the organization and legal counsel to establish and implement best practices across the team. In corporate research, IRBs are not always required, but they can prove crucial depending on each industry's specific regulations. They also help with ethical considerations, increasing public trust, and mitigating regulatory challenges or legal risks.

Organizations may also complete annual legal and compliance audits to limit any unnecessary risks.

Research data security: guarding against internal and external risks

The 3 types of data security

Research teams can guard against risk by considering three primary types of data security:

Technical, organizational, and human.

When most people think of data security, they default to technical security, such as encrypted, secure platforms. While this is important, you can also mitigate risk by implementing organizational access policies and certifications, and by training all employees on responsible data handling behaviors.

The 4 elements of a solid research data security plan

Each research study should have a solid research data security plan that identifies the following (a simple sketch follows the list):

  • Who can access participant data (access control)
  • How data will be encrypted for each study
  • How data will be securely stored or transferred
  • An incident response plan
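
One lightweight way to make such a plan reviewable is to capture it as a small structure per study. The fields below mirror the list above; the names and values are illustrative assumptions, not a standard template:

```python
from dataclasses import dataclass

@dataclass
class DataSecurityPlan:
    """A per-study record of the four elements above (illustrative only)."""
    study_id: str
    access_list: list[str]      # who can access participant data
    encryption: str             # how data is encrypted (at rest / in transit)
    storage_and_transfer: str   # where data lives and how it moves
    incident_contact: str       # who to alert if something goes wrong

plan = DataSecurityPlan(
    study_id="onboarding-interviews-q3",
    access_list=["lead_researcher", "reops_manager"],
    encryption="encrypted at rest; TLS in transit",
    storage_and_transfer="research repository only; no local downloads",
    incident_contact="security@yourcompany.example",
)
print(plan)
```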

Internal risks

Organizations should aim to protect against internal risks such as downloading data onto personal devices, sending files over unsecured email, and poorly managed shared drives.

Certifications and frameworks to consider

As your company grows in maturity, you can also consider certifications such as SOC 2 and ISO 27001. Many also look to Harvard's Research Data Security standards as a gold-standard framework.

Practical policies: What your ReOps team should implement today

If you're a ReOps Leader, you should consider implementing some quick actions across your team today to help mitigate risks. You can lock templates behind permissions, centralize consent and confidentiality documents, and even train non-researchers before giving research tool access.

On the rails, not in the weeds

Ethical, compliant research doesn’t have to mean slow. With strong ReOps guardrails in place, teams can move fast and protect what matters most: participants' dignity and data.

As research becomes more democratized, ReOps roles are not just nice to have; they're essential. By adding lightweight controls to tools, templates, and workflows, you empower every researcher to do the right thing.

More resources

Roberta Dombrowski
Senior User Researcher & Career Advisor

Roberta Dombrowski is a (former) VP, UXR at User Interviews. In her free time, Roberta is a Career Coach and Mindfulness teacher through Learn Mindfully.
