Find your situation below to see which requirements apply to you:
Applying for NSERC, SSHRC, or CIHR?
STRAC attestation required if your research aims to advance a sensitive technology area. For Alliance and other partnered programs, also complete a RAF if you have a private-sector partner with an active research role, housing infrastructure, or contributing more than $500K.
For NSGRP-scope programs (Alliance, CFI, partnered CIHR), complete the RAF and develop a mitigation plan. Contact Research Services for due diligence support.
Not sure what applies? Contact us and we'll help you figure it out.
What is Research Security?
Research security protects the integrity of your research from threats that could undermine Canada's national and economic security. It's about safeguarding your work against theft, misappropriation, and unauthorized transfer of ideas, research outcomes, and intellectual property.
This isn't abstract policy talk—it's practical protection for your research environment.
Why it matters: Canada's research ecosystem thrives on openness, transparency, merit, academic freedom, and reciprocity. These principles make Canadian research excellent. They also make it vulnerable. Foreign state actors, state-sponsored entities, and other bad actors actively target Canadian research to gain strategic, military, or economic advantages.
What's at stake:
Your intellectual property: Research findings, data, methodologies, and innovations you've spent years developing.
National security: Advanced technologies that could be weaponized or used for surveillance.
Economic competitiveness: Innovations that drive Canadian prosperity.
Your career: Funding eligibility, institutional reputation, and research partnerships.
Public trust: The credibility of Canadian research as a whole.
Shared responsibility: Research security works only when everyone plays their part. Researchers, institutions, federal funding agencies (CIHR, NSERC, SSHRC, CFI), and the Government of Canada all share responsibility for protecting Canada's research ecosystem.
The reality check: Research security doesn't mean shutting down international collaboration. Canada values global partnerships—they're essential for advancing knowledge. But collaboration requires diligence. You need to know who you're working with, understand the risks, and take appropriate precautions.
The STRAC Policy
Here's what the Policy on Sensitive Technology Research and Affiliations of Concern (STRAC) means in practice: if your research advances any sensitive technology area, you cannot hold active affiliations with, or receive funding from, organizations that pose national security risks. Period. No exceptions.
The policy operates on two lists that work together:
Named Research Organizations (NRO): 103+ foreign institutions connected to military, defence, or state security entities. If you're affiliated with any organization on this list while working on sensitive technology research, your federal grant application will be denied.
Sensitive Technology Research Areas (STRA): 11 categories of advanced technologies (AI, quantum computing, genetic engineering, advanced weapons, etc.). If your research aims to advance these technologies—not just use them—STRAC applies to you.
Your Responsibility: Before applying for federal funding, you must review both lists. If your research advances a STRA, all named researchers on your grant must attest that they have no NRO affiliations. Past affiliations don't count—only current ones matter. If you hold an NRO affiliation, you must sever it before applying. The government plans to update these lists regularly, so please check them each time you apply for funding.
Does Your Research "Advance" a Sensitive Technology?
This is the critical distinction. Many researchers assume STRAC doesn't apply because they're "just using" existing technology. That's often wrong. Here's how to tell the difference:
Your research ADVANCES a STRA if it:
Develops new capabilities, methods, or applications within a sensitive technology area
Improves the performance, efficiency, or functionality of existing sensitive technologies
Creates new knowledge that could enable others to develop sensitive technologies
Produces results that have dual-use potential (civilian and military applications)
Your research likely does NOT advance a STRA if it:
Uses commercially available AI tools as instruments without modifying them
Applies existing technologies to study unrelated phenomena (e.g., using machine learning to analyze historical texts)
Studies the social, ethical, or policy implications of sensitive technologies without developing them
Real-World Examples
✓ STRAC likely does NOT apply:
Dr. A is an environmental scientist using commercial satellite imagery and existing machine learning tools to map deforestation patterns. She's applying existing technology as a research tool—not advancing AI or remote sensing capabilities.
✓ STRAC likely does NOT apply:
Dr. B is a sociologist studying public attitudes toward AI adoption in healthcare. Her research examines how people perceive and interact with AI systems—she's not developing or improving any AI technology.
✗ STRAC DOES apply:
Dr. C is developing new machine learning algorithms that improve the accuracy of medical image analysis. Even though the application is healthcare, she's advancing AI/ML capabilities—STRAC applies.
✗ STRAC DOES apply:
Dr. D is a materials scientist developing new battery technologies for electric vehicles. His research advances energy storage capabilities, which falls under advanced materials/manufacturing—STRAC applies.
⚠ Grey area—consult us:
Dr. E is adapting an existing open-source AI model for a specific forestry application. She's fine-tuning parameters but not fundamentally changing the model architecture. This is the kind of case where the line between "using" and "advancing" gets blurry—contact us.
When in doubt, assume it applies. Contact us for a consultation before submitting your grant application. Getting it wrong means your application gets denied—or worse, your funding gets clawed back later.
Ontario is taking its own steps to protect the security of provincially funded research. The Ministry of Colleges and Universities has released the Research Security Guidelines for Ontario Research Funding Programs.
These requirements apply to: Ontario Research Fund (ORF), Early Researcher Awards (ERA), and other provincial research funding programs. Failure to comply may result in funding being denied or revoked.
What Ontario Requires
The provincial guidelines go beyond federal requirements in some areas. You must provide:
Disclosure of collaborations with organizations of concern AND involvement with foreign entities: This is broader than STRAC—you need to disclose all foreign entity involvement, not just NRO affiliations.
Common mistakes to avoid:
Incomplete disclosures: Listing only formal affiliations while omitting visiting positions, advisory roles, or consulting relationships.
Vague mitigation plans: "We will be careful" isn't a mitigation plan. Specify concrete measures: access controls, data handling protocols, IP agreements.
Inconsistency between forms: Your checklist and attestation must align. Discrepancies trigger additional scrutiny.
We can help. The Mitigating Risk Checklist and Attestation Form can be confusing. Contact us before you submit—we'll review your forms, identify gaps, and help you write effective risk mitigation language that satisfies provincial requirements.
Risk Mitigation
Risk mitigation isn't bureaucratic box-checking—it's the systematic process of identifying threats to your research and implementing practical measures to address them. Strong risk mitigation protects your work, maintains your funding eligibility, and demonstrates due diligence to granting agencies and collaborators.
Why it matters: The federal granting agencies assess your risk mitigation plan when evaluating funding applications involving private-sector partnerships or sensitive technology research. Weak or absent mitigation strategies can result in rejected applications, withheld funding, or terminated grants. High-risk partnerships with insufficient mitigation simply won't get funded.
How to Build Effective Risk Mitigation
Start early: Conduct risk assessments at the beginning of partnership discussions.
Conduct open-source due diligence: Research your potential partners thoroughly. Check their institutional affiliations, funding sources, and ownership structures.
Validate through direct consultation: Talk directly with your potential partners about their affiliations and motivations.
Assess alignment: Determine whether your partner's motivations align with yours.
Key Areas Your Risk Mitigation Plan Should Address
Research team composition: Build a team with appropriate security clearances, institutional affiliations, and awareness of research security requirements. All named researchers must comply with STRAC Policy requirements if working in sensitive technology areas.
Cybersecurity and data management: Implement robust protections for research data, methodologies, and intellectual property. This includes secure storage, access controls, encryption, and protocols for data sharing with partners (a minimal illustration follows this list).
Agreement on research outcomes: Establish clear written agreements about intellectual property ownership, publication rights, and intended use of research findings. Define what happens if security concerns arise mid-project.
Physical security: Control access to research facilities, labs, and equipment. Implement sign-in procedures for visitors and ensure only authorized personnel have access to sensitive areas.
Monitoring and Reporting: Establish mechanisms to detect security incidents and report them promptly.
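To make the cybersecurity and data management item above more concrete, here is a minimal sketch of one such measure: encrypting a dataset before it is shared with a partner or moved off a secure server. It assumes Python and the widely used cryptography package; the file name is a hypothetical placeholder, and a real plan would also specify key storage and access control.

```python
# Minimal sketch: encrypt a research data file before sharing or transport.
# Assumes the "cryptography" package is installed (pip install cryptography).
# The file name below is a hypothetical placeholder.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_file(path: Path, key: bytes) -> Path:
    """Encrypt the file at `path` and write the ciphertext to `<name>.enc`."""
    ciphertext = Fernet(key).encrypt(path.read_bytes())
    out = path.with_suffix(path.suffix + ".enc")
    out.write_bytes(ciphertext)
    return out

def decrypt_file(path: Path, key: bytes) -> bytes:
    """Return the decrypted contents of a previously encrypted file."""
    return Fernet(key).decrypt(path.read_bytes())

if __name__ == "__main__":
    key = Fernet.generate_key()  # store the key separately from the data itself
    encrypted = encrypt_file(Path("partner_dataset.csv"), key)
    print(f"Encrypted copy written to {encrypted}")
```

Encryption on its own is not a mitigation plan; it addresses a single line item. The point is that each measure in your plan should be this specific about what is protected, how, and by whom.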
Implementation and Compliance
Once your mitigation plan is approved, you must implement it. This isn't optional. Your award agreement stipulates that you'll follow the mitigation measures you identified, and you must maintain them until you submit your final financial report.
Monitor your mitigation plan continuously. If circumstances change—new partners join, research scope shifts, or risks evolve—you must immediately update your risk assessment and notify the granting agency. Changes that increase national security risk require submitting a new RAF before proceeding.
The Reality
Strong risk mitigation enables research partnerships that might otherwise be too risky to pursue. It demonstrates professionalism, protects Canadian interests, and reassures international collaborators that you operate in a secure environment. Weak mitigation, conversely, jeopardizes funding, damages your professional reputation, and puts your research at risk.
Get help. Lakehead's research office can support you in developing risk assessments and mitigation strategies. Don't wait until you're facing a funding deadline—reach out when you first identify a potential partnership that might raise security concerns.
Travel Security
International travel for conferences, collaborations, and fieldwork exposes researchers to unique security risks. Foreign governments may target your devices, monitor communications, or attempt to acquire sensitive research information. Proper preparation before, during, and after travel is essential.
Key principles: Use clean devices when travelling to high-risk destinations. Assume any device searched at a border is compromised. Change all passwords when you return. Register with the Government of Canada before you leave.
Our dedicated travel security page includes destination-specific guidance, device preparation checklists, and information on booking a pre-travel consultation.
Cybersecurity
Your research data, methodologies, and intellectual property are valuable targets. Cybersecurity isn't just IT's problem—it's a core component of research security that protects your work from theft, manipulation, and unauthorized access.
Key principles: Use strong, unique passwords with multi-factor authentication. Be cautious with AI tools—some collect your data for training. Avoid third-party platforms for research websites. Protect survey data from bot contamination.
Security Alert: Do NOT use DeepSeek. Canadian security agencies have identified significant privacy and security risks, including potential foreign government access to user data. Use Lakehead's Google Gemini instead—your data is not used for AI training.
Our dedicated cybersecurity page covers AI usage guidelines, secure data storage, research website best practices, survey security, and more.
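As one concrete illustration of the survey-security principle above, the following sketch flags likely bot responses in a survey export. It assumes Python with pandas; the column names (duration_seconds, ip_address) and the thresholds are hypothetical and would need to match your survey platform's export format.

```python
# Minimal sketch: screen a survey export for likely bot contamination.
# Column names and thresholds are hypothetical; adapt to your platform's export.
import pandas as pd

def flag_suspect_responses(csv_path: str) -> pd.DataFrame:
    df = pd.read_csv(csv_path)
    too_fast = df["duration_seconds"] < 60                      # implausibly quick completions
    repeat_ip = df.duplicated(subset="ip_address", keep=False)  # many responses from one address
    df["suspect"] = too_fast | repeat_ip
    return df

if __name__ == "__main__":
    results = flag_suspect_responses("survey_export.csv")
    print(results["suspect"].value_counts())
```

Flagged rows still need human review; the goal is a documented, repeatable screening step, not automatic deletion of responses.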
Frequently Asked Questions
Does STRAC apply if I only use AI tools in my research?
It depends on how you use them. If you're using commercially available AI tools (like ChatGPT, statistical packages, or image analysis software) as instruments without modifying them, STRAC probably doesn't apply. But if you're developing new algorithms, fine-tuning models, or creating AI capabilities that didn't exist before, STRAC likely does apply. The key question: Is your research advancing AI capabilities, or just applying existing ones? When in doubt, contact us.
Can I still collaborate with researchers at foreign institutions that aren't on the NRO list?
For STRAC purposes, yes—if they're not on the NRO list, you can include them. However, you should still conduct due diligence. The NRO list focuses on institutions with direct military/security ties, but other concerns may exist. Check if your collaborator's institution has undisclosed government connections, receives military funding, or has other risk indicators. Also remember the NRO list is updated periodically—an institution not on today's list could be added later.
What counts as an "affiliation"?
Affiliations include formal and informal connections. This covers: employment (full-time, part-time, or contract), adjunct or visiting positions, advisory roles, consulting arrangements, honorary titles, research collaborations with formal agreements, and receiving funding or resources from the organization. Informal networking or occasional conference interactions typically don't count. If you're unsure whether a relationship constitutes an "affiliation," it's safer to disclose it and let the agency determine its significance.
Do past affiliations with a Named Research Organization disqualify me?
No—STRAC only applies to current affiliations. Past affiliations don't trigger STRAC requirements. However, you should still be prepared to explain your history if asked, and ensure you've formally severed any previous connections (resigned positions, ended contracts, etc.). If you're still receiving benefits, maintaining honorary titles, or have ongoing obligations from a past affiliation, it may still be considered "current."
What happens if I get the attestation wrong or fail to disclose an affiliation?
Consequences range from application denial to funding clawback. If you incorrectly attest that your research doesn't advance a STRA, or fail to disclose an NRO affiliation, your application will be denied. If the error is discovered after funding is awarded, you may have to return the money. Repeated or intentional non-compliance could affect your eligibility for future funding. The agencies take this seriously—but honest mistakes made in good faith, especially when you sought guidance, are treated differently than deliberate misrepresentation.
Do STRAC requirements apply to graduate students and postdocs on my team?
Yes, if they're named on the application. All named researchers on a grant application must comply with STRAC requirements. This includes graduate students, postdocs, and research staff who are listed as team members. Students not formally named on the grant but working on the project should still be made aware of research security requirements and any restrictions that apply to the funded work.
How often are the NRO and STRA lists updated?
The lists haven't been updated since they launched, but always check that you're using the current version. The government has indicated it may update them as new information becomes available, and an institution not on today's list could be added without warning. Always verify you're referencing the most recent version of both lists when preparing a grant application or conducting due diligence; don't rely on an old copy you downloaded months ago.
Glossary of Terms
Affiliation
A formal or informal connection to an organization, including employment, visiting positions, advisory roles, consulting arrangements, honorary titles, or receiving funding/resources. Casual networking doesn't typically count.
Dual-Use Research
Research that has legitimate civilian applications but could also be misused for military purposes, weapons development, or other harmful ends. Many sensitive technologies are dual-use by nature.
Named Research Organization (NRO)
A foreign institution identified by the Government of Canada as having connections to military, defence, or state security entities that pose national security risks. Researchers with NRO affiliations cannot receive federal funding for sensitive technology research.
NSGRP (National Security Guidelines for Research Partnerships)
Federal guidelines requiring researchers to assess and mitigate national security risks in research partnerships, particularly those involving private-sector collaborators or foreign entities.
RAF (Risk Assessment Form)
A standardized form used to identify and evaluate potential national security risks in research partnerships. Required for grants involving private-sector partners or other risk factors.
Sensitive Technology Research Area (STRA)
One of 11 categories of advanced technologies identified by the Government of Canada as having national security implications. Includes AI/ML, quantum science, advanced materials, biotechnology, and others. Research that advances (not just uses) these technologies triggers STRAC requirements.
STRAC (Policy on Sensitive Technology Research and Affiliations of Concern)
The federal policy (effective May 2024) that prohibits researchers with NRO affiliations from receiving federal funding for research that advances sensitive technologies. Requires attestation from all named researchers on applicable grants.
Tri-Agency
The three federal research funding agencies: CIHR (health research), NSERC (natural sciences and engineering), and SSHRC (social sciences and humanities). CFI (infrastructure) often aligns with Tri-Agency policies.
Training Resources
Lakehead University Training
The Office of Research Services holds workshops and events around research security throughout the year. Subscribe to the Research & Innovation weekly bulletin for announcements.
Canada's Export and Brokering Controls | Global Affairs Canada
Increases knowledge about Canada's export controls regime, including what is controlled and why. Explains how research institutions may be subject to export controls, demonstrates how to apply for an export permit, and provides resources and contacts.
Learning Objectives:
Increase knowledge about Canada's export controls regime
Understand how research institutions and academia may be subject to export controls
Date: January 19, 2026 | 1:00pm - 2:00pm (EST)
Raising Awareness of Security Risks and Mitigation Tools in the Research Ecosystem | Public Safety Canada
This foundational module raises awareness within Canada's scientific and academic communities about research security issues. It explains the potential for misuse of sensitive research, technology, and materials, along with risk indicators and mitigation tools. Updated annually to reflect the evolving research and national security environment.
Recommended starting point for anyone new to research security.
Learning Objectives:
Learn guidance and tools to strengthen security posture at research institutions
Understand how research security intersects with research activities
Identify and mitigate threats to research security
Pursue and maintain safe research partnerships
Consider dual-use implications of research
Protect valuable research, data, and potentially patentable property
Date: January 21, 2026 | 1:00pm - 2:30pm (EST)
Global Affairs Canada
Overview of Canada's current sanctions measures, best practices for conducting due diligence to verify sanctions compliance, and information on permit applications. Details how sanctions affect Canadian educational institutions, research collaborations, funding opportunities, and engagement with international partners in sanctioned countries.
Learning Objectives:
Gain awareness of Canadian sanctions, exceptions, and permits
Learn about compliance and enforcement
Date: February 5, 2026 | 1:00pm - 2:00pm (EST)
Public Safety Canada
Provides an overview of open-source due diligence techniques for evaluating risks related to potential research partners. Enhances researchers' ability to gather, analyze, and interpret relevant information using open-source methods to ensure research security and integrity.
Learning Objectives:
Find relevant information using open-source methods
Frame open-source information to analyze and make security-conscious decisions
Protect researchers during all stages of research
Date: February 19, 2026 | 1:00pm - 2:00pm (EST)
Protecting Your Research While Travelling Abroad | Public Safety Canada
Provides a global overview of the threat environment when travelling. Summarizes techniques used by foreign governments to acquire research, and provides best practices to follow before, during, and after travel.
Learning Objectives:
Raise awareness of travel risks
Enable researchers to make risk-informed decisions to protect themselves and their research
Date: March 11, 2026 | 1:00pm - 2:00pm (EDT)
Immigration, Refugees and Citizenship Canada
Provides insight on the immigration process for international students and explains how prospective applicants are security screened for admissibility. Covers immigration forms, supporting documents, and the roles and responsibilities of IRCC and its security screening partners. Includes case studies to demonstrate the process.
Learning Objectives:
Learn about the Canadian process of obtaining a study permit for international students
Review recent case studies
Date: March 19, 2026 | 1:00pm - 2:00pm (EDT)
Know Your Research – Know Your Partners – Assess the Risk | Public Safety Canada
Elaborates on dual-use technologies and research with specific examples that highlight their complex nature and how to recognize their sensitivities. Whether working in STEM, social sciences, or humanities, this module enhances understanding of dual-use research and provides tools for due diligence and risk evaluation.
Learning Objectives:
Raise awareness of risks from Sensitive Technology Research Areas and dual-use technologies
Illustrate the dual-use nature of specific research areas across multiple domains
Enable participants to better recognize how their work may be sensitive or dual-use
Highlight the importance of due diligence and safeguarding intangible information
Balance and reinforce the importance of collaboration in open science
Date: March 24, 2026 | 1:00pm - 2:00pm (EDT)
The Strategic Relevance of Social Sciences and Humanities in Research Security | Public Safety Canada
Highlights the strategic role of social sciences and humanities in analyzing and managing complex security challenges. Explores how foreign interference, surveillance, and manipulation strategies can compromise Canada's research ecosystem and affect researchers. Uses case studies to illustrate synergies between STEM and social sciences, dual-use research concerns, and vulnerabilities specific to academic environments.
Learning Objectives:
Increase awareness and provide practical tools to identify and address research-related risks in social sciences
Support informed decision-making in assessing social sciences and humanities grants through a research security lens
Date: To be announced
We encourage you to take all training modules!
Need Help? Contact Our Specialist
For personalized assistance with STRAC/NSGRP compliance, risk assessments, visitor screening, travel consultations, or any other research security questions, please reach out.