This is a situation we’re seeing more frequently, and it reflects how complex recruitment processes have become, where technology, legal requirements, and human interaction all intersect.
Participating in an interview, whether as a candidate or a recruiter, involves a level of commitment to information-sharing and mutual trust. It’s understandable that clients and candidates want to protect their data, especially in regions like the EU where privacy standards are particularly stringent. But transparency and preparation go a long way. If you clearly outline the purpose of note-taking or recordings beforehand and explain how that data is handled in line with GDPR, many concerns can be addressed.
That said, not everyone will be comfortable with recording, and some may decline outright. When that happens, it’s important to assess the reason. Is it a legitimate concern about data misuse? Or is the candidate struggling to adapt to the structure and pressure of the interview format?
I’ve seen both. Sometimes, refusal to engage with the setup, such as declining any form of documentation, can be a red flag in roles that require trust and composure under pressure. But in many cases, people simply want reassurance. That’s why how you manage the process matters. Explain the purpose, offer options, and create space for questions.
Now, with the EU AI Act entering into force, this conversation takes on even more weight. Under the Act, AI systems used in employment, such as tools for screening, scoring, or evaluating candidates, are considered "high-risk". This means that companies using these tools must comply with strict requirements around transparency, data governance, human oversight, and risk management.
Even more notably, tools that interpret human emotions based on facial expressions, voice, or body language will be prohibited in workplaces and educational settings. These systems have been criticized for lacking scientific reliability and for introducing discriminatory risks, so their use in interviews, even for soft-skill assessment or engagement monitoring, is effectively being banned or heavily restricted in many scenarios.
Given this, it’s imperative that recruiters reassess which tools they’re using and ensure their methods comply not only with client expectations, but also with evolving legal frameworks. In short: if you’re relying on AI tools to analyze video or voice, or even to auto-generate summaries, your organisation may need to treat these as high-risk systems and meet very specific obligations.
In settings where recordings or AI-based transcription tools are off-limits, strong manual note-taking becomes essential. Use structured templates and predefined categories to streamline the process during the interview. Immediately afterwards, take 5–10 minutes to clean up your notes while the conversation is still fresh. If a candidate gives a complex or critical answer, don’t hesitate to ask for clarification in the moment; this not only helps with accuracy, but also shows attentiveness and professionalism.
Some recruiters also use collaborative feedback forms or rating sheets within their teams, kept free of personal data, to help reduce individual bias and keep evaluations aligned.
In the end, it’s about balancing privacy with precision. You can run a clean, compliant, and insightful interview process without the most sophisticated tools, as long as you’re thoughtful, consistent, and well-prepared.