TechRadar Pro recently attended Infosecurity Europe 2025, and we asked some of the experts at the event an important question: who has the upper hand, cybersecurity teams or hackers?
Security teams are defensive by nature, so it's not surprising that some security experts feel they're playing catch-up.
Criminals may have the lead, but security teams have plenty of confidence in their new tools and strategies – so what did the experts say?
An uphill battle?
New proactive tools could turn the tide for security teams, but few believe they will be enough on their own.
Semperis Principal Technologist EMEA Guido Grillenmeier was the most clear-cut on the question: "The bad guys. They have always had it, and they will continue to have it."
However, most experts seem to agree that defenders are on the back foot. Richard McCainley of Sonatype puts it bluntly: "We're always playing catch-up."
"Because they're inventing new approaches, there's always a reactive element. We're obviously trying to get ahead of the trends, but we still have to respond to them. In effect, the industry is a step behind the criminals."
But this may be a good thing, says Adam Matthews, senior solutions specialist at Okta, who argues that an 'underdog' mentality works in the security team's favor. "I think criminals always have the upper hand. If you feel like that, then you'll be in a good place. If you think you're good, you're in trouble."
Ian Hagson of ZeroFox points to the new tools in the hands of cybercriminals as only making things worse. "I would say right now the criminals (have the advantage), because AI is empowering people who are not technically minded to easily become threat actors."
His colleague Fiona Lao added that the criminals are "always one step ahead", which means "security companies have to pivot" to ensure they can keep up with the threat.
A turning tide
But it's not just the criminals with shiny new tech – AI tools are being put to use on both sides of the security divide.
"I think cybersecurity is always a kind of race – a race to keep up – and I think sometimes the criminals are a step ahead."
"But I think that with the use of AI models, we can start to predict a little more easily what the next attack vector to be exploited might be. And so I think it's a very logical time for us to anticipate many of the methods that they (the hackers) will deploy."
"Overall, the defenders are doing a good job," says Darktrace Director Dr. Oakley Cox, who argues that growing government and public-sector engagement, along with new and stronger regulations, is helping to shift the balance toward security teams.
"There are regulations, governments are catching up, companies are improving. The problem is that attackers only need to get it right once, so there always needs to be a mindset of 'we can always do better' – but I think we're in a good place overall."
Ethical constraints
Brett Taylor, engineering director for the UK and Ireland at SentinelOne, offers a slightly different perspective, explaining that security teams are bound by different and much stricter ethical responsibilities.
"Their only motivations are money and mission – whether that's intellectual property or disruption," he says. "They don't care, whereas we hold to an ethical ideology and use of technology. So we'll always be on the back foot, but I think we're well positioned."
Taylor adds that when it comes to resources and the ability to influence policy, defenders have the upper hand, as AI requires major infrastructure to train and run – and governments exert control at a national level.
However, once a model is built and released, defenders cannot control who uses it, or the ethics of that use.
"Automation and the use of cloud assets accelerate the things I was talking about (brute-force attacks). We have a fine ethical line to walk, whereas I don't think the attackers care."
So, overall, we found that most security experts are cautiously positive about the future – and at Infosecurity Europe 2025 we saw plenty of tools designed specifically to help security teams get ahead of their adversaries, so it doesn't look like they'll be left behind anytime soon.