Iftikhar Umrani. Image: Patrick Browne
This SETU researcher traces the fine line between order and error in drone systems.
Iftikhar Umrani sees the rise in public engagement in science over the last few years as “both a gift and a responsibility”.
“Today, when I talk to people about drones, they ask thoughtful questions about privacy, safety and decision making. They want to understand how these machines fit into their lives. I find that encouraging. It means people care not just about what technology can do but about whether it can be trusted.”
Umrani has a bachelor’s degree in telecommunications engineering from Mehran University of Engineering and Technology and a master’s degree in wireless systems from the National University of Sciences and Technology in Pakistan.
He is currently working towards a PhD in the Walton Institute at South East Technological University (SETU).
Under the supervision of Dr Bernard Butler (SETU), and with the guidance of Dr Aisling O’Driscoll (University College Cork) and Dr Steven Davy (Technological University Dublin), he is studying how drones behave when their navigation or communication is disturbed, and how to help them recognise when something is wrong.
His work has always revolved around “how machines communicate and how that communication can be trusted”, he tells SiliconRepublic.com.
Tell us about your current research.
The project began with a simple question: what happens when a drone receives false information but believes it to be true? From that question grew a whole series of experiments.
Our small team builds simulated flights in which we fine-tune the signals guiding a drone to explore how it maintains stability. We work entirely within controlled virtual environments, never on real aircraft: aviation regulators prohibit launching security attacks on real unmanned aerial vehicles (UAVs), or drones, to use the more common term. Sometimes we change a drone’s GPS coordinates; other times we modify the messages between the UAV and its controller. We then study how the drone’s internal systems react, how confusion spreads through its sensors, and what clues might reveal that something is not right.
Working through scenarios like these, we can investigate UAV security from both the attacker’s and the defender’s perspective.
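The kind of check Umrani describes can be sketched in a few lines. The example below is purely illustrative, not the project’s code: it cross-checks a reported GPS fix against a position dead-reckoned from the drone’s last trusted fix and velocity, and the function names and the drift threshold are hypothetical.

```python
# Illustrative sketch: flag a GPS fix that disagrees with where the
# drone's own motion says it should be. Names and thresholds are
# hypothetical, not taken from the UAVSec project.

def dead_reckon(last_pos, velocity, dt):
    """Predict the next position from the last trusted fix and velocity."""
    return (last_pos[0] + velocity[0] * dt, last_pos[1] + velocity[1] * dt)

def gps_plausible(reported, predicted, max_drift=5.0):
    """Accept a fix only if it lies within max_drift metres of the prediction."""
    dx = reported[0] - predicted[0]
    dy = reported[1] - predicted[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_drift

# A drone cruising east at 10 m/s; an attacker injects a fix 40 m off course.
predicted = dead_reckon((0.0, 0.0), (10.0, 0.0), dt=1.0)
print(gps_plausible((10.0, 0.0), predicted))   # prints True: genuine fix
print(gps_plausible((10.0, 40.0), predicted))  # prints False: spoofed fix
```

Real flight controllers fuse many sensors rather than one pair, but the principle is the same: a spoofed signal betrays itself by disagreeing with the rest of the drone’s picture of the world.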
The guiding principle is that understanding failure is the most robust form of engineering. I think that captures our approach well. Each test teaches us not only how attacks occur but also how drones behave when everything is fine. That contrast allows us to design systems that can tell the difference.
Our goal does not stop at detection. We are also working toward designing intelligent and secure operating schemes that can respond to anomalous behaviour in real time. This means creating drones that do not simply notice when something is wrong but can adapt their actions and maintain safe operation until human control or corrective systems intervene. The work forms part of the UAVSec project at the Walton Institute in SETU, funded by the Connect Research Centre for Future Networks, which is in turn funded by Research Ireland.
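One simple way to picture this detect-and-respond idea is as a small state machine: the drone tolerates an isolated odd reading, but after several consecutive anomalies it drops into a failsafe mode (hovering in place, say) until an operator takes over. The sketch below is a hypothetical illustration of that pattern, not the project’s actual scheme.

```python
# Hypothetical detect-and-respond loop: repeated anomalous readings
# push the drone into a failsafe state that only a human can clear.

NORMAL, SUSPECT, FAILSAFE = "NORMAL", "SUSPECT", "FAILSAFE"

def next_state(state, anomaly, strikes, limit=3):
    """Advance the safety state machine; returns (new state, strike count)."""
    if state == FAILSAFE:
        return FAILSAFE, strikes          # only human intervention resets this
    if not anomaly:
        return NORMAL, 0                  # a clean reading clears suspicion
    strikes += 1
    return (FAILSAFE, strikes) if strikes >= limit else (SUSPECT, strikes)

state, strikes = NORMAL, 0
for anomaly in [False, True, True, True, False]:
    state, strikes = next_state(state, anomaly, strikes)
print(state)  # prints FAILSAFE: three consecutive anomalies triggered it
```

Requiring several strikes before reacting is one way to balance the two failure modes the research weighs against each other: overreacting to ordinary sensor noise versus trusting a corrupted signal for too long.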
In your opinion, why is your research important?
Drones are moving from research projects to everyday tools. They inspect bridges, deliver medicines and help firefighters see through smoke. As their numbers grow, so does the need to keep them safe and dependable. A single error in a navigation signal can send a drone off course or make it unresponsive at a critical moment. For example, the Australian Transport Safety Bureau reported an incident at a drone swarm show in which hundreds of drones went out of control.
Our research matters because it looks at those quiet, unseen moments before a failure. If a drone can sense that its contextual data no longer makes sense, and that it is drifting into an unstable state of operation, it can protect itself, its surroundings and the people who depend on it. In that sense, the work is not about technology alone; it is about trust, safety and the confidence that these machines will behave as we expect.
Imagine a drone flying through town to deliver a hot cup of coffee, only to misread a signal and drop it (from a height) mid-flight. It sounds amusing, but behind that small spill could lie a serious question of how much we trust autonomous systems to understand when something feels off.
What inspired you to become a researcher?
My interest in research began during my student years, when resources were limited but curiosity was not. I remember running small experiments on network behaviour using the simplest tools I could find. I would simulate traffic, introduce small delays or errors, and watch how systems responded. Seeing how easily a small disturbance could cause confusion made me realise that even simple networks needed protection and resilience.
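Those early experiments he recalls, introducing small delays into simulated traffic and watching the confusion spread, can be captured in miniature. In this illustrative sketch (every name here is invented for the example), a link holds back a couple of packets, and an in-order receiver counts how many arrive out of sequence.

```python
# Toy version of the student experiment described above: delay a few
# packets on a simulated link and measure the resulting disorder.

def send_through_link(packets, delayed):
    """Deliver packets, holding back the delayed ones until the end."""
    on_time = [p for p in packets if p not in delayed]
    return on_time + [p for p in packets if p in delayed]

def count_out_of_order(arrivals):
    """Count packets arriving with a lower sequence number than one already seen."""
    highest, reordered = -1, 0
    for seq in arrivals:
        if seq < highest:
            reordered += 1
        highest = max(highest, seq)
    return reordered

packets = list(range(10))
arrivals = send_through_link(packets, delayed={2, 3})
print(count_out_of_order(arrivals))  # prints 2: packets 2 and 3 arrive late
```

Even a disturbance this small forces the receiver to cope with reordering, which is exactly the kind of fragility that, scaled up, motivates the resilience work he describes.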
That early sense of discovery stayed with me. Research, to me, is about tracing the fine line between order and error and learning how to keep systems steady. It is a slow and patient process, built on curiosity, small observations and the hope that each question leads to a safer, clearer understanding of how our connected world really works.
What are some of the biggest challenges or misconceptions you face as a researcher in your field?
Many people think that drones are clever on their own, that once the code is written, they simply work. The reality is far less tidy. A drone’s world is noisy: buildings reflect signals, weather interferes with sensors, and unexpected data can appear from sensors and controllers at any time.
Another challenge is that security is invisible when it works. If a flight goes smoothly, nobody notices the systems that kept it that way. This can make it hard to convince others of its importance until something fails. Our team spends a lot of time explaining that prevention is as valuable as innovation. My supervisor has said that technological progress is quiet when it is done right, and that has become something of a guiding principle.