Miguel Angel Lozano

Associate Scientist


Miguel Angel Lozano holds a Degree (2001) and a PhD (2008) in Computer Engineering from the University of Alicante; his PhD thesis was entitled “Kernelized Graph Matching and Clustering”. In 2002 he obtained an FPI research grant, and since 2004 he has been a lecturer at the Department of Computer Science and Artificial Intelligence of the University of Alicante. He has conducted research stays at the Computer Vision & Pattern Recognition Lab of the University of York and at the Bioinformatics Lab of the University of Helsinki. From 2002 to 2010 he carried out his research within the Robot Vision Group (RVG), and in 2010 he joined the Mobile Vision Research Lab (MVRLab). He is currently the head of MVRLab, a group focused on pattern recognition and computer vision on mobile devices, the director of the Master’s Degree in Software Development for Mobile Devices, and the Coordinator for Quality Assurance and Educational Innovation at the Polytechnic School. His research interests include pattern recognition, graph matching and clustering, and computer vision. Within MVRLab and in collaboration with Neosistec, he has developed several applications aimed at assisting blind people: Aerial Obstacle Detection (AOD), SuperVision, and NaviLens. AOD and NaviLens won the “Application Mobile for Good” award at the 7th and 11th Vodafone Foundation Awards (2014 and 2017, respectively).

In 2020, he joined the Data Science against COVID-19 task force of the Valencian Government, a team that participated in and won first prize in the XPRIZE Pandemic Response Challenge. He also tutors two ELLIS PhD students at ELLIS Alicante.

Website: https://sites.google.com/site/malozanohomepage/

Publications in association with ELLIS Alicante

2024

11/27
Gulati, A., Martinez-Garcia, M., Fernandez, D., Lozano, M. A., Lepri, B., & Oliver, N. (2024). What is beautiful is still good: the attractiveness halo effect in the era of beauty filters. Royal Society Open Science, 11(11).
Top 5% of research outputs scored by Altmetric
07/18
Gulati, A., Martinez-Garcia, M., Fernandez, D., Lozano, M. A., Lepri, B., & Oliver, N. (2024). What is Beautiful is Still Good: The Attractiveness Halo Effect in the Era of Beauty Filters. International Conference on Computational Social Science; International Conference on Thinking.

2023

12/01
Begga, A., Garibo I Orts, O., De Maria Garcia, S., Escolano, F., Lozano, M. A., Oliver, N., & Conejero, J. A. (2023). Predicting COVID-19 pandemic waves including vaccination data with deep learning. Frontiers in Public Health, 11, 1279364.
11/29
Németh, G. D., Lozano, M. A., Quadrianto, N., & Oliver, N. (2023). Addressing Membership Inference Attack in Federated Learning with Model Compression. arXiv preprint arXiv:2311.17750.
07/09
Gulati, A., Martinez-Garcia, M., Lozano, M. A., Lepri, B., & Oliver, N. (2023). The Beauty Survey - Study Registration.

2022

12/21
Németh, G. D., Lozano, M. A., Quadrianto, N., & Oliver, N. (2022). A Snapshot of the Frontiers of Client Selection in Federated Learning. Transactions on Machine Learning Research.
09/27
Gulati, A., Lozano, M. A., Lepri, B., & Oliver, N. (2022). BIASeD: Bringing Irrationality into Automated System Design. Thinking Fast and Slow and Other Cognitive Theories in AI, AAAI Fall Symposium 2022.
07/22
Lozano, M. A., Garibo, O., Piñol, E., Rebollo, M., Polotskaya, K., Garcia-March, M. A., Conejero, J. A., Escolano, F., & Oliver, N. (2022). Open Data Science to fight COVID-19: Winning the 500k XPRIZE Pandemic Response Challenge. International Joint Conference on Artificial Intelligence (IJCAI), 5304-5308.
Presented in the Best Papers from Sister Conferences track

2021

09/17
Lozano, M. A., Garibo, O., Piñol, E., Rebollo, M., Polotskaya, K., Garcia-March, M. A., Conejero, J. A., Escolano, F., & Oliver, N. (2021). Open Data Science to fight COVID-19: Winning the 500k XPRIZE Pandemic Response Challenge. Joint European Conference on Machine Learning and Knowledge Discovery in Databases, 384-399.
Best Paper Award