Human-centric photo dataset aims to help spot AI biases responsibly
techxplore.com·16h

Annotations about the image subjects, instrument and environment are available for all images in FHIBE. Credit: Nature (2025). DOI: 10.1038/s41586-025-09716-2

A database of more than 10,000 human images to evaluate biases in artificial intelligence (AI) models for human-centric computer vision is presented in Nature this week. The Fair Human-Centric Image Benchmark (FHIBE), developed by Sony AI, is an ethically sourced, consent-based d…
