Department for Transport shows how its AI system avoids bias | Computer Weekly
Briefly

"Designing systems that align with human preferences, have an appropriate level of human oversight, and have a robust performance evaluation framework is more complex than simply using LLMs for thematic analysis."
"An LLM may perform worse on responses that are written in poor English or use socio-culturally specific language such as verbosity or slang, highlighting the importance of addressing demographic bias."
"Investing scarce human resources into assuring the accuracy and quality of the theme generation step is particularly important, given that citizens self-select to participate in public consultations."
"Our approach formally integrates human oversight in the theme review step and at the analysis and report-writing stage, ensuring that potential AI errors or misinterpretations are identified."
The Consultation Analysis Tool (CAT) was created by the UK Department for Transport in collaboration with Google Cloud and the Alan Turing Institute to analyse citizen feedback from public consultations. The tool employs large language models for thematic analysis of free-text submissions. While LLMs make thematic analysis easier to run at scale, ensuring alignment with human preferences and robust performance evaluation remains complex. The project emphasises human oversight to mitigate demographic bias and assure accuracy in theme generation, integrating human judgement at the theme review step and again at the analysis and report-writing stage.
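The human-in-the-loop pattern described above can be sketched in miniature. This is not the CAT's actual code or API; the function names, the trivial keyword grouping standing in for an LLM, and the approval mechanism are all illustrative assumptions, showing only where a human review gate sits between theme generation and reporting.

```python
# Minimal human-in-the-loop thematic analysis sketch.
# All names here are illustrative, not the DfT tool's real interface.
from dataclasses import dataclass, field


@dataclass
class Theme:
    label: str
    responses: list = field(default_factory=list)
    approved: bool = False  # set by a human reviewer, never by the model


def generate_themes(responses):
    """Stand-in for the LLM theme-generation step: a trivial keyword
    grouping so the pipeline runs without a model."""
    themes = {}
    for r in responses:
        key = "cycling" if "cycle" in r.lower() else "other"
        themes.setdefault(key, Theme(label=key)).responses.append(r)
    return list(themes.values())


def human_review(themes, approved_labels):
    """Theme review gate: only themes a human approves proceed to the
    analysis and report-writing stage."""
    for t in themes:
        t.approved = t.label in approved_labels
    return [t for t in themes if t.approved]


if __name__ == "__main__":
    submissions = [
        "More cycle lanes please",
        "Cycle parking at stations",
        "Lower bus fares",
    ]
    themes = generate_themes(submissions)
    # A human inspects the generated themes and approves a subset.
    approved = human_review(themes, approved_labels={"cycling"})
    for t in approved:
        print(t.label, len(t.responses))  # prints: cycling 2
```

The point of the structure is that the model's output is treated as a proposal: nothing reaches the report until a person has signed it off, which is where AI errors or misinterpretations get caught.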
Read at ComputerWeekly.com