Zoey and Bia discuss some of the mistakes that Ofqual made in its algorithm, how using “complicated” maths is not necessarily better, and share anecdotes of their own experiences with teachers and (un)conscious bias.
00:20 – Introduction
01:54 – Initial thoughts
02:42 – Mistake #1 – Their approach
04:43 – Mistake #2 – Data leakage
05:15 – Mistake #3 – Emphasis on the rank
06:57 – Mistake #4 – Ignoring outliers
08:31 – Mistake #5 – No peer review
09:16 – Mistake #6 – Too precise
11:14 – Mistake #7 – Disregarded unconscious bias
12:53 – Mistake #8 – The education system in the UK
13:30 – Ofqual considered edge cases (almost a positive thing!)
15:00 – How we might have handled this situation
17:39 – Another example of algorithmic bias – the accounting system the Post Office used
18:53 – Challenge: “Prison Break”. This is based on the “Liar’s paradox”, attributed to Epimenides (amongst many other philosophers). For more challenges, presented in a more visual manner, check out our Instagram.
25:52 – Anecdotes of experiencing bias from teachers

Ofqual’s report
Bristol University's study on unconscious bias – http://www.bris.ac.uk/media-library/sites/cmpo/migrated/documents/wp221.pdf
Tom SF Haines’ post (Lecturer in Machine Learning at Bath University) – http://thaines.com/post/alevels2020