News / Blog

What’s the goal?

It is that time of year again, when EdTech claims are made in abundance in marketing materials and on the speaker stages at the Bett Show.

Will your product make overhyped claims and win Bett Show awards?

Will your spokesperson suggest your product is unlawful?

Will a would-be whistleblower share concerns with Defend Digital Me staff about the security of their company's product and what its staff can access?

Claims made by edTech award winners have been investigated by the Information Commissioner's Office and found to be misleading, such as touting AI in a product as a 'unique selling point' while not using any. It might surprise you that the CEO of a leading UK safetyTech company told a 2021 trade show panel that their lawyers had warned them their product was unlawful and they could go to jail for what they were doing. Or that trade show staff had concerns about the safety of the content their own company's staff could access about children.

These real-life examples from past trade shows demand attention across the edTech world, because products can do harm through their effects on teaching and learning, and on a child's experience of education and development. New EU AI law will soon outlaw some current practices in the education sector that use biometrics or make claims about emotion and mood identification. For businesses, it matters what you and your competitors claim. If false technology claims made by another company resemble your product, and those harms become known, you may well find yourself tarnished by association.

By attending the Bett Show we meet great people, often the same ones year on year, and we learn something new on every visit. In particular, we see who is showcasing which new and emerging technologies. Trials of highly intrusive products are often run in schools, in live settings at small scale, but can be ad hoc, without ethical oversight or routine governance. There is no tracking of what gets started, does harm, and never makes it to the mass market; nor is there a route to identifying what works at scale and why, or lessons learned for others. We ask questions, talk to booth staff about their experiences and opinions, and build a grasp of what is going on across the sector and for different age groups. But too often these technologies operate opaquely, are unaccountable for how they work, and are hard for regular school staff to scrutinise. There are no consistent health and safety standards for edTech in England, and adoption has no oversight and no requirement for proven efficacy or beneficial effect.

Overhype and inflated claims might make for good marketing, but if a product fails to live up to teachers' or students' real-world experience, it will not last. Unsafe and unlawful practices come to our attention at Defend Digital Me faster than to most. Sometimes we have very productive conversations with CEOs who listen, agree, and take steps to make their products safe. Some have even decided to redesign or change a product entirely. A few stick their heads in the sand and hope the problems will vanish by magic, much like some of their AI claims.

Which will you choose to be?

What’s the goal of your edTech, and whose interests does it serve? This year we’ll be awarding Red and Yellow cards as warnings or serious challenges to dubious claims and unlawful practices. See you there!