The arrival of generative AI tools on the ‘consumer market’ has taken a debate that was until now confined to a small group of experts and put it before a much larger community. The domain of AI and data is finally crossing the audience barrier, moving from a community of thousands of specialists to awareness among millions of educated people.
The current situation reminds me of football fans who position themselves as their teams’ trainers and strategists even though they’ve barely touched a football themselves for decades. People with partial knowledge of AI development over the last 20 or 30 years are taking positions; opinion leaders like Elon Musk are expressing views that often seem inspired more by personal interest than by an honest analysis of the consequences of developing ever more sophisticated tools to help or replace human beings.
AI tools are already supporting innovation and streamlining operations to free up resources for more important tasks, but is there a need to carefully ‘guide’ them in their evolution?
AI is going to have an impact on all human fields: powerful tools will help doctors and lawyers make better decisions, or help software developers write code, accelerating availability in an exponential, self-sustaining process. Processing speed has largely outpaced Moore’s law, doubling every six months rather than every 18, opening the way to a sort of gold rush. Investors and startups are trying to surf the wave of a revolution that many promise will have a long-lasting impact.
The availability of all these tools is, on the one hand, providing everybody with a minimum set of apparent skills – writing, coding, singing and beyond – opening great opportunities to produce higher-quality results. The basic reference level will be raised, and to distinguish yourself you will be obliged to contribute original ideas and ‘sing’ well beyond that basic level.
At the same time, there should be rules that limit the extent to which people can exploit these technologies for personal advantage, distort information, or create ‘infodemic’ effects that pollute the lives of individuals. Information and education are precursors of political evolution and therefore essential to preserving our freedom in democratic societies.
AI should be used to extract value from human intelligence and our critical spirit. For example, there is often as much information in the commentary of skilled readers and free thinkers as there is in the news itself – it should not be difficult to assemble and filter the best of these comments to establish the boundaries of a given subject or event. The problem with ChatGPT is that, since it is trained on human-created information, if humans regress or stop feeding useful information into the system, the output becomes self-referential: it will be based on AI-created content generated from information that no longer evolves. If the system has no filtering criterion and listens only to the majority, it risks a regression.
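The self-referential risk described above can be illustrated with a toy simulation (an illustrative sketch only, not anything proposed in this article): if each new ‘generation’ of a system is trained purely on samples of the previous generation’s output, with popular opinions amplified and no fresh human input, the diversity of what it can say only ever shrinks.

```python
import collections
import random

random.seed(0)

# Toy corpus: each "document" is a single opinion, A–E.
corpus = [random.choice("ABCDE") for _ in range(200)]

def retrain_on_own_output(corpus, generations, sample_size=200):
    """Each generation, the 'model' samples from the current corpus,
    weighted by popularity; the next corpus consists only of the
    model's own output, with no fresh human input. Returns the number
    of distinct opinions surviving after each generation."""
    diversity = []
    for _ in range(generations):
        counts = collections.Counter(corpus)
        tokens, weights = zip(*counts.items())
        # Majority-weighted sampling: popular opinions get amplified.
        corpus = random.choices(tokens, weights=weights, k=sample_size)
        diversity.append(len(set(corpus)))
    return diversity

print(retrain_on_own_output(corpus, generations=100))
```

Because each generation can only resample opinions that survived the previous one, the count of distinct opinions never rises; once a minority view disappears from the corpus, it is gone for good – which is the regression the text warns about, and why an external filtering criterion or new human input matters.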
One solution to this could leverage the multiplicity of public service media, and their role as trusted third parties, to inject new constructive ideas and reject negative divergence; to expose the diversity of points of view from various countries and cultures; and to extract valuable contributions from audience feedback in a sort of media literacy exercise that could itself be facilitated by AI. The active involvement of audiences has been tried in the past and often abandoned because of limited resources. AI, combined with the essential contribution of trusted journalists and creatives, could open a new frontier in the dialogue between media and their audiences.
The new perception of AI, from the general public to the management of media companies, should be as a means of creating personalized experiences for users rather than merely a way of tracking their behaviour; a way of understanding audiences better and serving them better, expecting more from an intimate marriage of technology and creativity.
This article is from the June 2023 issue of tech-i magazine.