How to Balance Artificial Intelligence in News With Human Factors

News organizations increasingly use AI to automate tasks, increase efficiency and improve the quality of their content. But they must weigh these gains against the potential loss of human judgment and editorial oversight, and consider how automated and human-led workflows should fit together.

Generative AI can produce inaccurate answers and images, reproduce the sexism or racism present in its training data, or skew results to favor certain topics or users. Publishers are urging technology companies to be transparent about how their AI works and the decisions it makes, and to provide clear ways to opt out of using it for specific tasks.

Powerful AI systems can detect patterns in vast amounts of digital data and solve problems that would be impossible or impractical for humans, from forecasting financial trends to optimizing energy use. But they can also be used to create and distribute fake news and manipulate public opinion. Countries are working to build trust in the technology sector by establishing standards and encouraging interdisciplinary collaboration to ensure that AI is deployed safely.

Many businesses are still dipping their toes into the AI pool rather than diving in headfirst. But those that adopt an agile mindset will be well placed to reap the rewards, whether in customer personalization, operations, risk management or strategic planning. They will need to accept that AI projects shouldn't be confined to isolated pockets of the business, and that such projects should be run by cross-functional teams with a range of skills and perspectives.