The slippery slope of using AI and deepfakes to bring history to life

  • Written by Nir Eisikovits, Associate Professor of Philosophy and Director, Applied Ethics Center, University of Massachusetts Boston
To mark Israel’s Memorial Day in 2021, the Israel Defense Forces musical ensembles collaborated with a company that specializes in synthetic videos, also known as “deepfake” technology, to bring photos from the 1948 Arab-Israeli war to life.

They produced a video[1] in which young singers clad in period uniforms and carrying period weapons sang “Hareut,” an iconic song commemorating soldiers killed in combat. As they sing, the musicians stare at faded black-and-white photographs they hold. The young soldiers in the old pictures blink and smile back at them, thanks to artificial intelligence.

The result is uncanny. The past comes to life, Harry Potter style[2].

For the past few years, my colleagues and I at UMass Boston’s Applied Ethics Center[3] have been studying how everyday engagement with AI[4] challenges the way people think about themselves and politics. We’ve found that AI has the potential to weaken people’s capacity to make ordinary judgments[5]. We’ve also found that it undermines the role of serendipity[6] in their lives and can lead them to question what they know or believe about human rights[7].

Now AI is making it easier than ever to reanimate the past. Will that change how we understand history and, as a result, ourselves?

Musicians dressed as soldiers connect with soldiers in old photographs in a 2021 production by the Israel Defense Forces and the artificial intelligence company D-ID.

Low financial risk, high moral cost

The desire to bring the past back to life in vivid fashion is not new. Civil War or Revolutionary War reenactments are commonplace. In 2018, Peter Jackson painstakingly restored and colorized World War I footage to create “They Shall Not Grow Old[8],” a film that allowed 21st-century viewers to experience the Great War more immediately than ever before.

Live reenactments and carefully processed historical footage are expensive and time-consuming undertakings. Deepfake technology democratizes such efforts, offering a cheap and widely available tool for animating old photos or creating convincing fake videos from scratch.

But as with all new technologies, alongside the exciting possibilities are serious moral questions. And the questions get even trickier when these new tools are used to enhance understanding of the past and reanimate historical episodes.

The 18th-century writer and statesman Edmund Burke famously argued[9] that society is a “partnership not only between those who are living, but between those who are living, those who are dead, and those who are to be born.” Political identity, in his view, is not simply what people make of it. It is not merely a product of our own fabrication. Rather, to be part of a community is to be part of a compact between generations – part of a joint enterprise connecting the living, the dead and those who will live in the future.

If Burke is right to understand political belonging this way, deepfake technology offers a powerful way to connect people to the past and forge this intergenerational contract. By bringing the past to life in a convincing way, the technology enlivens the “dead” past and makes it vivid and vibrant. If these images spur empathy and concern for ancestors, deepfakes can make the past matter a lot more.

But this capability comes with risk. One obvious danger is the creation of fake historical episodes. Imagined, mythologized and fake events can precipitate wars: a storied 14th-century defeat in the Battle of Kosovo still inflames Serbian anti-Muslim sentiments, even though nobody knows[10] if the Serbian coalition actually lost that battle to the Ottomans.

Similarly, the second Gulf of Tonkin attack on American warships on Aug. 4, 1964, was used to escalate American involvement in Vietnam. It later turned out the attack never happened[11].

An atrophying of the imagination

It used to be difficult and expensive to stage fake events. Not anymore.

Imagine, for example, what strategically doctored deepfake footage from the Jan. 6 events in the United States could do to inflame political tensions or what fake video from a Centers for Disease Control and Prevention meeting appearing to disparage COVID-19 vaccines would do to public health efforts.

The upshot, of course, is that deepfakes may gradually destabilize the very idea of a historical “event.” Perhaps over time, as this technology advances and becomes ubiquitous, people will automatically question whether what they are seeing is real.

Whether this will lead to more political instability, or paradoxically to more stability as a result of hesitancy to act on the basis of possibly fabricated occurrences, is an open question.

But beyond anxieties about the wholesale fabrication of history, there are subtler consequences that worry me.

Yes, deepfakes let us experience the past as more alive and, as a result, may increase our sense of commitment to history. But does this use of the technology carry the risk of atrophying our imagination – providing us with ready-made, limited images of the past that will serve as the standard associations for historical events? An exertion of the imagination can render the horrors of World War II, the 1906 San Francisco earthquake or the 1919 Paris Peace Conference in endless variations.


But will people keep exerting their imagination in that way? Or will deepfakes, with their lifelike, moving depictions, become the practical stand-ins for history? I worry that animated versions of the past might give viewers the impression that they know exactly what happened – that the past is fully present to them – which will then obviate the need to learn more about the historical event.

People tend to think that technology makes life easier. But they don’t realize that their technological tools always remake the toolmakers – causing existing skills to deteriorate even as they open up unimaginable and exciting possibilities.

The advent of smartphones meant photos could be posted online with ease. But it’s also meant that some people don’t experience breathtaking views as they used to[13], since they’re so fixated on capturing an “instagrammable” moment. Nor is getting lost experienced the same way since the ubiquity of GPS. Similarly, AI-generated deepfakes are not just tools that will automatically enhance our understanding of the past.

Nevertheless, this technology will soon revolutionize society’s connection to history, for better and worse.

People have always been better at inventing things than at thinking about what the things they invent do to them – “always adroiter with objects than lives,” as the poet W.H. Auden put it[14]. This incapacity to imagine the underside of technical achievements is not destiny. It is still possible to slow down and think about the best way to experience the past.

Read more https://theconversation.com/the-slippery-slope-of-using-ai-and-deepfakes-to-bring-history-to-life-166464
